Multimodal user interfaces in ubiquitous sensorised environments
Kernchen, R, Presser, M, Mossner, K, Tafazolli, R, Palaniswami, M, Krishnamachari, B and Challa, S (2004) Multimodal user interfaces in ubiquitous sensorised environments. In: Proceedings of the 2004 Intelligent Sensors, Sensor Networks & Information Processing Conference, pp. 397-401.
Official URL: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumb...
This paper aims to show where ubiquitous sensor environments, in combination with multimodal user interfaces, can be used to further enhance the user (and usage) experience. The challenge in multimodal user interface design is to combine the available user input modalities with the available sensor data in a meaningful way. A conceptual framework for sensorised modality input composition and semantically controlled output decomposition in Ubiquitous Sensorised Environments is introduced, and its individual components — the Ubiquitous Sensorised Environment, the input processing component, the situation-aware application component and the output preparation component — are described. The paper also illustrates the functionality by means of an example, namely user mood awareness and its application. Finally, considerable focus is placed on the sensor nodes, classified into independent, parasitic and viral sensors, and examples are given of the sensor payloads that need to be employed to facilitate mood sensing.
|Uncontrolled Keywords:||Science & Technology, Technology, Computer Science, Artificial Intelligence, Engineering, Electrical & Electronic, Remote Sensing, Telecommunications|
|Divisions:||Faculty of Engineering and Physical Sciences > Electronic Engineering > Centre for Communication Systems Research|
|Deposited By:||Melanie Hughes|
|Deposited On:||07 Oct 2010 11:31|
|Last Modified:||16 Feb 2013 16:13|