It is discussed how the availability of processing power and advanced sensing
technology can enable a shift in HCI from explicit interaction, such as direct
manipulation GUIs, towards a more implicit interaction based on situational
context. In the paper, an algorithm based on a series of questions is given for
identifying applications that can facilitate implicit interaction. An XML-based
language to describe implicit HCI is proposed. The language uses contextual
variables that can be grouped using different types of semantics as well as actions
that are called by triggers. The term perception is discussed, and four basic
approaches useful for building context-aware applications are identified.
Two examples, a wearable context awareness component and a sensor-board,
show how sensor-based perception can be implemented. It is also discussed how
situational context can be exploited to improve the input and output of mobile devices.
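A description in the XML-based language outlined above might look as follows. This is a hypothetical sketch: the element and attribute names (`group`, `variable`, `trigger`, `action`, the `semantics` attribute) are illustrative assumptions, not the paper's actual syntax. It shows contextual variables grouped with a conjunctive semantics and an action invoked by a trigger when that context holds.

```xml
<!-- Hypothetical sketch; all names are illustrative, not the paper's syntax. -->
<implicitHCI>
  <!-- Contextual variables grouped with AND semantics -->
  <group id="walking_in_the_dark" semantics="AND">
    <variable name="user_activity" value="walking"/>
    <variable name="ambient_light" value="low"/>
  </group>
  <!-- The trigger invokes an action when the grouped context becomes true -->
  <trigger context="walking_in_the_dark" on="enter">
    <action name="increase_display_brightness"/>
  </trigger>
</implicitHCI>
```

A disjunctive group could use `semantics="OR"` in the same way, so that the action fires when any one of the contextual variables matches.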
Interactive systems have been the dominant computing paradigm over
recent years. This paradigm is characterized by the fact that the human user and the
system communicate and interact explicitly using different modalities. However, to
come closer to the visions of Ambient Intelligence, Calm Computing, Disappearing
Computing, and Ubiquitous Computing, new forms of interaction are required.
Observations of humans interacting with each other, together with the new possibilities
offered by emerging technologies, indicate that a new interaction model is needed. In this
chapter we present the concept of implicit human computer interaction (iHCI) that
takes the user's context into account when creating new user interfaces for ambient