
Implicit and Cross-Device Interaction - Lecture 10 - Next Generation User Interfaces (4018166FNR)

This lecture forms part of a course on Next Generation User Interfaces given at the Vrije Universiteit Brussel.

Beat Signer

April 29, 2024

Transcript

  1. Next Generation User Interfaces
     Implicit and Cross-Device Interaction
     Prof. Beat Signer, Department of Computer Science, Vrije Universiteit Brussel
     beatsigner.com
  2. Implicit Human-Computer Interaction
     ▪ Over the last decade, we have seen a clear trend towards smart environments and living spaces, where sensors and information processing are embedded into everyday objects, as foreseen in Mark Weiser's vision of ubiquitous computing, with the goal of simplifying the use of technology
     ▪ In Implicit Human-Computer Interaction (iHCI), we try to use contextual factors (e.g. various sensor inputs) to build human-centred anticipatory user interfaces based on naturally occurring human interactive behaviour
     ▪ Context-aware computing can be used to design implicit human-computer interaction
  3. Implicit Human-Computer Interaction …
     ▪ Implicit Human-Computer Interaction (iHCI) is orthogonal to (traditional) explicit HCI
     ▪ implicit communication channels (incidental interaction) can help in building more natural human-computer interaction
     [https://www.interaction-design.org/encyclopedia/context-aware_computing.html]
  4. Context
     ▪ Context-aware systems often focus on location as the only contextual factor
     ▪ However, even if location is an important factor, it is only one context dimension

     "Context is any information that can be used to characterize the situation of an entity. An entity is a person, place, or object that is considered relevant to the interaction between a user and an application, including the user and applications themselves." (A.K. Dey, 2000)
  5. Example: Car Navigation
     ▪ Various contextual factors can be taken into account when designing the interface of a car navigation system (see the sketch below)
       - current location (GPS)
       - traffic information
       - daylight - automatically adapt screen brightness
       - weather
       - current user task - e.g. touch is disabled while driving and only voice input can be used
       - …
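The daylight and user-task adaptations can be written down as simple context rules. A minimal Python sketch, with hypothetical sensor names and thresholds:

```python
# Hypothetical sketch of context-driven adaptation rules for a car
# navigation UI; sensor names and thresholds are illustrative, not
# taken from any concrete system.
from dataclasses import dataclass

@dataclass
class CarContext:
    speed_kmh: float          # from the CAN bus / GPS
    ambient_light_lux: float  # from a daylight sensor

def adapt_interface(ctx: CarContext) -> dict:
    """Derive UI settings implicitly from the current driving context."""
    return {
        # current user task: disable touch while driving, fall back to voice
        "touch_enabled": ctx.speed_kmh == 0,
        "voice_input": ctx.speed_kmh > 0,
        # daylight: dim the screen at night, full brightness in sunlight
        "brightness": 0.3 if ctx.ambient_light_lux < 50 else 1.0,
    }

print(adapt_interface(CarContext(speed_kmh=80, ambient_light_lux=20)))
# -> {'touch_enabled': False, 'voice_input': True, 'brightness': 0.3}
```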
  6. Everyday Examples
     ▪ Systems that take user actions as input and try to output an action that proactively anticipates what the user needs
       - simple motion detectors at doors that open the door automatically to allow humans with shopping carts to pass through
       - escalators that move slowly when not in use but speed up when they sense a person passing the beginning of the escalator
       - smartphones and tablets automatically changing between landscape and portrait mode based on their orientation
       - smart meeting rooms that keep track of the number of people in a meeting room and alter the temperature and light appropriately
       - …
  7. Exercise: Context-aware Digital Signage
  8. Contextual Factors
     ▪ Human factors
       - user
       - social environment
       - task
       - …
     ▪ Physical environment
       - location
       - infrastructure
       - conditions
       - …
  9. From Sensor Input to Context
     ▪ How do we compute the perceived context from a single sensor input or from multiple sensor inputs (see the sketch below)?
       - machine learning techniques
       - rule-based solutions
       - …
     ▪ How should we model context?
       - e.g. generic context models without an application-specific notion of context
     ▪ How to trigger implicit interactions based on context?
     ▪ How to author new context elements?
       - relationships with sensor input, existing context elements as well as application logic
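A minimal sketch of the rule-based option, abstracting several assumed low-level sensor readings into one higher-level context element:

```python
# Illustrative sketch of a rule-based mapping from raw sensor input to a
# higher-level context element; the sensors and the "meeting" context
# are assumptions for the example, not part of a real system.
def derive_context(sensors: dict) -> str:
    """Abstract several low-level readings into one perceived context."""
    occupied = sensors["motion"] or sensors["people_count"] > 0
    if not occupied:
        return "room-empty"
    # several people plus speech-level noise suggests an ongoing meeting
    if sensors["people_count"] >= 2 and sensors["noise_db"] > 45:
        return "meeting-in-progress"
    return "single-person-working"

print(derive_context({"motion": True, "people_count": 4, "noise_db": 55}))
# -> meeting-in-progress
```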
  10. User-Context Perception Model (UCPM)
      (Musumba and Nyongesa, 2013)
  11. Things Going Wrong
     ▪ What if the implicit interaction with a system goes wrong?
       - is it really wrong system behaviour, or is the user just not aware of all factors taken into account (awareness mismatch)?
     ▪ The quality of implicit human-computer interaction as perceived by the user is directly related to the awareness mismatch
     ▪ Fully-automated vs. semi-automated systems
       - sometimes it might be better not to fully automate the interaction, since wrong implicit interactions might result in a bad user experience
       - keep the user in the loop
  12. Intelligibility
     ▪ Improved system intelligibility might increase a user's trust, satisfaction and acceptance of implicit interactions
     ▪ Users may ask the following questions (Lim et al., 2009)
       - What: What did the system do?
       - Why: Why did the system do X?
       - Why Not: Why did the system not do X?
       - What If: What would the system do if Y happens?
       - How To: How can I get the system to do Z, given the current context?
     ▪ Explanations should be provided on demand only, in order to avoid information overload
       - feedback is easier for rule-based solutions than for machine learning-based approaches (see the sketch below)
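One way to support at least the What and Why questions in a rule-based system is to record, for every fired rule, the context values it matched. A minimal sketch; the rules and sensor names are illustrative:

```python
# Minimal sketch (not from the lecture) of how a rule-based system can
# support "What" and "Why" questions: each fired rule records the
# context values it matched, so an explanation is available on demand.
rules = [
    # (name, condition on the context, action) - illustrative rules only
    ("dim-screen", lambda c: c["daylight_lux"] < 50, "set brightness to 30%"),
    ("mute-phone", lambda c: c["location"] == "meeting-room", "mute ringtone"),
]

def run(context: dict):
    trace = []
    for name, condition, action in rules:
        if condition(context):
            trace.append((name, action, dict(context)))
    return trace

trace = run({"daylight_lux": 20, "location": "meeting-room"})
for name, action, ctx in trace:
    print(f"What: {action}")                    # What did the system do?
    print(f"Why: rule '{name}' matched {ctx}")  # Why did the system do X?
```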
  13. Context Modelling Toolkit (CMT)
     ▪ Multi-layered context modelling approach
       - seamless transition between end users, expert users and programmers
     ▪ Beyond simple "if this then that" rules
       - reusable situations (see the sketch below)
     ▪ Client-server architecture
       - server: context reasoning based on the Drools rule engine
       - client: sensor input as well as applications
     [Figure: CMT concepts across the end-user, expert-user and programmer layers: functions, actions, templates, filled-in templates, situations, facts and rules]
     (Trullemans, Holsbeeke and Signer, 2017)
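The template-to-rule layering can be pictured with a small, purely hypothetical Python sketch; CMT itself performs its reasoning on the server with the Drools rule engine, so the class and method names below (SituationTemplate, fill) are illustrative assumptions, not CMT's API:

```python
# Hypothetical sketch of the template/situation idea: an expert user
# defines a parameterised situation template, an end user fills it in,
# and the resulting situation can be reused in several rules.
class SituationTemplate:
    def __init__(self, name, predicate):
        self.name = name            # e.g. "person in room"
        self.predicate = predicate  # parameterised condition over facts

    def fill(self, **params):
        """An end user fills in the template to get a concrete situation."""
        return lambda facts: self.predicate(facts, **params)

in_room = SituationTemplate(
    "person in room",
    lambda facts, person, room: facts.get(person) == room)

# A filled-in template becomes a reusable situation
beat_in_office = in_room.fill(person="beat", room="office")

# A rule combines a situation with an action
facts = {"beat": "office"}
if beat_in_office(facts):
    print("turn on the office lights")
```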
  14. Context Modelling Toolkit (CMT) …
      (Trullemans, Holsbeeke and Signer, 2017)
  15. HCI and iHCI in Smart Environments
      Smart meeting room in the WISE lab
  16. Some Guidelines for Implicit HCI
     ▪ Always first investigate what users want/have to do
       - as a second step, see what might be automated
       - use context-awareness as a source to make things easier
     ▪ The definition of a feature space with factors that will influence the system helps in realising context-aware implicit interactions
       - find parameters which are characteristic for a context to be detected and find means to measure those parameters
     ▪ Always try to minimise the awareness mismatch
       - increase intelligibility by providing information about the used sensory information (context) in the user interface
  17. Some Guidelines for Implicit HCI …
     ▪ Designing proactive applications and implicit HCI is a very difficult task because the system must anticipate what users want
       - always investigate whether a fully-automated solution is best or whether the user should be given some choice (control)
  18. Affective Computing
     ▪ Computing that takes into account the recognition, interpretation, modelling, processing and synthesis of human affects (emotions)
     ▪ Implicit human-computer interaction can also be based on recognised human emotions
     (Rosalind W. Picard)
  19. Emotions
     ▪ External events
       - behaviour of others, change in a current situation, …
     ▪ Internal events
       - thoughts, memories, sensations, …

     "Emotions are episodes of coordinated changes in several components (neurophysiological activation, motor expression, subjective feelings, action tendencies and cognitive processes) in response to external or internal events of major significance to the organism." (Klaus R. Scherer, Psychological Models of Emotion, 2000)
  20. Emotion Classification
     ▪ Different models to classify emotions
     ▪ Discrete models treat emotions as distinct constructs
       - Ekman's model
       - …
     ▪ Dimensional models characterise emotions via dimensional values
       - Russell's model
       - Plutchik's model
       - PAD emotional state model
       - …
  21. Ekman's Emotions Model
     ▪ Theory of the universality of six basic facial emotions
       - anger
       - fear
       - disgust
       - surprise
       - happiness
       - sadness
     ▪ Discrete categories can be used as labels for emotion recognition algorithms
       - multiple existing databases rely on Ekman's model
  22. Russell's Circumplex Model of Affect
     ▪ Emotions are mapped to two dimensions (see the sketch below)
       - valence (x-axis) - intrinsic attractiveness or aversiveness
       - arousal (y-axis) - reactiveness to a stimulus
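As a minimal sketch of what the two dimensions buy us, assuming affective states measured as (valence, arousal) pairs in [-1, 1]: the quadrant of the plane already hints at a coarse emotion category (the example labels per quadrant are illustrative):

```python
# Toy sketch of the circumplex idea: place a measured affective state on
# the valence/arousal plane and report the quadrant; the example emotion
# labels per quadrant are illustrative.
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """valence and arousal in [-1, 1], as in Russell's two dimensions."""
    if valence >= 0 and arousal >= 0:
        return "high arousal, positive valence (e.g. excited, happy)"
    if valence < 0 and arousal >= 0:
        return "high arousal, negative valence (e.g. angry, afraid)"
    if valence < 0:
        return "low arousal, negative valence (e.g. sad, bored)"
    return "low arousal, positive valence (e.g. relaxed, calm)"

print(circumplex_quadrant(valence=-0.7, arousal=0.8))
# -> high arousal, negative valence (e.g. angry, afraid)
```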
  23. Plutchik's Wheel of Emotions
     ▪ Three-dimensional "extension" of Russell's circumplex model
     ▪ 8 basic emotions
       - joy vs. sadness
       - trust vs. disgust
       - fear vs. anger
       - surprise vs. anticipation
     ▪ 8 advanced emotions
       - optimism (anticipation + joy)
       - love (joy + trust)
       - submission (trust + fear)
  24. Plutchik's Wheel of Emotions …
     ▪ 8 advanced emotions … (see the sketch below)
       - awe (fear + surprise)
       - disapproval (surprise + sadness)
       - remorse (sadness + disgust)
       - contempt (disgust + anger)
       - aggressiveness (anger + anticipation)
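The dyads from these two slides can be captured in a small lookup table:

```python
# Small sketch encoding the primary dyads listed on slides 23-24: each
# advanced emotion is the combination of two adjacent basic emotions.
DYADS = {
    frozenset({"anticipation", "joy"}): "optimism",
    frozenset({"joy", "trust"}): "love",
    frozenset({"trust", "fear"}): "submission",
    frozenset({"fear", "surprise"}): "awe",
    frozenset({"surprise", "sadness"}): "disapproval",
    frozenset({"sadness", "disgust"}): "remorse",
    frozenset({"disgust", "anger"}): "contempt",
    frozenset({"anger", "anticipation"}): "aggressiveness",
}

def combine(a: str, b: str) -> str:
    """Look up the advanced emotion formed by two basic emotions."""
    return DYADS.get(frozenset({a, b}), "no primary dyad")

print(combine("fear", "surprise"))  # -> awe
```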
  25. PAD Emotional State Model
     ▪ Representation of emotional states via three numerical dimensions (see the sketch below)
       - pleasure-displeasure
       - arousal-nonarousal
       - dominance-submissiveness
     ▪ Example
       - anger is a quite unpleasant, quite aroused and moderately dominant emotion
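One possible reading of the PAD model is a nearest-neighbour lookup in a three-dimensional space; in the sketch below the numeric coordinates are illustrative placements (only the qualitative description of anger comes from the slide):

```python
# Sketch of emotions as points in the three-dimensional PAD space; the
# coordinates are purely illustrative placements that match the
# qualitative description of anger on the slide.
import math

PAD_LABELS = {
    # (pleasure, arousal, dominance), each in [-1, 1]
    "anger":     (-0.5, 0.6, 0.3),   # quite unpleasant, aroused, dominant
    "fear":      (-0.6, 0.6, -0.4),  # unpleasant, aroused, submissive
    "happiness": (0.8, 0.5, 0.4),
    "boredom":   (-0.4, -0.6, -0.2),
}

def closest_emotion(p: float, a: float, d: float) -> str:
    """Map a measured PAD state to the nearest labelled emotion."""
    return min(PAD_LABELS,
               key=lambda e: math.dist(PAD_LABELS[e], (p, a, d)))

print(closest_emotion(-0.4, 0.7, 0.2))  # -> anger
```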
  26. Self-Assessment of PAD Values
     ▪ Self-Assessment Manikin (SAM) is a language-neutral form that can be used to assess the PAD values
       - each row represents five values for one of the dimensions
         - pleasure
         - arousal
         - dominance
  27. Emotion Recognition
     ▪ Emotions can be manifested via different modalities
       - acoustic features (voice pitch, intonation, etc.)
       - verbal content (speech)
       - visual facial features
       - body pose and gestures
       - biosignals (physiological monitoring) - pulse, heart rate, …
     ▪ In general, artificial intelligence algorithms are used for accurate recognition of emotions
     ▪ Potential multimodal fusion of multiple modalities
       - improve emotion recognition accuracy by observing multiple modalities
  28. Acoustic Feature Recognition
     ▪ The behaviour and evolution of acoustic features over time is meaningful for emotion detection
     ▪ Typical features (see the sketch below)
       - intonation
       - intensity
       - pitch
       - duration
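A small sketch of extracting two of these features over time, assuming the open-source librosa audio library; a synthetic 220 Hz tone stands in for recorded speech so that the example is self-contained:

```python
# Sketch of extracting pitch and intensity tracks with librosa; a
# synthetic test tone replaces real speech input.
import numpy as np
import librosa

sr = 16000
t = np.linspace(0, 1.0, sr, endpoint=False)
y = 0.5 * np.sin(2 * np.pi * 220 * t)  # 220 Hz test tone

# pitch track: fundamental frequency per analysis frame
f0, voiced, _ = librosa.pyin(y, fmin=80, fmax=400, sr=sr)

# intensity: root-mean-square energy per frame
rms = librosa.feature.rms(y=y)[0]

print(np.nanmean(f0))  # ~220 Hz
print(rms.mean())      # overall intensity of the signal
```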
  29. Facial Emotion Recognition
     ▪ Find face parts
       - use orientation or prominent features such as the eyes and the nose
     ▪ Extract facial features
       - geometry-based
       - appearance-based (textures)
     ▪ Classification through (see the sketch below)
       - support vector machines
       - neural networks
       - fuzzy logic systems
       - active appearance models
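As a sketch of the classification step, assuming scikit-learn and synthetic stand-ins for extracted geometric features (mouth width, mouth-corner lift and eyebrow height are hypothetical feature names):

```python
# Sketch of the last pipeline step: an SVM trained on (synthetic)
# geometric facial features classifies a new feature vector.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# synthetic stand-in for extracted geometry features of labelled faces:
# [mouth_width, mouth_corner_lift, eyebrow_height]
happy = rng.normal([0.8, 0.6, 0.5], 0.05, size=(50, 3))
sad = rng.normal([0.5, 0.1, 0.3], 0.05, size=(50, 3))
X = np.vstack([happy, sad])
y = ["happiness"] * 50 + ["sadness"] * 50

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([[0.78, 0.55, 0.48]]))  # -> ['happiness']
```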
  30. Facial Action Coding System (FACS)
     ▪ Used to describe changes, contractions or relaxations of the muscles of the face
     ▪ Based on so-called Action Units (AUs)
       - description of component movements or facial actions
       - a combination of AUs leads to facial expressions, e.g. sadness = AU 1+4+15 (see the sketch below)
     ▪ https://www.cs.cmu.edu/~face/facs.htm
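A tiny sketch of decoding active AUs into an expression; only the sadness combination (AU 1+4+15) comes from the slide, the other two rows are commonly cited FACS combinations added for illustration:

```python
# Decode detected Action Units into a facial expression label.
EXPRESSIONS = {
    frozenset({1, 4, 15}): "sadness",      # from the slide
    frozenset({6, 12}): "happiness",       # illustrative
    frozenset({1, 2, 5, 26}): "surprise",  # illustrative
}

def decode(active_aus: set) -> str:
    """Return the expression whose AU pattern is fully active."""
    for pattern, label in EXPRESSIONS.items():
        if pattern <= active_aus:  # all required AUs detected
            return label
    return "neutral/unknown"

print(decode({1, 4, 15, 17}))  # -> sadness
```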
  31. Body Pose and Gestures
     ▪ Body language carries rich emotional information
       - body movement, gestures and posture
       - relative behaviour (e.g. approaching/departing, looking/turning away)
     ▪ Detailed features extracted from motion capture
  32. Biosignals
     ▪ Different emotions lead to different biosignal activities (see the sketch below)
       - anger: increased heart rate and skin temperature
       - fear: increased heart rate but decreased skin temperature
       - happiness: decreased heart rate and no change in skin temperature
     ▪ Advantages
       - hard to control deliberately (fake)
       - can be continuously processed
     ▪ Disadvantages
       - user has to be equipped with sensors
     ▪ Challenge
       - wearable biosensors
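The listed heart-rate and skin-temperature patterns translate directly into a heuristic classifier; a sketch with assumed thresholds and baseline-relative deltas:

```python
# Heuristic classifier based directly on the heart-rate and
# skin-temperature patterns listed on the slide; deltas are changes
# relative to a personal baseline, and the thresholds are assumptions.
def classify_biosignals(hr_delta: float, temp_delta: float) -> str:
    """hr_delta in bpm and temp_delta in degrees C relative to baseline."""
    if hr_delta > 5 and temp_delta > 0.2:
        return "anger"       # increased heart rate and skin temperature
    if hr_delta > 5 and temp_delta < -0.2:
        return "fear"        # increased heart rate, decreased temperature
    if hr_delta < -5 and abs(temp_delta) <= 0.2:
        return "happiness"   # decreased heart rate, no temperature change
    return "unknown"

print(classify_biosignals(hr_delta=12, temp_delta=-0.5))  # -> fear
```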
  33. Emotiv EPOC Neuroheadset
     ▪ Non-invasive EEG device
     ▪ 14 sensors
     ▪ Integrated gyroscope
     ▪ Wireless
     ▪ Low cost
     ▪ Average sensor sensitivity
       - mainly due to the non-invasiveness of the sensors
  34. Emotiv EPOC Neuroheadset …
  35. From Signals to Labelled Emotions
     ▪ Five potential channels
       - visual: face
       - visual: body movement
       - acoustic: speech content
       - acoustic: acoustic features
       - physiological: heart rate, blood pressure, temperature, galvanic skin response (GSR), electromyography (EMG)
     ▪ Associating emotion descriptors
       - a machine learning problem: SVMs, HMMs, NNs?
       - rely on only a single modality or fuse multiple modalities?
       - associate emotion descriptors before or after fusing the modalities, i.e. feature- or decision-level fusion (see the sketch below)?
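A compact sketch contrasting the two fusion strategies; all feature vectors and probabilities are made-up numbers:

```python
# Feature-level vs. decision-level fusion of two modalities.
import numpy as np

face_features = np.array([0.7, 0.1])   # e.g. smile intensity, brow raise
voice_features = np.array([0.4, 0.9])  # e.g. pitch variance, energy

# Feature-level fusion: concatenate features BEFORE classification,
# so a single classifier sees one combined vector.
fused_features = np.concatenate([face_features, voice_features])
print(fused_features)  # one 4-dimensional input for one classifier

# Decision-level fusion: classify each modality separately, then
# combine the per-class probabilities AFTER classification.
p_face = {"happiness": 0.8, "sadness": 0.2}   # face classifier output
p_voice = {"happiness": 0.6, "sadness": 0.4}  # voice classifier output
fused = {c: (p_face[c] + p_voice[c]) / 2 for c in p_face}
print(max(fused, key=fused.get))  # -> happiness
```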
  36. EUD of Cross-Device and IoT Applications
     ▪ Rapid prototyping platform for cross-device and IoT applications (eSPACE)
     ▪ End-user authoring
       - customised distribution of user interface components
       - mashup tool for digital and physical (IoT) components
     (Sanctorum and Signer, 2019)
  37. eSPACE Interaction View
      (Sanctorum, 2020)
  38. eSPACE Rules View
      (Sanctorum, 2020)
  39. Cross-Platform IoT Solution
     ▪ Many different IoT platforms
       - rules are normally not compatible across platforms
     ▪ NLP-based solution to translate proprietary rules to the high-level EUPont model
       - high-level rules managed in personal Solid Pods
       - rules can be executed on different IoT platforms via an EUPont runtime
     (Attoh and Signer, 2024)
  40. References
     ▪ M. Weiser, The Computer for the 21st Century, Scientific American, 265(3), September 1991
       - https://dx.doi.org/10.1145/329124.329126
     ▪ A. Schmidt, Context-Awareness, Context-Aware User Interfaces and Implicit Interactions
       - https://www.interaction-design.org/encyclopedia/context-aware_computing.html
     ▪ G.W. Musumba and H.O. Nyongesa, Context Awareness in Mobile Computing: A Review, International Journal of Machine Learning and Applications, 2(1), 2013
       - https://dx.doi.org/10.4102/ijmla.v2i1.5
  41. References …
     ▪ B.Y. Lim, A.K. Dey and D. Avrahami, Why and Why Not Explanations Improve the Intelligibility of Context-aware Intelligent Systems, Proceedings of CHI 2009, Boston, USA, April 2009
       - https://doi.org/10.1145/1518701.1519023
     ▪ S. Trullemans, L. Van Holsbeeke and B. Signer, The Context Modelling Toolkit: A Unified Multi-Layered Context Modelling Approach, Proceedings of the ACM on Human-Computer Interaction (PACMHCI), 1(1), June 2017
       - https://beatsigner.com/publications/trullemans_EICS2017.pdf
  42. References …
     ▪ J.A. Russell, A Circumplex Model of Affect, Journal of Personality and Social Psychology, 39(6), 1980
       - https://content.apa.org/doi/10.1037/h0077714
     ▪ R.W. Picard, Affective Computing, MIT Technical Report No. 321, 1995
       - https://affect.media.mit.edu/pdfs/95.picard.pdf
     ▪ E. Attoh and B. Signer, Towards a Write Once Run Anywhere Approach in End-User IoT Development, Proceedings of IoTBDS 2024, April 2024
       - https://beatsigner.com/publications/attoh_IOTBDS2024.pdf
  43. References …
     ▪ A. Sanctorum and B. Signer, A Unifying Reference Framework and Model for Adaptive Distributed Hybrid User Interfaces, Proceedings of RCIS 2019, May 2019
       - https://beatsigner.com/publications/sanctorum_RCIS2019.pdf
     ▪ A. Sanctorum, eSPACE: Conceptual Foundations for End-User Authoring of Cross-Device and Internet of Things Applications, PhD Thesis, Vrije Universiteit Brussel, July 2020
       - https://beatsigner.com/theses/PhDThesisAudreySanctorum.pdf