C. Baranauskas et al. (Eds.): INTERACT 2007, LNCS 4662, Part I, pp. 511–514, 2007.
© IFIP International Federation for Information Processing 2007

TEMo-Chine: Tangible Emotion Machine

Omar Mubin, Abdullah Al Mahmud, and Christoph Bartneck
User-System Interaction Program / Department of Industrial Design,
Eindhoven University of Technology, P.O. Box 513, 5600 MB Eindhoven, The Netherlands
{O.Mubin, A.Al-Mahmud, C.Bartneck}@tue.nl

Abstract. We examine whether or not it is possible to determine, recognize [...] exhibit similar emotional expression and interaction modalities, which could be used to determine general emotional states.

1 Introduction

In a public environment it would be interesting to have a feel of the collective emotional state of people in general, for example is the majority of the community [...] bidirectional approach (i.e., the system responds as well based on the input) using Artificial Intelligence and various other paradigms. There have been numerous works in the area of Affective Computing (computing that relates to, arises from, and influences emotions [3]) that have employed physiological measures such as blood [...] [1, 4]. Our goal was to ascertain if we could actually generalize the emotional state via affective feedback from a collective group of people. We chose to investigate if physical actions could be used to interpret basic emotional state and direction (e.g., targeted object or avatar) of an emotion.
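The notion of generalizing a collective emotional state from individual affective feedback can be thought of as a simple aggregation over discrete per-person inputs. The following is a minimal illustrative sketch only, not the authors' implementation; the function name and the majority-vote rule are assumptions made for the example:

```python
from collections import Counter

def collective_emotion(labels):
    """Hypothetical aggregation: return the most frequent emotion label
    among the individual inputs, or None if no input was collected.

    `labels` is a list of discrete emotion labels contributed by
    individual participants (e.g. "happy", "angry", "sad").
    Ties are broken by first occurrence, per Counter's ordering.
    """
    if not labels:
        return None
    counts = Counter(labels)
    emotion, _ = counts.most_common(1)[0]
    return emotion

# Example: five participants, three report "happy"
print(collective_emotion(["happy", "happy", "angry", "sad", "happy"]))  # happy
```

A majority vote is only one possible aggregation rule; a system like the one described could equally weight inputs by intensity or recency.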