Emotion-sensitive Human-Computer Interaction (HCI): State of the art - Seminar paper

Caroline Voeffray
University of Fribourg
1700 Fribourg, Switzerland
caroline.voeffray@unifr.ch

Seminar on emotion recognition: http://diuf.unifr.ch/main/diva/teaching/seminars/emotion-recognition

ABSTRACT
Emotion-sensitive Human-Computer Interaction (HCI) is a hot topic. HCI is the study, planning and design of the interaction between users and computer systems, and much current research is moving in this direction. To attract users, more and more developers add an emotional side to their applications. To simplify HCI, systems must be more natural, efficacious, persuasive and trustworthy, and to achieve that, a system must be able to sense and respond appropriately to the user's emotional state. This paper presents a short overview of existing emotion-sensitive HCI applications. It focuses on what is done today and brings out the most important features to take into account for emotion-sensitive HCI. The most important finding is that such applications can help people feel better with the technologies they use in daily life.

General Terms
Emotion in HCI

Keywords
Emotion HCI, Affective HCI, Affective technology, Affective systems

1. INTRODUCTION
Affective applications must first collect emotional data through facial recognition, voice analysis and the detection of physiological signals. The system then interprets the collected data and adapts its behavior to the user; a rough sketch of this loop is given at the end of this section.

In the past, the study of usability and the study of emotions were separate, but the area of human-computer interaction has evolved: today emotions take more space in our lives, and affective computing has appeared. Affective computing is a popular and innovative research area, mainly in artificial intelligence, which consists of recognizing, expressing, modeling, communicating and responding to emotions [4]. Research in this domain contributes to many different fields such as aviation, scientific professions like law or police work, anthropology, neurology, psychiatry and behavioral science, but this paper focuses on HCI. We approach different elements of emotion-sensitive HCI as follows: first we briefly discuss the role of emotions and how they can be detected, then we explain the research areas related to affective computing, and finally we survey the state of the art of existing affective applications.
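As a rough illustration of the sense-interpret-adapt loop mentioned above, the following Python skeleton shows the three stages; every name in it is a hypothetical placeholder standing in for whatever sensing hardware and models a concrete system would use, not an API from the surveyed systems.

```python
AROUSAL_THRESHOLD = 0.7  # illustrative value, not from any cited system

def sense(camera, microphone, biosensor):
    """Collect raw affective data from the three channels."""
    return {
        "face": camera.capture_frame(),      # facial expression
        "voice": microphone.record_chunk(),  # vocal cues
        "physio": biosensor.read(),          # e.g. skin conductance in [0, 1]
    }

def interpret(signals):
    """Map raw signals to an estimated emotional state (crude placeholder)."""
    return "stressed" if signals["physio"] > AROUSAL_THRESHOLD else "neutral"

def adapt(interface, emotion):
    """Adjust the interface to the user's estimated state."""
    if emotion == "stressed":
        interface.simplify()  # e.g. hide advanced options, soften colors
```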
2. ROLE OF EMOTIONS
Emotions are an important factor in life, and they play an essential role in understanding users' behavior during computer interaction [1]. Emotional intelligence also plays a major role in measuring aspects of success in life [2]. Recent research in human-computer interaction does not focus only on the cognitive approach but also on emotions. Both approaches are important: taking the user's emotions into account solves some important aspects of the design of HCI systems. Additionally, human-machine interaction can be improved if the machine adapts its behavior to the user; such a system is seen by users as more natural, efficacious, persuasive and trustworthy [2]. To the question "What is the connection between emotions and design?", the answer is that "our feelings strongly influence our perceptions and often frame how we think or how we refer to our experiences at a later date; emotion is the differentiator in our experience" [1]. In our society, positive and negative emotions influence the consumption of products, so measuring emotional expression can be crucial to understanding the decision process [1].

Emotions are such an important part of our lives that affective computing was developed. As stated in [3], the aim of affective computing is to develop computer-based systems that recognize and express emotions in the same way humans do. In human-computer interaction, nonverbal communication plays an important role: by measuring the emotions in facial expressions, we can identify the difficulties that users stumble upon [1]. A number of studies discussed in [3] have investigated people's reactions and responses to computers designed to be more human-like. Several of them report a positive impact of computers designed to flatter and praise users when they did something right; with these systems, users have a better opinion of themselves [3].

On the other hand, we have to be careful because the same expression of an emotion can have a different signification in different countries and cultures. In Europe a smile is a sign of happiness, pleasure or irony, but for Japanese people it can simply imply agreement with an applied punishment, or be a sign of indignation towards the person applying the punishment [1]. These cultural differences should make us aware that the same affective system may yield different results in different countries.

3. EMOTIONAL CUES
To understand how emotions affect people's behavior, we must understand the relationship between different cues such as facial expressions, body language, gestures, tone of voice, etc. [3]. Before creating an affective system, we must examine how people express emotions and how we perceive other people's feelings.

For an intelligent HCI system to respond appropriately to the user's affective feedback, the first step is that the system must be able to detect and interpret the user's emotional state automatically [2]. The visual channel (e.g., facial expressions) and the auditory channel (e.g., vocal reactions) are the most important features in the human recognition of affective feedback. Other elements also need to be taken into account, for example body movements and physiological reactions. When a person judges the emotional state of someone else, they rely mainly on facial and vocal expressions. However, some emotions are harder to differentiate than others and require other types of signals, such as gestures, posture or physiological signals.

A lot of research has been done in the field of face and gesture recognition, especially to recognize facial expressions. Facial expressions can be seen as communicative signals or as expressions of emotions [6], and they can be associated with basic emotions such as happiness, surprise, fear, anger, disgust or sadness [1]; a minimal sketch of this approach is given below.
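The following sketch uses OpenCV's stock Haar-cascade face detector and hands each detected face region to an expression classifier; `classify_expression` is a hypothetical stand-in for any model trained on the basic emotions, not a method from the cited work.

```python
import cv2

# Locate faces in a video frame and label each one's expression.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def expressions_in(frame, classify_expression):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:              # one label per detected face
        yield classify_expression(gray[y:y + h, x:x + w])
```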
Another tool to detect emotions is emotional speech recognition. Several factors in the voice vary with emotion, such as pitch, loudness, voice quality and rhythm.

In human-computer interaction we can also detect emotions by monitoring the nervous system, because for some feelings the physiological signs are very marked [1]. We can measure blood pressure, skin conductivity, breathing rate or finger temperature; a change in these physiological signals indicates a change in the user's state. Unfortunately, physiological signals play only a secondary role in the human recognition of affective states: they are neglected because detecting someone's clamminess or heart rate requires physical contact with the person. The analysis of this tactile channel is harder because the person must be wired up to collect the data, which is usually perceived as uncomfortable and unpleasant [2]. Several techniques are available to capture these physiological signs: the electromyogram (EMG) evaluates and records the electrical activity produced by muscles, the electrocardiogram measures the activity of the heart with electrodes attached to the skin, and skin conductance sensors can also be used [6].

All these data-collection methods are useful, but they are not always easy to use. Firstly, the available technologies are restrictive, and some parameters must absolutely be taken into account to obtain valid data: users must be differentiated by gender, age, socio-geographical origin, physical condition or pathology. Moreover, the collected data must be cleaned of noise, such as unwanted sound, images shot against the light, or physical characteristics like a beard, glasses or a hat. Secondly, collecting physiological signals requires somewhat intrusive methods such as electrodes, a chest strap or a wearable computer [6]. A sketch of a typical cleaning step follows.
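A minimal sketch of such a cleaning step, assuming SciPy; the sampling rate and cut-off frequency are illustrative values, not parameters reported by the surveyed papers.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def clean_signal(raw, fs=32.0, cutoff=1.0):
    """Low-pass filter a raw skin-conductance trace and normalise it per
    user, reducing baseline differences (age, physical condition, etc.)."""
    raw = np.asarray(raw, dtype=float)
    b, a = butter(2, cutoff, btype="low", fs=fs)    # 2nd-order low-pass
    smooth = filtfilt(b, a, raw)                    # zero-phase filtering
    return (smooth - smooth.mean()) / smooth.std()  # per-user z-score
```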
4. EMOTION-SENSITIVE APPLICATIONS

4.1 Research areas
The automatic recognition of human affective states can be used in many domains besides HCI. The assessment of emotions like annoyance, inattention or stress can be highly valuable in some situations, where affect-sensitive monitoring by a computer could provide prompts for better performance. This is especially true for jobs such as aircraft pilot or air traffic controller, nuclear power plant surveillance, or any job that involves driving a vehicle: in these professions, attention to a crucial task is essential. In sectors like law, police or security work, monitoring and interpreting affective behavioral signs can be very useful; this information can help in critical situations, for example to judge the veracity of testimonies. Another area where the computer analysis of human emotion can be of benefit is the automatic affect-based indexing of digital visual material: detecting pain, rage and fear in scenes could provide a good tool for violent-content-based indexing of movies, video material and digital libraries. Finally, machine analysis of human affective states would also facilitate research in anthropology, neurology, psychiatry and behavioral science, where sensitivity, reliability and precision are recurring problems; this kind of emotion recognition can help these fields advance in their research [2].

In this paper we focus on HCI, and one of the most important goals of HCI applications is to design technologies that can help people feel better, for example an affective system that can calm a crying child or prevent strong feelings of loneliness and negative emotions [3].

A domain where recognition of emotions in HCI is widely used is interface evaluation. The design of an interface can strongly influence the emotional impact of the system on its users [3]. The appearance of an interface, such as the combination of shapes, fonts, colors, balance, white space and graphical elements, determines the user's first feeling. Moreover, it can have a positive or a negative effect on people's perception of the system's usability. With good-looking interfaces, users are more tolerant because the system is more satisfying and pleasant to use: if the waiting time to load a website is long, a user facing a good-looking interface is prepared to wait a few more seconds. On the contrary, computer interfaces can cause user frustration and negative reactions like anger or disgust when something is too complex, when the system crashes or does not work properly, or when the appearance of the interface is not adapted. The recognition of emotions is therefore important to evaluate the usability and the interface of applications; the emotions felt by users play an important role in the success of a new application.

Another big research area is the creation of intelligent robots and avatars that behave like humans [3]. The goal of this domain is to develop computer-based systems that recognize and express emotions like humans. There are two major sectors: affective interfaces for children, like dolls and animated pets, and intelligent robots developed to interact with and help people.

Safe driving and soldier training are other examples of emotional research areas. Sensing devices can measure different physiological signals to detect stress or frustration. For drivers, panic and sleepiness levels can be measured, and if these signals are too high the system alerts the user to be more careful or advises them to stop for a break. Physiological data from soldiers can be used to design a better training plan without frustration, confusion or panic [6].

Finally, sensing devices can also be used to measure the body signals of patients receiving tele-home health care; the collected data can be sent to a doctor or a nurse for further decisions [5], [6].

4.2 Applications
In this section we survey the state of the art of different types of existing affective applications.

4.2.1 Emoticons and companions
The evolution of emotions in technology is explained in [3]. The first expressive interfaces were designed with emoticons, sounds, icons and virtual agents. For example, in the 1980s and 1990s, when an Apple computer was booted the user saw the happy Mac icon on the screen. This smiling icon meant that the computer was working correctly; moreover, the smile is a sign of friendliness and may encourage the user to feel good and smile back.

An existing technique to help users is the use of friendly agents in the interface. Such a companion helps users feel better and encourages them to try things out. An example is the infamous Clippy, the paper clip with human-like qualities shipped as part of the Windows 98 operating system; Clippy appears on the user's screen when the system thinks the user needs help with a task [3].

Later, users also found ways to express emotions through the computer by using emoticons: combinations of keyboard symbols that simulate facial expressions and allow users to express their feelings and emotions [3].

4.2.2 Online chat with animated text
Research has demonstrated that animated text is effective in conveying a speaker's tone of voice, affection and emotion. An affective chat system that uses animated text associated with emotional information is therefore proposed in [9]. This chat application focuses only on text messages because text is simpler than video, the data size is smaller, and it raises fewer privacy concerns. A GSR (Galvanic Skin Response) sensor is attached to the middle and index fingers of the user's non-dominant hand to collect the necessary affective information, and animated text is created according to these data. The peaks and troughs of the GSR data are analyzed to detect emotions in real time and applied to the online chat interface, as in the sketch below. Because it is difficult to obtain valence information from physiological sensors, the user manually specifies the animation tag for the type of emotion, placed before the message, whereas the physiological data are used to detect the intensity of the emotion. Twenty types of animation are available to change the speed, size, color and interaction of the text. Through this animated chat, a user can determine the affective state of their partner.
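A minimal sketch of this peak analysis, assuming SciPy; the prominence and scaling thresholds are assumptions for illustration, not values reported in [9].

```python
import numpy as np
from scipy.signal import find_peaks

def gsr_intensity(gsr, fs=32):
    """Estimate emotion intensity in [0, 1] from a GSR trace: peaks mark
    arousal events, and the strongest peak's prominence sets the intensity
    applied to the animated text."""
    peaks, props = find_peaks(gsr, prominence=0.05, distance=fs)
    if len(peaks) == 0:
        return 0.0                                   # calm: no animation
    return float(np.clip(props["prominences"].max() / 0.5, 0.0, 1.0))
```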
4.2.3 Interactive Control of Music
In [8] we discover an interface that lets users mix pieces of music by moving their body in different emotional styles. In this application, the user's body is part of the interface: the system analyzes body motions and produces, in real time, a music mix that represents the expressed emotions. Machine learning is used to classify body motions into emotions, and this approach gives very satisfactory results. To begin, a training phase is needed, during which the system observes the user moving in response to particular emotional music pieces. After training, the system is able to recognize the user's natural movements associated with each emotion. This phase is necessary because each participant expresses emotions in a very different way and body motions are very high-dimensional; there are variations even when the same user makes the same movement several times. A database of film music classified by emotion is used: the user's detected emotions are mapped to the corresponding music in the database, and a music mix is produced. A sketch of the training step follows.
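A minimal sketch of such a training phase, assuming scikit-learn; the feature representation and the choice of classifier are assumptions for illustration, not details taken from [8].

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_motion_classifier(X_train, emotion_labels):
    """Learn to label movements after watching the user move to music of
    known emotions. X_train is an (n_samples, n_features) array of motion
    features (e.g. joint velocities), one row per recorded movement."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    model.fit(X_train, emotion_labels)
    return model          # model.predict(X_new) -> predicted emotions
```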
4.2.4 Tele-home health care
Interesting affective applications are being developed in health care, for example the tele-home health care (Tele-HHC) application described in [5]. The goal of this system is to provide communication between patients and medical professionals via multimedia and empathetic avatars when hands-on care is not required. Tele-HHC can be used to collect different vital-sign data remotely; the system is used to verify compliance with medication regimes, to improve diet compliance, and to assess mental or emotional status. Knowing the patient's emotional state can significantly improve their care. The system monitors and responds to patients: an algorithm maps the collected physiological data to emotions, and the system then synthesizes all the information for use by the medical staff. The research uses a system that observes the user via multi-sensory devices; a wireless, non-invasive wearable computer collects all the needed physiological signals and maps them to emotional states. A library of avatars is built, and these avatars can be used in three different ways: to assist users in understanding their emotional state, to mirror the user's emotions in order to confirm them, and to animate a chat session showing empathic expressions.

4.2.5 Arthur and D.W. Aardvarks
Systems related to children are a common research area. Positive emotions play an important role in the learning and mental growth of children, so an affective interface can be important in achieving learning goals. Two animated, interactive plush dolls, ActiMates Arthur and D.W., were developed [10].

[Figure 1: ActiMates D.W. and ActiMates Arthur]

These two characters are creations of children's author Marc Brown and have been familiar to children for more than 15 years. The dolls look like animals but behave like humans, with their own personalities: they can move and speak. Seven sensors located in their bodies allow children to interact with them, and each sensor has one defined function. When the dolls speak, three emotional interactions are used: humour, praise and affection. Arthur and D.W. can ask questions, offer opinions, share jokes and give compliments; children interact with the dolls to play games or to listen to jokes and secrets. The goal of these dolls is to promote the mental growth of children through the systematic use of social responses to positive affect during their playful learning efforts.

4.2.6 Affective diary
An affective diary for expressing inner thoughts and collecting experiences and memories was developed in [7]. This diary captures the physical aspects of experiences and emotions. The most important element that distinguishes it from traditional diaries is the addition of data collected by sensors: pulse, skin conductivity, a pedometer, body movement and posture, etc. The collected data are uploaded via a mobile phone and used to express emotions. Other material from the mobile phone, such as texts, MMS messages, photographs, videos and sounds, can be combined and added to events or memories to create the diary. With this system, a graphical representation of the emotions of the day can be created: the user loads all the data into the computer, and a movie with ambiguous, abstract, colourful body shapes is created, reinforced by an expressive sound or music played in the background. The user can modify and organize the movie, edit the content, change the body state or the color, add text, etc. A sketch of such a mapping from sensed arousal to colour follows.
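As a purely illustrative sketch (the actual mapping in [7] is not specified here), per-sample arousal could drive an abstract colour scale like this, warm for high arousal and cool for low; the colour anchors are assumptions.

```python
def arousal_to_rgb(arousal):            # arousal normalised to [0, 1]
    r = int(255 * arousal)              # warm component grows with arousal
    b = int(255 * (1.0 - arousal))      # cool component fades
    return (r, 64, b)

day_colours = [arousal_to_rgb(a) for a in (0.1, 0.4, 0.9)]  # calm -> excited
```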
5. CONCLUSION
We addressed the role of emotions in life and in HCI, and highlighted the fact that emotions are an important part of human-computer interaction. The features available to automatically detect users' emotions are the visual, auditory and physiological channels; by combining these data, an affective system should be able to determine the current user's emotion. Several areas, such as aviation, scientific sectors and HCI, use emotion recognition. The subject is hot and attractive but relatively new: no real applications are commercialized yet, and currently there are only prototypes of affective applications.

This topic has a future and will be increasingly present in our technological lives, because emotions play such an important role in our lives. Affective interfaces and applications can help people feel better with computers and in daily life; stress, frustration and anger can be reduced with sensitive and adaptive systems.

Finally, the challenge today is to improve the technologies that collect data from the different channels. Existing systems must be improved to identify emotions reliably despite the factors that can skew results, and tools that automatically recognize emotional cues must keep only the key items and discard all the noise.

6. REFERENCES
[1] Irina Cristescu. Emotions in human-computer interaction: the role of non-verbal behavior in interactive systems. Revista Informatica Economica, vol. 2, no. 46, 110-116, 2008.
[2] M. Pantic and L. J. M. Rothkrantz. Toward an affect-sensitive multimodal Human-Computer Interaction. Proceedings of the IEEE, vol. 91, no. 9, 1370-1390, September 2003.
[3] Yvonne Rogers, Helen Sharp, Jenny Preece. Interaction Design: beyond human-computer interaction. Wiley, 3rd edition, chapter 5, June 2011.
[4] Luo Qi. Affective Emotion Processing Research and Application. International Symposiums on Information Processing, 768-769, 2008.
[5] C. Lisetti, F. Nasoz, C. LeRouge, O. Ozyer, K. Alvarez. Developing multimodal intelligent affective interfaces for tele-home health care. International Journal of Human-Computer Studies, vol. 59, 245-255, July 2003.
[6] Christine L. Lisetti, Fatma Nasoz. MAUI: a Multimodal Affective User Interface. MULTIMEDIA '02: Proceedings of the tenth ACM international conference on Multimedia, 161-170, 2002.
[7] Madelene Lindström, Anna Ståhl, Kristina Höök, Petra Sundström, Jarmo Laaksolahti, Marco Combetto, Alex Taylor, Roberto Bresin. Affective Diary - Designing for Bodily Expressiveness and Self-Reflection. CHI EA '06: CHI '06 extended abstracts on Human Factors in Computing Systems, 1037-1042, April 2006.
[8] Daniel Bernhardt, Peter Robinson. Interactive Control of Music using Emotional Body Expressions. CHI EA '08: CHI '08 extended abstracts on Human Factors in Computing Systems, 3117-3122, April 2008.
[9] Hua Wang, Helmut Prendinger, Takeo Igarashi. Communicating Emotions in Online Chat using Physiological Sensors and Animated Text. CHI EA '04: CHI '04 extended abstracts on Human Factors in Computing Systems, 1171-1174, April 2004.
[10] Erik Strommen, Kristin Alexander. Emotional Interfaces for Interactive Aardvarks: Designing Affect into Social Interfaces for Children. Proceedings of ACM CHI '99 Conference on Human Factors in Computing Systems (May 1-3, 1999, Pittsburgh, PA), 528-535, May 1999.
