Affective Computing



        Saumya Srivastava
          M.Tech (HCI)
Introduction

 Affective computing is the study and development of systems and devices
  that can recognize, interpret, process, and simulate human affects.
                                                              - Rosalind Picard

 Originated with Rosalind Picard's 1995 paper on "Affective Computing".

 Motivation for research : the ability to simulate empathy (i.e. the machine should
    interpret the emotional state of humans and adapt its behavior to them, giving an
    appropriate response to those emotions). (Video : Technology to Measure Emotions)

 It is empirical research motivated by the theoretical foundations of
  psychology and neuroscience. (Video : Emotional Technology)
                                                                             - Eva Hudlicka
                              (Author: "Affective Computing : Theory, Methods & Applications")
Objective
 To develop computing devices capable of gathering cues to user emotion
  from a variety of sources.
   In simple words, to produce "emotion-aware machines".

 Facial expression, posture, gesture, speech, the force or rhythm of keystrokes, and
  the temperature change of the hand on a mouse can all signify changes in the user's
  emotional state, which a computer can detect and interpret.

 There exists a nearly limitless range of applications :
   E-Learning
      The tutor expands an explanation when the user is found to be in a state of
      confusion, adds information when the user is found to be in a state of
      curiosity, etc.
   E-Therapy
      Providing psychological health services (e.g. online counseling) that convey
      emotional state as in a real-world session. Through affective computing,
      the patient's posture, facial expressions and gestures in the real world
      lead to a more accurate evaluation of psychological state.
PSYCHOLOGICAL THEORIES OF EMOTION
                                                                                    [3]



 Ekman, Friesen and Ellsworth (1972) categorized emotions into 6 groups,
  namely fear, surprise, disgust, anger, happiness and sadness, all of which can
  be expressed facially.
 Ekman and Friesen (1978) developed the Facial Action Coding System (FACS),
  which uses muscle movements to quantify emotions.
 According to Ekman, every primary emotion has an adaptive value irrespective
  of individual and culture.
 An automated version of FACS was developed by Bartlett et al. (1999).
 Later, Plutchik (1980) argued for 8 basic pairs of emotions which can be
  combined to produce secondary emotions.
 Drawback : one was forced to choose among the 8 emotion pairs.
 Russell instead proposed variation along 2 dimensions, i.e. Valence (X-axis) and
  Arousal (Y-axis); a toy sketch of this model follows the example below.
               e.g. Happy + Content => Pleasure / Displeasure

                       Happy        Content          Result
                       true         true             Pleasure
                       true         false            Displeasure
                       false        true             Displeasure
                       false        false            Displeasure
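
As a minimal illustration of the two-dimensional model (the quadrant labels below are illustrative assumptions of this write-up, not Russell's own terms), a continuous (valence, arousal) reading can be discretized into a coarse emotion label:

```python
# Minimal sketch: discretizing Russell's two continuous dimensions
# (valence on the x-axis, arousal on the y-axis) into quadrant labels.
# The quadrant names are illustrative assumptions, not Russell's terms.

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to a coarse label."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"      # positive valence, high arousal
    if valence >= 0:
        return "content/relaxed"    # positive valence, low arousal
    if arousal >= 0:
        return "angry/afraid"       # negative valence, high arousal
    return "sad/bored"              # negative valence, low arousal

print(circumplex_quadrant(0.7, 0.5))   # excited/happy
print(circumplex_quadrant(0.6, -0.4))  # content/relaxed
```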
PSYCHOLOGICAL THEORIES OF EMOTION
                                                                                          [3]



                                                                         (Continued)

[Figure : "The Emotion Wheel" by Plutchik, showing the primary emotions
(anger, anticipation, joy, acceptance, fear, surprise, sadness) arranged in a circle,
with adjacent pairs combining into dyads such as aggressiveness, optimism, love,
submission, awe, disappointment and contempt.]
COMPONENTS OF EMOTIONS
                                                                         [3]




 Subjective experience (feeling of fear and so on).

 Physiological changes in the Autonomic Nervous System (ANS) and the endocrine
  system (glands and the hormones they release),
  e.g. trembling with fear, which occurs before any conscious control of it.

 Behavior evoked (such as running away or fainting due to fear)
SOME THEORIES
                                                                                 [3]




 JAMES-LANGE THEORY
   Introduced in 1890 by James and Lange.
   Argues that action precedes emotion (the brain interprets bodily action as emotion),
     e.g. something scary moves towards us → our pulse starts rising →
         we interpret our bodily state → we feel afraid (fear).



       Flow : Perception of emotion-arousing stimulus → visceral and skeletal
       changes → interpretation (with a feedback loop back to the bodily changes)
SOME THEORIES
                                                                                [3]




                                                                   (Continue)
 CANNON - BARD THEORY
   Introduced in the 1920s by Cannon and Bard.
   Argues that the emotion-arousing stimulus is processed centrally first, after
    which the emotional experience and the bodily changes occur in parallel, as
    the flow below shows.


      Flow : Perception of emotion-arousing stimulus
             → sends signals to the cortex → experience of emotion
             → sends signals to the hypothalamus → physiological (bodily) changes
SOME THEORIES
                                                                                 [3]



                                                                   (Continue)
 SCHACHTER – SINGER THEORY
    Introduced in 1962.
    Adrenaline experiment : participants were told that they would receive a
     vitamin injection and be tested on whether their vision would be affected
     (in fact, all groups except one received adrenaline).
     Group A : given accurate information about the side effects (sweating, tremor
     and a jittery feeling).
     Group B : given false information about the side effects.
     Group C : told nothing.
     Group D : injected with saline (a placebo, with no side effects).
    Results
      1. A & D : did not take on the emotional state of those around them.
      2. B & C : took on the emotional state of those around them.
    Criticism
     1. In reality there is usually less ambiguity about what is happening than
          the experiment assumes.
     2. Emotion is not based on behavior alone; it also depends on past
          experience and the source of information.
     3. Unexplained arousal of an emotional state tends to lead to a negative
          experience.
SOME THEORIES
                                                                                        [3]



                                                                        (Continue)

      Flow : Perception of emotion-arousing stimulus → thalamus sends impulses
      to the cortex → awareness of physiological arousal + physiological (bodily)
      changes → interpreting the arousal as a particular emotion, given the context



 Further, Lazarus (1982) performed experiments on "cognitive labeling" and
  proposed the notion of "cognitive appraisal". According to this theory, evaluation
  of the situation precedes the affective reaction.
 Zajonc (1984) argues that the emotional response precedes cognitive processing.
Areas of Affective Computing
                                                                                          [3]




     Detecting Emotional Information                 (basic capabilities in a computer
     to discriminate emotions)
      Input : gathering a large variety of input signals, e.g. face, hand gesture,
       posture, gait, respiration, electrodermal response, ECG, temperature, blood
       pressure, blood volume, electromyogram*.
      Pattern recognition : feature extraction and classification of the signals,
       e.g. analysis of video motion features (to discriminate a frown from a
       smile); a toy pipeline is sketched after this list.
      Reasoning : predicting the underlying emotion based on knowledge of how
       emotions are generated and expressed.
      Learning : learning the factors that shape an individual's emotions, which
       helps the system recognize that person's emotions better.
      Bias : if a system itself has emotions, then recognizing ambiguous emotions
       becomes easier.
      Output : the recognized expression and the likely underlying emotion.

* A test that measures the activity of the muscles.
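
To make the input/feature-extraction/classification pipeline above concrete, here is a hedged sketch using scikit-learn; the feature names, labels and data are synthetic placeholders invented for illustration, not any corpus from the slides:

```python
# Toy sketch of the detection pipeline: signal features -> classifier.
# All data here is random; a real system would extract these features
# from sensors (ECG, skin conductance, respiration, EMG, ...).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each row holds features extracted from raw signals:
# [mean heart rate, skin conductance level, respiration rate, EMG energy]
X = rng.normal(size=(200, 4))
y = rng.integers(0, 3, size=200)   # toy labels: 0=neutral, 1=frustrated, 2=happy

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("toy accuracy:", clf.score(X_test, y_test))  # near chance on random data
```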
Areas of Affective Computing
                                                                                     [3]



                                                                      (Continued)
   Recognizing Emotional Information
     We need to develop systems that moderate their responses (e.g. to user
      frustration, stress or anxiety) based on the computer's recognition of
      emotion.
     Challenge : in tele-healthcare, the emotional cues present in a face-to-face
      session are normally lost.
     Lisetti et al. (2003) designed an application to resolve this issue:

              Wearable sensors & other devices  →  Embodied avatars

     Helps communication between patient and clinician.
     Results : 90% success recognizing SADNESS,
               80% success recognizing FEAR,
               80% success recognizing ANGER,
               70% success recognizing FRUSTRATION.
Areas of Affective Computing
                                                                                             [3]



                                                                             (Continued)


 AFFECTIVE WEARABLES
    Sensors and tools can be used to recognize affective patterns, but these tools
    require a lot of attention/maintenance.




    Figure : Wearer's blood volume pressure measured using photoplethysmography
    Figure : Sampling & transmitting biometric data to a larger computer for analysis
Areas of Affective Computing
                                                                                 [3]



                                                                   (Continued)
 Expressing Emotions
   Why computers need to express emotions :
    1. Computers expressing emotions can improve the quality and
       effectiveness of communication between people and technologies.
    2. How can people communicate with computers in a way that lets them
       express their emotions?
    3. How can technology stimulate and support new modes of affective
       communication between people?
   Efforts made :
    1. Schiano and her colleagues (2000) tested an early prototype of a simple
       robot. Drawback : it had no emotions.
    2. An experimental application at MIT, the 'Relational Agent' (Bickmore,
       2003), was designed to sustain long-term relationships. The agent expressed
       emotions. Drawback : it did not convince users of the reality of its 'feelings'.
    3. By contrast, 'Kismet', an expressive robot at MIT, is equipped with
       auditory and proprioceptive (touch) sensory inputs. Kismet can express
       emotion through vocalization, facial expression, and adjustment of gaze
       direction and head orientation.
Areas of Affective Computing
                                                                                  [3]



                                                                    (Continued)
 Expressing Emotions



                               Evolution over the years

Figure : MS Office Assistant                        Figure : Kismet robot
What has been done?[5]
         Emotion recognition and synthesis have been the focus of many FP5, FP6
         and FP7 projects:
             starting from ERMIS (emotion-aware agents) → HUMAINE
               (a network of excellence) → CALLAS (emotion in art and
               entertainment).



                                                What can be done?[5]
         Add observable manifestations which provide cues about the user's
         subjective experience.
             A smile may indicate successful completion of a task, or retrieval
                 of what the user was looking for…
             …instead of a cryptic "retry" button or asking the user to verify results.
             People may frown to indicate displeasure or difficulty reading,
                 nod to agree, shrug their shoulders when indifferent, etc.
How can this be done? [5]
         We can recognize :
             Facial features and cues
             Head pose / eye gaze (to estimate attention)
             Hand gestures (usually a fixed vocabulary of signs)
             Directions and commands (usually a fixed vocabulary)
             Anger in speech (useful in call centres)

                                         Affective Interactions [5]
         When computers can sense affective cues :
            User cannot read text off the screen and frowns / leans towards the
               screen?
                Redraw the text with a larger font!
            Call-centre user is angry?
                Redirect to a human operator!
            User not familiar with / cannot use the mouse or keyboard?
                Spoken commands and hand gestures are another option!
            User not comfortable with on-screen text?
                Use virtual characters and speech synthesis!
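
A minimal sketch of such cue-to-action rules follows; the cue names and actions are assumptions taken from the list above, and real cue detection (vision, speech analysis) is out of scope here:

```python
# Hedged sketch: mapping detected affective cues to interface adaptations.
# Cue detection itself is assumed to happen elsewhere.

def adapt_interface(cues: dict) -> list:
    """Return interface actions for a set of detected affective cues."""
    actions = []
    if cues.get("frowning") and cues.get("leaning_in"):
        actions.append("redraw text with larger font")    # reading difficulty
    if cues.get("anger_in_speech"):
        actions.append("redirect to human operator")      # call-centre escalation
    if cues.get("no_pointing_device"):
        actions.append("enable spoken commands / hand gestures")
    if cues.get("dislikes_onscreen_text"):
        actions.append("use virtual character + speech synthesis")
    return actions

print(adapt_interface({"frowning": True, "leaning_in": True}))
# ['redraw text with larger font']
```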
Current State Of Art
                                                                                                    [1]



Rosalind Picard, in her book "Affective Computing & Intelligent Interaction", presents
a research paper, "Expressive Face Animation Synthesis Based on Dynamic
Mapping Method" [1], which describes a SPEECH-DRIVEN FACE ANIMATION
SYSTEM WITH EXPRESSIONS.
    Up till now…
          Audio stream → SPEECH-DRIVEN FACE ANIMATION → sequence of
          corresponding face movements
                         (used in, e.g., multimodal HCI, virtual reality, video phones)

          Work had been done on lip movement, but it suffered from inaccuracy and
           discontinuity.
          In speech recognition systems, Yamamoto E. built a phoneme recognition
           model using a Hidden Markov Model, directly mapping phonemes to lip
           shapes.
          Drawback : the phonemes had to be tied to a language, and since phonemes
           vary from person to person and region to region, accuracy degraded.
Current State Of Art
                                                                                  [1]




                                                                   (Continued)
 Progress
    To overcome this drawback, neural networks were then used for audio-
     visual mapping together with a Gaussian Mixture Model (GMM).
     (Demonstration 1) (Demonstration 2)




  Non-verbal information + verbal information → set of messages carrying the
                                                speaker's emotional state
    The method models the relation between a neutral facial deformation and an
     expressive facial deformation via a GMM over their joint probability
     distribution; a toy version of this mapping is sketched below.
    Result : an encouraging quantitative evaluation, with the synthesized face
     showing realistic quality.
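
As a toy version of the idea (not the paper's exact model; the dimensionalities and data below are invented), a joint GMM over audio features and facial parameters can predict facial parameters from new audio via the conditional mean:

```python
# Hedged sketch of GMM-based audio-to-visual mapping: fit a joint Gaussian
# mixture over [audio features, facial parameters], then predict facial
# parameters from new audio via the conditional expectation.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
A, V = 3, 2                                   # audio dim, visual dim (toy sizes)
audio = rng.normal(size=(500, A))
visual = audio @ rng.normal(size=(A, V)) + 0.1 * rng.normal(size=(500, V))
joint = np.hstack([audio, visual])

gmm = GaussianMixture(n_components=4, covariance_type="full",
                      random_state=0).fit(joint)

def predict_visual(x_a):
    """E[visual | audio = x_a] under the fitted joint GMM."""
    total, out = 0.0, np.zeros(V)
    for w, mu, cov in zip(gmm.weights_, gmm.means_, gmm.covariances_):
        mu_a, mu_v = mu[:A], mu[A:]
        s_aa, s_va = cov[:A, :A], cov[A:, :A]
        resp = w * multivariate_normal.pdf(x_a, mean=mu_a, cov=s_aa)
        cond_mean = mu_v + s_va @ np.linalg.solve(s_aa, x_a - mu_a)
        total += resp
        out += resp * cond_mean
    return out / total

print(predict_visual(audio[0]), "vs true", visual[0])
```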
Released Applications
                                                                                      [6]




 Spatio-Temporal Emotional Mapper for Social Systems [Demonstration]
       Developed by the Dept. of Informatics Engineering of the Faculty of Science &
       Technology, University of Coimbra.

    The tool gathers, from a society of agents, their emotional arousal and self-
     rated motivation, together with their locations, in order to plot a map of a
     city or geographical region annotated with the motivation and emotional
     state of the agents that inhabit it; a toy version of this idea is sketched
     below.

    It is an open-source application (source code).
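
A toy version of the core idea (invented coordinates and readings, not the Coimbra source code) might aggregate agents' self-reports by location and plot them:

```python
# Illustrative sketch: plot self-reported arousal/motivation by location.
import matplotlib.pyplot as plt

# (latitude, longitude, arousal in [0,1], motivation in [0,1]) per agent
reports = [
    (40.20, -8.41, 0.8, 0.6),
    (40.21, -8.40, 0.3, 0.9),
    (40.19, -8.42, 0.5, 0.4),
]

lats, lons, arousal, motivation = zip(*reports)
plt.scatter(lons, lats, c=arousal, s=[m * 300 for m in motivation],
            cmap="coolwarm")              # colour = arousal, size = motivation
plt.colorbar(label="self-reported arousal")
plt.xlabel("longitude"); plt.ylabel("latitude")
plt.title("Emotional map of a region (toy data)")
plt.show()
```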
Research Groups & their Work
 AffQuake[2]

   AffQuake is an attempt to incorporate signals that relate to the player's
    emotions while playing.

   Quake II alters its behavior in response to changes in the player's average
    skin conductance level.

   For example, excitement increases the size of the avatar, giving the benefit of
    seeing farther, but at the same time making it an easier target (see the sketch
    after this list).

   Performed at MIT Media Lab, MIT.

   Group Members :
           a) Carson J. Reynolds
           b) Rosalind W Picard
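
A hedged sketch of the excitement-to-avatar-size mapping follows; the scaling constants and the baseline scheme are assumptions for illustration, not values from the AffQuake project:

```python
# Toy sketch: scale the avatar with arousal inferred from skin conductance
# relative to the player's running baseline. Constants are assumptions.

def avatar_scale(scl: float, baseline_scl: float,
                 lo: float = 0.8, hi: float = 1.5) -> float:
    """Return an avatar size multiplier from skin conductance level (uS)."""
    arousal = (scl - baseline_scl) / max(baseline_scl, 1e-6)
    # Excited players grow (can see farther) but become easier targets.
    return max(lo, min(hi, 1.0 + arousal))

print(avatar_scale(scl=6.0, baseline_scl=4.0))   # 1.5 (clamped at the cap)
print(avatar_scale(scl=3.5, baseline_scl=4.0))   # 0.875
```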
Research Groups & their Work
 Affective Tangibles[2]

    Objective : to develop physical objects that can be grasped, squeezed, thrown or
     otherwise manipulated as a natural display of affect.
    People generally express frustration through their motor skills; in simple
     words, people often increase the intensity of their muscle movements
     when experiencing frustrating interactions.
    Constructed tangibles include a Pressure Mouse, affective pinwheels that are
     mapped to skin conductance, and a voodoo doll that can be shaken to express
     frustration (see the sketch after the figure).
    Performed at MIT Media Lab, MIT.
    Group Members :
              a) Carson J. Reynolds
              b) Rosalind W Picard



                              Figure : Pressure Mouse
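
A hedged sketch of the Pressure Mouse idea follows: flag likely frustration when grip pressure rises well above a rolling baseline. The window size and threshold ratio are assumptions, not values from the project:

```python
# Toy sketch: detect frustration episodes from grip-pressure readings.
from collections import deque

class FrustrationDetector:
    def __init__(self, window: int = 50, ratio: float = 1.5):
        self.samples = deque(maxlen=window)   # recent pressure readings
        self.ratio = ratio                    # how far above baseline counts

    def update(self, pressure: float) -> bool:
        baseline = (sum(self.samples) / len(self.samples)
                    if self.samples else pressure)
        self.samples.append(pressure)
        return pressure > self.ratio * baseline

det = FrustrationDetector()
for p in [1.0, 1.1, 0.9, 1.0, 2.4]:          # sudden hard squeeze at the end
    flagged = det.update(p)
print("frustration flagged:", flagged)        # True for the final reading
```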
Research Groups & their Work
 Affective Learning Companion[2]
   A powerful research tool for exploring a variety of social-emotional skills in
    HCI.
   The platform enables a computational agent to sense and respond, in real
    time, to a user's non-verbal emotional cues, using video, postural
    movements, mouse pressure, physiology, and other behaviors communicated
    by the user, to infer the user's affective state.
   An animated agent was recently developed, allowing the study of factors that
    help learners persevere during frustrating learning episodes.
   Performed at MIT Media Lab, MIT.
   Group Members :
            a) Selene Atenea Mota
            b) Rosalind W. Picard
            c) Ashish Kapoor
            d) Barry Kort
            e) Hyungil Ahn
            f) Ken Perlin
            g) Winslow Burleson
Research Groups & their Work
 The Galvactivator[2]

    A glove-like wearable device that senses the wearer's skin conductivity and
     maps its values to a bright LED display.
    Increases in skin conductivity across the palm tend to be good indicators of
     physiological arousal, causing the Galvactivator display to glow brightly (a
     toy version of this mapping is sketched after this list).
    Applications : Self-feedback for stress management, Facilitation of
     conversation between two people & new ways of visualizing mass
     excitement levels in performance situations or visualizing aspects of arousal
     and attention in learning situations.
    Performed at MIT Media Lab, MIT.
    Group Members :
             a) Rosalind W. Picard
             b) Jonny Farringdon
             c) Nancy Tilbury (Philips Research Laboratories)
             d) Jocelyn Scheirer
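
A toy version of the conductance-to-brightness mapping follows; the conductance range and the linear mapping are assumptions, not the device's actual calibration:

```python
# Toy sketch: map skin conductance (microsiemens) to LED brightness.

def led_brightness(scl_us: float, scl_min: float = 1.0,
                   scl_max: float = 10.0) -> int:
    """Map skin conductance to an 8-bit PWM duty cycle, clamped to [0, 255]."""
    t = (scl_us - scl_min) / (scl_max - scl_min)
    return round(255 * min(1.0, max(0.0, t)))

print(led_brightness(2.5))   # dim glow
print(led_brightness(9.0))   # bright glow
```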
Research Groups & their Work
 Learning & Pattern Recognition[2]

   This project developed efficient versions of Bayesian techniques for a variety
    of inference problems, including curve fitting, mixture-density estimation,
    principal-components analysis (PCA), automatic relevance determination,
    and spectral analysis (a stand-in sketch of the last item follows this list).
   Performed at MIT Media Lab, MIT.
   Group Members :
            a) Rosalind W. Picard
            b) Thomas Minka
            c) Yuan Qi
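
The group's Bayesian spectral-analysis method itself is not reproduced here; as a standard stand-in, the classical Lomb-Scargle periodogram (available in SciPy) also handles unevenly sampled signals such as beat-to-beat heart-rate data:

```python
# Stand-in sketch: spectral analysis of an unevenly sampled signal with the
# classical Lomb-Scargle periodogram (NOT the Media Lab's Bayesian method).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 60, 300))          # irregular sample times (s)
y = np.sin(2 * np.pi * 0.25 * t) + 0.3 * rng.normal(size=t.size)

freqs_hz = np.linspace(0.01, 1.0, 500)
power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs_hz)  # angular freqs
print("peak at ~%.2f Hz" % freqs_hz[np.argmax(power)])      # ~0.25 Hz
```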
Research Groups & their Work
 Robotic Computer[2]

        A robotic computer that moves its monitor "head" and "neck," but that has no
         explicit face, is being designed to interact with users in a natural way for
         applications such as learning, rapport-building, interactive teaching, and
         posture improvement.
        In all these applications, the robot will need to move in subtle ways that
         express its state and promote appropriate movements in the user, but that
         don't distract or annoy.
        Goal : Giving the system the ability to recognize states of the user and also to
         have subtle expressions.
        Performed at MIT Media Lab, MIT.
        Group Members :
                  a) Carson J. Reynolds
                  b) Rosalind W Picard

Note : other publications associated with affective computing are available at
       http://affect.media.mit.edu/publications.php
Research Groups & their Work
 Agent-Dysl Project[5]

    Problem : children with dyslexia experience problems reading off a
     computer screen.
      Common errors : skipping words, changing word or syllable
       sequence, becoming easily distracted/frustrated.
    Solution : screen-reading software which
      helps them read in the correct order by highlighting words and syllables,
      checks and monitors their progress, and
      looks for signs of distraction or frustration (a sketch of these adaptation
       rules follows the figures).




 Figure : User leans towards the screen? Font size increased.
 Figure : User looks away? Highlighting stops.
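
A minimal sketch of the two adaptations shown in the figures follows; the cue names, thresholds and step sizes are assumptions for illustration, not Agent-Dysl's actual parameters:

```python
# Toy sketch: lean in -> larger font; look away -> pause highlighting.

def adapt_reader(state: dict, face_distance_cm: float,
                 gaze_on_screen: bool) -> dict:
    if face_distance_cm < 40:                 # user leans towards the screen
        state["font_pt"] = min(32, state["font_pt"] + 2)
    state["highlighting"] = gaze_on_screen    # stop when the user looks away
    return state

state = {"font_pt": 14, "highlighting": True}
state = adapt_reader(state, face_distance_cm=35, gaze_on_screen=False)
print(state)   # {'font_pt': 16, 'highlighting': False}
```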
Key Issues for Further Research
                                                                                   [3]

 The critical issues that the Interactive Systems Designers are facing :

   In which domains does affective capability make a positive difference to HCI,
    and where is it irrelevant or even obstructive?

   How precisely do we need to identify human emotions? Perhaps it is
    enough to identify a general positive or negative feeling. What techniques
    best detect emotional states for this purpose?

   How do we evaluate the contribution of affect to the overall success of a design?
References
[1] Panrong Yin, Linye Zhao, Lexing Huang and Jianhua Tao, "Expressive Face Animation
       Synthesis Based on Dynamic Mapping Method", National Laboratory of
       Pattern Recognition, Springer-Verlag Berlin Heidelberg, 2011.
[2]    Site : http://affect.media.mit.edu
[3]    David Benyon (2010), Designing Interactive Systems: A Comprehensive Guide to HCI
       and Interaction Design, Addison-Wesley, 2nd edition.
[4]    Site : http://www.agent-dysl.eu
[5]    Dr. Kostas Karpouzis, "Technology Potential : Affective Computing", Image, Video
       and Multimedia Systems Lab, National Technical University of Athens.
[6]    Site : https://github.com/lfac-pt/Spatiotemporal-Emotional-Mapper-for-Social-Systems
[7]    Zhihong Zeng, Maja Pantic, Glenn I. Roisman and Thomas S. Huang, "A Survey of Affect
       Recognition Methods: Audio, Visual, and Spontaneous Expressions".
Projects
 Problem Definition 1 : designing an interface that integrates emotion detection
  for video surveillance.

 Problem Definition 2 : a 3-D avatar reflecting emotion according to the scenario
  in a gaming environment.

Speaker notes

  1. Affect is concerned with a whole range of emotions, i.e. feelings, moods, sentiments and other (non-cognitive) aspects.
  2. The Affective Learning Companion is a powerful, flexible new research tool for exploring a variety of social-emotional skills in human-machine interaction, and for understanding how machines can work with people to better meet their needs. The platform enables a computational agent to sense and respond, in real time, to a user's non-verbal emotional cues, using video, postural movements, mouse pressure, physiology, and other behaviors communicated by the user to infer, for example, whether a user is in a high or low state of interest, or feeling frustrated. We recently developed an animated agent that combines non-verbal mirroring (or not) with multiple kinds of affective and cognitive support during a frustrating learning episode. The system allows us to control factors that have previously been impossible to control, enabling for the first time the study of how these factors interact in helping learners develop the ability to persevere during frustrating learning episodes.
  3. One of the findings in mass-communication settings was that people tended to "glow" when a new speaker came onstage, and during live demos, laughter, and live audience interaction. They tended to "go dim" during PowerPoint presentations. In smaller educational settings, students have commented on how they tend to glow when they are more engaged with learning.
  4. This project developed efficient versions of Bayesian techniques for a variety of inference problems, including curve fitting, mixture-density estimation, principal-components analysis (PCA), automatic relevance determination, and spectral analysis. One of the surprising methods that resulted is a new Bayesian spectral-analysis tool for nonstationary and unevenly sampled signals, such as electrocardiogram (EKG) signals, where there is a sample with each new (irregularly spaced) R wave. The new method outperforms other methods such as Burg, MUSIC, and Welch, and compares favorably to the multitaper method without requiring any windowing. The ability to use unevenly spaced data helps avoid problems with aliasing. The method runs in real time on either evenly or unevenly sampled data.