9. Roles of ECAs
ECAs are “Interactive Virtual Characters” that are situated in mediated
(often distributed) environments.
They can play four main roles:
― Assistants: to welcome users and help them understand and use the structure and the functioning of applications
― Tutors: for students in mediated human-learning environments, or for patients in psychological/pathological monitoring systems
― Partners: for actors in virtual environments (partner or adversary in a game, participant in creative design groups, member of a mixed-initiative community, …)
― Companions: a friend in a long-term relationship
10. Categories of ECAs
― GRETA (Pélachaud, LTCI): talking heads; fixed, realistic; expressions, lip-sync; emotions
― STEVE (Rickel et al., USC): gestural agents; fixed/floating, moving arms; dialogue, deixis, sign language; tutoring, assistance
― VTE (Gratch et al., USC): situated agents; complete mobile characters; virtual/augmented reality; training, action
12. Scientific issues
Agent models: interaction performance/efficiency of models
Interaction models with users: Multi-modality, H/M Dialogue
Models of cognitive agents: BDI logics and planning, affective logics, ...
Task models and user models: Symbolic Representation and Reasoning
Human modelling: ecological relevance of models (capture, representation of physiology, reproduction of human expressions, evaluation of behaviours, application)
13. Scientific communities
Computer Science and Natural Language Processing
Multi-Agent Systems
Human-Machine Interaction/Interfaces
Human-Machine Dialogue
Humanities and Social Sciences
Psychology of Language Interaction
Ergonomic psychology
14. Projects and teams
European project SEMAINE
(http://www.semaine-project.eu/)
LTCI (France): Greta
LIMSI (France): MARC
Bielefeld University (Germany): MAX
USC (USA): STEVE, VTE
15. Design of animated characters
(ECAs or avatars)
Independent of the embodiment level
Non-verbal behaviours (postures, gestures, ...)
― Theoretical approach (related works)
― Empirical approach (corpus analysis)
Generated animations: from an intention (triggering action), automatically and cyclically
Evaluation by the user: feeling?
<configuration>
<timecode>10</timecode>
<body>front</body>
<head>front</head>
<eyes>open happy</eyes>
… … …
<leftArm>hello</leftArm>
<rightArm>hip</rightArm>
<speech>Hello!</speech>
</configuration>
Pipeline: corpora → representation → reproduction (see the sketch below)
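Such a configuration can be fed directly to an animation player. The following Python sketch is purely illustrative (it is not part of the LIMSI/LTCI toolkits; the tag names simply mirror the example above): it parses a configuration of this shape and turns it into timed per-channel commands.

# Minimal sketch: parse an ECA keyframe configuration like the XML above
# and dispatch it as per-channel animation commands. Hypothetical format,
# not the actual DIVA/LEA/GRETA behaviour language.
import xml.etree.ElementTree as ET

SAMPLE = """
<configuration>
  <timecode>10</timecode>
  <body>front</body>
  <head>front</head>
  <eyes>open happy</eyes>
  <leftArm>hello</leftArm>
  <rightArm>hip</rightArm>
  <speech>Hello!</speech>
</configuration>
"""

def parse_configuration(xml_text: str) -> dict:
    """Return {'timecode': int, 'channels': {channel: value}}."""
    root = ET.fromstring(xml_text)
    timecode = int(root.findtext("timecode", default="0"))
    channels = {child.tag: child.text.strip()
                for child in root
                if child.tag != "timecode" and child.text}
    return {"timecode": timecode, "channels": channels}

def apply_configuration(config: dict) -> None:
    """Stand-in for the renderer: print what each channel should do."""
    for channel, value in config["channels"].items():
        print(f"t={config['timecode']:>3}  {channel:<9} -> {value}")

if __name__ == "__main__":
    apply_configuration(parse_configuration(SAMPLE))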
17. Models based on visemes
Viseme: elementary unit describing the positions of the muscles of the human face (3D model)
Experimental settings:
Small red balls manually placed on the face
3 to 5 cameras
Training corpus
G. Bailly and F. Elisey, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
18. Transitions between expressions
Torsion model of the face and the neck
The positions of the red balls are set manually
Pattern recognition is used to track the movements of the balls
Automatic and manual extraction of key frames
G. Bailly and F. Elisey, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
19. Analysis and synthesis
Visemes + the 3D model are used to (re-)generate expressions (see the sketch below)
6 to 11 parameters
Validation by comparing to real expressions
G. Bailly and F. Elisey, GT ACA, 2006.03.15, http://www.limsi.fr/aca/
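To make the "6 to 11 parameters" concrete, here is a hedged sketch of interpolating between two viseme key frames; the parameter values and the linear blend are illustrative assumptions, not the actual analysis/synthesis model.

# Illustrative sketch only: linear interpolation between two viseme
# parameter vectors (jaw opening, lip rounding, ...), standing in for
# the 6-11 articulatory parameters of a 3D face model.
import numpy as np

# Hypothetical parameter vectors for two visemes (values in [0, 1]).
viseme_a = np.array([0.10, 0.80, 0.30, 0.05, 0.60, 0.20])  # e.g. /a/
viseme_u = np.array([0.70, 0.20, 0.90, 0.40, 0.10, 0.50])  # e.g. /u/

def blend(start: np.ndarray, end: np.ndarray, t: float) -> np.ndarray:
    """Return the face parameters at normalized time t in [0, 1]."""
    return (1.0 - t) * start + t * end

# Generate a short transition as a handful of key frames.
for step in range(6):
    t = step / 5.0
    frame = blend(viseme_a, viseme_u, t)
    print(f"t={t:.1f}  params={np.round(frame, 2)}")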
20. ‘Talking head’ project (LIMSI)
Based on visemes
Video corpus of 21 utterances: « C'est /phoneme/ ici » (e.g. « C'est u ici »)
21. Realistic talking heads
Greta (Prudence, Obadiah, Poppy and Spike)
LTCI – Telecom ParisTech
MARC
LIMSI
23. Design of realistic gestural agents and avatars
MARC (LIMSI): annotated corpus of videos (e.g. with Anvil)
GRETA (LTCI): psychological knowledge (related works)
24. 2D cartoon agents: Lea
(Characters: Lea, Marco, Jules, Genius)
LEA Agents (LIMSI)
• 2D cartoon characters (110 GIF files)
• Easily integrated into Java applications and JavaScript
• Description language for high-level behaviours
30. Chat bots in the Internet
Chat bot = “chatterbox” + “robot”
A program capable of NL dialogue
Usually keyword spotting (a minimal sketch follows below)
No dialogue session or context
No embodiment
Goals: only chat
ELIZA (J. Weizenbaum, 1966)
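A minimal sketch of the keyword-spotting approach used by such chat bots; the patterns and canned answers are invented for illustration and are not taken from ELIZA itself.

# Keyword-spotting chat bot in the ELIZA spirit: no dialogue state,
# no context, just pattern -> canned (or template) reply.
import re
import random

RULES = [
    (re.compile(r"\bi need (.+)", re.I),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bi am (.+)", re.I),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"\bmother|father|family\b", re.I),
     ["Tell me more about your family."]),
]
DEFAULTS = ["Please go on.", "I see.", "Can you elaborate on that?"]

def reply(utterance: str) -> str:
    for pattern, answers in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(answers).format(*match.groups())
    return random.choice(DEFAULTS)

if __name__ == "__main__":
    print(reply("I need a holiday"))
    print(reply("My mother is calling me"))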
32. Social avatars on the web
Social avatars
The recent evolution of the communication over the web
has prompted the development of so-called “avatars”
which are graphical entities dedicated to the mediating
process between humans in distributed virtual
environments (ex: the “meeting services”)
Contrary to an ECA which is the personification of a
computer program (an artificial agent) an avatar is the
personification of a real human who controls the avatar.
(Avatar = Mask) ≠ Agent
Second life
Presented as a « metaverse » by its developer, Linden
Lab, Second Life is a web-based virtual simulation of the
social life of ordinary people.
5 000 000 avatars have been created
At any given time, ~200 000 visitors are logged in
34. Mixed communities and heterogeneous MAS
Metaphor: meetings
Definition: computerised environments that transparently integrate humans and artificial agents within meetings
― Mixed-initiative communities: brainstorming, design, decision, …
― Virtual Training Environments (VTE): training, simulation, …
Deployment environments
― Smart room: a physical place with augmented reality
― Web-based: distributed and mediated environments
Animated characters
― Avatars: virtual characters driven by humans
― Agents: virtual characters which embody artificial agents
35. Social interaction in virtual environments
Mixed training environments = avatars + ECAs
Task: virtual firemen
Recognition of fire
Activity reports
Interaction
Emotion management
Gestures and facial expressions
NL dialogue management
M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 2005.11.15 — www.irit.fr/GRIC/VR/
36. Realistic behaviour of agents
The study of videos highlighted how deictic gestures and gaze towards items are used concomitantly with natural-language words ("here", "it", "now", ...)
M. El Jed, N. Pallamin, L. Morales, C. Adam, B. Pavard, IRIT-GRIC, GT ACA, 15 nov 2005 — www.irit.fr/GRIC/VR/
37. From assistant agents to
ambient environments
Application taken from the DIVA toolkit (LIMSI)
Transporting the Function of Assistance from stand-alone
applications to room-based ambient environments
38. DIVA display devices of the IRoom
Display devices: HD TV screen, large touch screen, portable touch screen
Uses: presentation of any Internet content; presentation of the IRoom controls
DIVA Toolkit + IRoom (LIMSI)
40. Pointing at objects in the physical world
Body zooming for close objects
“Which of the two remote controllers should I use?”
Diva Toolkit + IRoom (LIMSI)
42. Examples from professional web sites
Lucie, the virtual assistant of SFR (screenshot: question field, ECA, web site)
The EDF web site
P. Suignard, GT ACA, 2004.10.26
http://www.limsi.fr/aca/
43. A counter-example: the « Clippy effect »
Microsoft Agents™
― Free technology, no support
― 2D cartoon agents
― Can be used on web pages
― Third-party characters (LaCantoche™)
― API for JavaScript & VBScript
Limitations
― Inputs limited to clicks and menus
― No proper dialogue model
― No application model
― No synchronization of speech and movements
― Emotions = cartoon 'emotes'
(Screenshot: an intuitive request expressed in NL; the helping text corresponds to the 1st line of results, while 2 quite relevant results are ranked too far down.)
44. Assistant agents: definition
« An Assistant Agent is a software tool with the capacity to resolve help requests, issued by novice users, about the static structure and the dynamic functioning of software components or services »
InterViews Project, February 1999
Following Pattie Maes, MIT, 1994
User: a person with poor knowledge about the component (novice)
Request: a help demand in natural language (speech/text)
Component: a computer application, web service or ambient appliance
Agent: rational, assistant, conversational (can be embodied)
Mediator: a symbolic model of the structure and the functioning of the component
46. The agents of the Daft project
The assistant ECA greets the user (« Hello! »); the user types a request in French (« Comment je peux … », "How can I …"). Behind the ECA, a pipeline of agents processes the utterance: a Linguistic agent, a Rational agent (holding the dialogue, session and profile models) and a Modeling agent that maintains a symbolic model of the software component.
Example: GRASP analysis of the noun phrase « le très petit bouton rouge » ("the very small red button"). Phase 1 applies a rule list of length 74, phase 2 a rule list of length 9. Each word is first turned into a feature structure (FLEX: inflected form, LEM: lemma, POS: part of speech, KEY: semantic key, plus TYPE and ID), e.g. le/DD/THE, très/RR/INTLARGE, petit/JJ/SIZESMALL, bouton/NN/BUTTON, rouge/JJ/RED. Rule hits (HIT: RR, HIT: NN) then attach the adverb to the adjective (« très petit ») and group the determiner, adjectives and modifiers under the head noun: le, (très) petit, bouton, rouge. (A sketch of such feature structures follows below.)
The Modeling agent's symbolic model of the component (excerpt):
MODEL[
  ID[main],
  PARTS[
    TITLE,
    FIELD,
    GROUP[
      ID[command],
      PARTS[ BUTTON, BUTTON, BUTTON ]
    ]
  ],
  USAGE["This component is …"]
]
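To make the trace above easier to read, here is a hedged Python sketch of the kind of per-word record the linguistic agent builds and then groups around the head noun; the field names (FLEX, LEM, POS, KEY) follow the trace, while the grouping function is an illustrative assumption, not the actual GRASP rules.

# Illustrative only: word-level feature structures for
# "le très petit bouton rouge", grouped around the head noun.
from dataclasses import dataclass, field

@dataclass
class WordFS:
    flex: str   # inflected form as found in the utterance
    lem: str    # lemma
    pos: str    # part of speech (DD determiner, RR adverb, JJ adjective, NN noun)
    key: str    # semantic key used by the rational agent
    deps: list = field(default_factory=list)  # structures attached below this one

words = [
    WordFS("le", "le", "DD", "THE"),
    WordFS("très", "très", "RR", "INTLARGE"),
    WordFS("petit", "petit", "JJ", "SIZESMALL"),
    WordFS("bouton", "bouton", "NN", "BUTTON"),
    WordFS("rouge", "rouge", "JJ", "RED"),
]

def group_noun_phrase(ws):
    """Toy grouping rule: attach the adverb to the adjective, then
    determiner and adjectives to the noun (head of the phrase)."""
    det, adv, adj, noun, adj2 = ws
    adj.deps.append(adv)             # "très" modifies "petit"
    noun.deps.extend([det, adj, adj2])
    return noun

head = group_noun_phrase(words)
print(head.key, [d.key for d in head.deps])  # BUTTON ['THE', 'SIZESMALL', 'RED']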
47. The Rational Assisting Agent
« How many buttons are there? » is translated into the formal request language: ASK[COUNT[REF(BUTTON)]]
The rational agent reasons over, and modifies, two models (Σ Hi):
― the interaction model: session, profile, task
― the component model: structure, action, process
Main loop:
1. Read a formal request;
2. Contextualize the formal request: explore the interaction model and the component model;
3. Process the formal request: update the interaction model and the component model;
4. Produce the formal answer.
Formal answer language: TELL[EQUAL[COUNT[REF(BUTTON)],3]], rendered as « There are three buttons ». (A sketch of this loop follows below.)
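A hedged sketch of this loop, assuming dictionary-based interaction and component models and a toy parser for requests shaped like ASK[COUNT[REF(BUTTON)]]; it illustrates the four steps, not the real Daft formal language.

# Minimal sketch of the rational assisting agent's loop:
# read a formal request, contextualize it against the models,
# process it, and produce a formal answer.

component_model = {"BUTTON": 3, "FIELD": 1, "TITLE": 1}    # structure
interaction_model = {"session": [], "profile": "novice"}    # session / profile

def contextualize(request: str) -> str:
    """Resolve the reference inside ASK[COUNT[REF(<X>)]] (toy parser)."""
    return request.split("REF(")[1].rstrip(")]")

def process(referent: str) -> int:
    """Explore the component model and update the interaction model."""
    count = component_model.get(referent, 0)
    interaction_model["session"].append(("ASK_COUNT", referent, count))
    return count

def answer_loop(request: str) -> str:
    referent = contextualize(request)                       # step 2
    count = process(referent)                               # step 3
    return f"TELL[EQUAL[COUNT[REF({referent})],{count}]]"   # step 4

print(answer_loop("ASK[COUNT[REF(BUTTON)]]"))
# -> TELL[EQUAL[COUNT[REF(BUTTON)],3]]  i.e. "There are three buttons"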
48. Dialogue with components
Java interface for the ECA: DaftLea (K. LeGuern, 2004)
A standard GUI that can contain Java applets; example components: the "Counter" component, the "Hanoi" component, the "AMI web site" component
LEA ECA (Java); speech synthesis: Elan Speech
J-C Martin & S. Abrilian ...
49. Processing of a Daft request
When the user enters « restart the counter » in the chatbox:
1) The utterance is analysed to build a formal request: RESTART[REF["COUNTER"]]
2) The reference REF is resolved within the model (the large red rectangle in the screenshot)
3) The restart action is applied on the boolean of the counter, causing the model variable INT[8] to be incremented
4) An update is sent to the boolean variable of the application
5) The behavioural answer « ACKNOWLEDGE » is sent to the virtual agent.
The user then sees: the number increment (8, 10, 12, 14, …), LEA say « Done! », and LEA express the ACKNOWLEDGE emote.
52. Evaluation of ECAs
Objective criteria
― Efficiency: measures the actual performance of the user-agent couple when carrying out a given task.
― Usability: measures the capacity of the user to understand the functioning of the system and the fluency of the control.
Subjective criteria
― Friendliness: measures the user's "feeling" about the features of the system (attraction, commitment, aesthetics, comfort, …).
― Believability: measures the user's "feeling" that the agent can understand the user's problems and has the capacity to help with real competence.
― Trust: measures the user's "feeling" that the agent behaves as a trustworthy and cooperative entity.
53. Ratio realism / friendliness
(Chart: Mori's "uncanny valley" — friendliness (%) plotted against similarity to humans (%), rising through eyeballs, 2D cartoons and 3D realistic agents up to humans, with examples such as the toy and the Bunraku puppet placed along the curve.)
Masahiro Mori, Robotics Institute of Tokyo, in Jasia Reichardt, "Robots are coming", Thames and Hudson Ltd, 1978
54. The Persona Effect of Lester
Experiment: 30 subjects (15 M, 15 F, age ≈ 28 years) follow a presentation either with the agent or without the agent (an arrow replaces the ECA).
Results: measured improvement of
― comprehension performance
― recall performance (memory)
― satisfaction (subjective questions)
• Lester et al. (1997). The Persona Effect: Affective impact of Animated Pedagogical Agents. CHI'97.
• van Mulken et al. (1998). The Persona Effect: How substantial is it? HCI'98.
55. Persona effect and memorizing
(Screenshot: « Ik ben Annita, de assistent bij dit experiment », "I am Annita, the assistant for this experiment")
Experiment: 61 subjects (39 M, 22 F). Stories are told to the subjects in three different situations (without agent, with a realistic agent, with a cartoon agent); recall questions are then asked and the three situations are compared.
Measured: improvement of story recall performance.
Subjective evaluation: anthropomorphic questions, e.g. "Did you find the agent sensitive?"
Result: an ECA improves the memorizing of information.
Beun et al. (2003). Embodied conversational agents: Effects on memory performance and anthropomorphisation. IVA'03.
56. Experiments on functional description
Three different agents:
― 2 men (Marco, Jules) with different garments
― 1 woman (Lea)
Three strategies of cooperation:
― Redundancy
― Complementarity
― Specialization
Three objects to describe:
1. Video recorder remote controller
2. Photocopier control panel
3. Application for designing graphic documents
3 × 3 × 3 = 27 experimental conditions
Stéphanie Buisine, LIMSI 2004
57. Functional description: evaluation results
Quality of explanation
― No noticeable effect of the appearance,
― Noticeable effect of the multimodal behaviour: the redundant explanations are preferred.
Sympathy
― No noticeable effect of the multimodal behaviour,
― Noticeable effect of the appearance (one agent's appearance is clearly preferred over the others).
Performance
― Same noticeable effect of the appearance,
― But no correlation between sympathy and performance.
Impact of the user's gender
― Men react differentially and prefer redundant explanations,
― Women do not react differentially.
Stéphanie Buisine, LIMSI 2004
59. Turing test: conversational intelligence
Turing Test: Alan M. Turing (1912-1954), "the imitation game", in "Computing Machinery and Intelligence", Mind, Vol. 59, No. 236, pp. 433-460, 1950.
Can be replaced by a
Turing-like test, on
output data instead of
during direct interaction
60. Wizard of Oz methodology
Wizard of Oz: an experimenter pilots an avatar; the user does not know that he/she is facing a human (he/she believes he/she is interacting with a virtual agent).
61. Human-Agent interaction
Multiple inputs
Textual (text boxes), audio (microphone), video (camera), sensors (AmI environment),...
Multiple outputs
Textual, audio, facial expressions, postures, gestures, lighting, ...
Multiple models
Conversational, emotional, of the environment, of the user, ...
66. Conversational agents
Non-linguistic
― Textual / GUI: contextual help (pop-ups, ...), online manuals (hypertext navigation), assistants using a decision tree
― Embodied: state display (emotions, emotes)
Unidirectional, linguistic
― Textual / GUI: query fields embedded in an application window (keywords, free/open questions)
― Embodied: feature presenter
Conversational (bidirectional, linguistic)
― Textual / GUI: chat bots (Eliza), dialogical assistants (InterViews, Daft, Artemis, …)
― Embodied: mediators acting as the representative of an application or a service; avatars
67. The reference among chatbots
A. L. I. C. E.
Author: Richard S. Wallace Website: http://www.alicebot.org
Documentation : http://alicebot.org/TR/2001/WD-aiml-1.0.1-20011025-006.html
68. Jabberwacky wins the Loebner
Prize 2005
….
PROGRAM: Hello, my name is George and I am the human.
JUDGE: Hi George, cslkjslw weicmslw weos, right?
PROGRAM: Don't repeat your name, csthiang!
JUDGE: If lsjkel lwehlx wouldn't you say?
PROGRAM: No, he's still good.
JUDGE: Who.
PROGRAM: You and me.
JUDGE: If you are human, tell me which is larger, a salt crystal or the Empire State Building.
PROGRAM: It's caring about another person's well being more than your own.
JUDGE: What if I care about another person's well being more than I should?
PROGRAM: How would you do that?
JUDGE: Which is further away from us, Queens or Inner Mongolia?
PROGRAM: HA! HA! That was stupid. Tell me a GOOD joke this time.
JUDGE: You say you are human but you can't answer a simple question.
PROGRAM: Because I am alive.
JUDGE: You are not following the rules. You are supposed to convince me that you are a human, not convince me that you are mysterious.
PROGRAM: I can't very well convince one lacking a mind, can I?
JUDGE: It isn't my mind that is in question, it is yours!
PROGRAM: No my name is not katie.
JUDGE: Which is larger, your brain or Belgium?
PROGRAM: Oh the one on the far left without a doubt.
JUDGE: Francais.
PROGRAM: Yes do you speak it?
JUDGE: Francais.
PROGRAM: Francais?
JUDGE: Francais.
PROGRAM: O well I don't speak french.
JUDGE: Who.
PROGRAM: Mine.
JUDGE: The ljwli is what we want, right?
PROGRAM: I can not do it right now because I'm talking to you a million miles away.
JUDGE: What do you think the ljwli.
PROGRAM: Depends on the person.
JUDGE: But the ljwli.
PROGRAM: So they can chuck more wood?
…
69. Dialogue systems
The feedback loop is usually not taken into account: knowing the state of the dialogue (held by the Dialogue Manager, DM) could improve:
― the automatic speech recognizer
― the natural language understanding
The DM should also manage multimodality:
― as input (e.g. gestures, facial expressions, emotions)
― as output (e.g. emphasis, voice modulation, utterance + gesture)
70. Utterance understanding
Speech act theory (Searle&Vanderveken, 1985)
In accordance with Bratman's theory of
intention (Bratman, 1987)
Illocutionary acts (Vanderveken, 1990)
Formalisation: F(P)
5 classes of speech acts: assertive, commissive, directive, declarative, expressive
Transposition to MAS: the agent communication languages KQML and FIPA ACL (see the sketch below)
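As an illustration of the F(P) idea carried over to agent communication languages, here is a hedged sketch of FIPA-ACL-like messages as Python structures; the parameter names (performative, sender, receiver, content) follow the usual FIPA ACL fields, but the example contents are invented.

# Sketch of F(P) transposed to agent communication:
# a performative F applied to a propositional content P.
from dataclasses import dataclass

@dataclass
class AclMessage:
    performative: str   # the illocutionary force F (e.g. inform, request)
    sender: str
    receiver: str
    content: str        # the propositional content P
    language: str = "fipa-sl"

# "Assertive" act: the assistant informs about a fact.
msg1 = AclMessage("inform", "assistant", "user-proxy",
                  "(= (count button) 3)")
# "Directive" act: the user agent requests an action.
msg2 = AclMessage("request", "user-proxy", "assistant",
                  "(action assistant (restart counter))")

for m in (msg1, msg2):
    print(f"({m.performative} :sender {m.sender} :receiver {m.receiver} "
          f":content \"{m.content}\" :language {m.language})")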
71. Existing dialogue systems
Case-based systems using keyword spotting: chat bots
Communication protocols (FIPA ACL): MAS
Planning systems (Allen 80, Traum 96)
POMDPs: call centres (Frampton & Lemon 09)
Dialogue games (Maudet 00)
Information-State Update (Larsson 00): TrindiKit, GoDiS, IBiS (see the sketch below)
A scientific bottleneck that is still unsolved!
(see the Mission Rehearsal Exercise (Swartout et al. 2006))
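To give a feel for the Information-State Update approach, here is a much-simplified sketch: an explicit information state plus two update rules triggered by dialogue moves. The state fields and the rules are illustrative assumptions, not the actual TrindiKit/GoDiS rule sets.

# Much-simplified information-state update (ISU) sketch:
# the dialogue manager keeps an explicit state and applies
# update rules when dialogue moves are observed.

state = {
    "qud": [],          # questions under discussion (a stack)
    "beliefs": {},      # shared commitments
    "agenda": [],       # system's next moves
}

def integrate_user_ask(question: str):
    """Update rule: a user 'ask' move pushes the question on QUD
    and puts an answer move on the agenda."""
    state["qud"].insert(0, question)
    state["agenda"].append(("answer", question))

def integrate_system_answer(question: str, answer: str):
    """Update rule: answering resolves the matching QUD."""
    if question in state["qud"]:
        state["qud"].remove(question)
    state["beliefs"][question] = answer

integrate_user_ask("how-many-buttons")
integrate_system_answer("how-many-buttons", "3")
print(state)   # QUD empty again, belief recorded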
72. Ratio effort / performance
(Chart: actual performance, from 10% to 100%, plotted against effort in code and resources, from 1 to 1000. Chat bots sit at low effort and modest performance and serve chat, socialization, games and affects; H/M dialogue systems require far more effort; control, command and assistance tasks demand performance close to 100%.)
74. Kesako? (what is it?)
Affective Computing
● Book by R. Picard (1997)
● Detect, Interpret, Process and Simulate human emotions
How is it connected to ECAs?
● (Krämer et al., 2003)
People, when interacting with an ECA, tend to be more
polite, more nervous and behave socially
● (Hess et al., 1999)
Emotions play a crucial role in social interactions
→ Affective Conversational Agents!
75. The story so far...
(Timeline from the 1970s to the 2010s:)
― Emotions: R. Picard → Humaine → ACII
― Virtual agents: video games, CHI, IVA, serious games, simulation → online ECAs everywhere
― H-M dialogue: Eliza → TRAINS
― Agents: MAS → AAMAS
76. Links from the community
IVA (Intelligent Virtual Agents)
http://iva2012.soe.ucsc.edu/
ACII (Affective Computing & Intelligent
Interaction)
http://www.acii2011.org/
Humaine
● FP6 project (1/1/2004 – 31/12/2007)
● Association
● http://emotion-research.net/
78. Research questions
Detect, interpret, process and simulate human
emotions → affects
Cognitive Sciences
● AI and psychology (for emotions)
● AI and sociology (for social relations)
3 domains:
― Recognition: pattern recognition, signal processing, fusion, machine learning
― Models: AI, formal models of interactions, knowledge representation & reasoning, behaviour models
― Synthesis: image synthesis, synchronisation, turn taking
82. Emotion synthesis
Pipeline: from an annotated corpus, a parametrization maps emotions to expression parameters; state transitions and dynamics drive the expression over time; the model must generalize to emotions not present in the corpus.
Evaluation
● Human recognition of expressed emotions
not reliable, even in H-H interaction (with
spontaneous emotions)
Some examples:
● Virtual agents: Greta/Semaine
● Robots: Kismet, iCat, Nao
83. From annotations to animations
(Video stills: the original video vs. the GRETA reproduction, for anger and desperation.)
Emotion labels: Joy, Anger, Fear, Surprise, Distress (Joie, Colère, Peur, Surprise, Tristesse)
C. Pélachaud (LTCI) and J-C Martin (LIMSI)
85. A bit of Human Science
« The question is not whether intelligent machines can have
any emotion, but whether machines can be intelligent without
any emotions. »
Marvin Minsky, 1986
Darwin, 1872 (Ed. 2001)
The expression of emotions in man and animals
→ Adaptation mechanism
+ Role in communication (Ekman, 1973)
2 schools
● James, 1887
Organism changes → emotion
● Lazarus, 1984 ; Scherer, 1984
Appraisal theory
86. Appraisal theory
(Diagram:) the individual receives perceptions (stimuli) from the environment; they are evaluated in context against goals, preferences and personality; this evaluation produces an emotion, which in turn drives actions on the environment: adaptation, a.k.a. coping.
87. Appraisal theory (2)
Coping (adaptation strategy)
= the process that one puts between
himself/herself and the punishing event
● Active coping (i.e. problem focused)
→ act upon one's environment
● Passive coping (i.e. emotion focused)
→ act upon one's emotion
Rousseau & Hayes-Roth, 1997
● Personality → emotions (Watson & Clark, 1992)
● Emotions → Social relations (Walker, 1997)
88. Architecture
(Diagram: events and personality feed into emotions; emotions interact with mood, the social role and the social relations; the whole drives the agent's actions.)
89. Architecture
(Same diagram, with the processing steps numbered 1 to 3 from events through emotions to actions.)
90. Models of personality
Personality = the stable component of an individual's behaviour, attitudes, reactions...
Computational models
● Define variables
● Define their value domains
● Define the characteristic values
Eysenck, 1967
● Extraversion → Higher sensitivity to positive
emotions (and more expressive)
● Neuroticism → Higher sensitivity to negative
emotions (and more frequent changes)
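A hedged sketch of the "variables, value domains, characteristic values" recipe, applied to Eysenck-style traits modulating emotion sensitivity; the numeric ranges and the modulation formula are illustrative assumptions.

# Sketch: a personality as a set of bounded trait variables, used to
# modulate the intensity of appraised emotions (illustrative formula).
from dataclasses import dataclass

@dataclass
class Personality:
    extraversion: float  # in [0, 1]; higher -> more sensitive to positive emotions
    neuroticism: float   # in [0, 1]; higher -> more sensitive to negative emotions

def modulate(personality: Personality, emotion: str, intensity: float) -> float:
    """Scale a raw emotion intensity according to personality traits."""
    positive = emotion in {"joy", "hope", "relief", "pride"}
    if positive:
        gain = 0.5 + personality.extraversion      # 0.5 .. 1.5
    else:
        gain = 0.5 + personality.neuroticism
    return max(0.0, min(1.0, intensity * gain))

extravert = Personality(extraversion=0.9, neuroticism=0.2)
print(modulate(extravert, "joy", 0.6))    # amplified
print(modulate(extravert, "fear", 0.6))   # damped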
91. Models of personality (2)
McCrae, 1987 (a.k.a. OCEAN or the « Big Five » model)
● Openness → curiosity
● Conscientiousness → organization
● Extraversion → exteriorization
● Agreeableness → cooperation
● Neuroticism → emotional instability
Myers-Briggs Type Indicator (MBTI), based on Jung
● Energy → introverted or extraverted
● Information collection → sensing or intuitive
● Decision → thinking or feeling
● Action → judging or perceiving
93. Emotions models
What is an emotion ?
● Darwin, James, Ekman, Scherer, …
Characterizing an emotion
● Categories-based approaches
→ There are N emotions (N being subject to discussion)
● Dimensions-based approaches
→ An emotion is a point in an N-dimensional space (the dimensions must form a basis)
94. The PAD model
Mehrabian, 1980
Emotion = 3 dimensions
● Pleasure (or « valence »)
● Arousal (or « activation degree »)
● Dominance
(Figure: example emotions placed in the PAD space; see also the sketch below.)
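A minimal sketch of the dimensional view: an emotional state as a point in the pleasure-arousal-dominance space, labelled by the nearest prototype emotion. The prototype coordinates are rough illustrative values, not Mehrabian's published ones.

# Sketch of the PAD (pleasure, arousal, dominance) model: an emotional
# state is a 3D point; we label it with the nearest prototype emotion.
# Prototype coordinates are rough illustrative values in [-1, 1].
import math

PROTOTYPES = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.6,  0.7,  0.3),
    "fear":    (-0.7,  0.6, -0.6),
    "sadness": (-0.7, -0.4, -0.4),
    "relaxed": ( 0.6, -0.5,  0.3),
}

def closest_emotion(p: float, a: float, d: float) -> str:
    """Return the prototype label nearest to the (P, A, D) point."""
    return min(PROTOTYPES,
               key=lambda name: math.dist((p, a, d), PROTOTYPES[name]))

print(closest_emotion(0.7, 0.4, 0.5))    # -> joy
print(closest_emotion(-0.6, 0.65, -0.5)) # -> fear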
96. The OCC model
Ortony, Clore, Collins, 1988
20 emotion categories
● Joy – Distress
● Fear – Hope
● Relief – Disappointment, Fears-confirmed – Satisfaction
● Pride – Shame, Admiration – Reproach
● Love – Hate
● Happy-for – Resentment, Gloating – Pity
● Remorse – Gratification, Gratitude – Anger
97. OCC (cont.)
Appraisal model based on individuals' goals &
preferences:
98. One example: EMA
(Gratch & Marsella, 2010)
EMotions & Adaptation
→ Appraisal
→ Coping (action selection)
Based on SOAR
Evaluation criteria
● Relevance
● Point of view
● Desirability (expected utility)
● Probability & expectedness
● Cause
● Control (by me and others)
99. EMA (cont.)
Rule-based system
E.g. : Desirability(self, p) > 0 & Likelihood(self, p) < 1.0 → hope
Coping strategies
● Perceptions
● Belief revision (including about other agents'
intentions and responsibilities)
● Changing goals
Simulation model
Application to serious
games
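A hedged sketch of the rule-based appraisal idea illustrated by the hope rule above; the thresholds and the extra rules are illustrative assumptions, not the actual EMA rule set.

# Sketch of rule-based appraisal in the EMA spirit: appraisal variables
# (desirability, likelihood, ...) are mapped to emotion labels by rules.
def appraise(desirability: float, likelihood: float) -> str:
    """Toy appraisal rules over one proposition p, from self's viewpoint."""
    if desirability > 0 and likelihood < 1.0:
        return "hope"                       # the rule quoted above
    if desirability > 0 and likelihood >= 1.0:
        return "joy"                        # desirable and certain
    if desirability < 0 and likelihood < 1.0:
        return "fear"
    if desirability < 0 and likelihood >= 1.0:
        return "distress"
    return "neutral"

print(appraise(desirability=0.7, likelihood=0.4))   # hope
print(appraise(desirability=-0.5, likelihood=1.0))  # distress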
100. Social relations
Social behaviour model
Static models: Walker, Rousseau et al.,
Gratch...
Dynamic model: Ochs et al.
Several traits:
● Liking
● Dominance
● Solidarity (or social distance)
● Familiarity (subject to discussion)
→ Social relations are unidirectional and not always reciprocal!
101. Emotions and Social Relations
Influence of emotions on the social relation (examples)
● Ortony, 91: causing positively-valenced emotions → + liking
● Pride, Reproach → + dominance
Fear, Distress, Admiration → - dominance
Influence of the social relation on action and
communication
Action selection guided by target social relation
Emotional contagion
→ neighbours computation based on social
relations
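As a sketch of the "neighbours computation based on social relations", here is a toy emotional contagion step in which an agent's emotion intensity is pulled towards the emotions of the agents it likes; the liking-weighted averaging is an illustrative assumption.

# Toy emotional contagion: an agent updates its emotion intensity as a
# liking-weighted average of its neighbours' intensities (illustrative).
def contagion_step(own_intensity, neighbours, liking, susceptibility=0.3):
    """neighbours: name -> emotion intensity; liking: name -> liking in [0, 1]."""
    total_liking = sum(liking.values())
    if total_liking == 0:
        return own_intensity
    social_signal = sum(liking[n] * intensity
                        for n, intensity in neighbours.items()) / total_liking
    return (1 - susceptibility) * own_intensity + susceptibility * social_signal

# A calm agent (intensity 0.1) surrounded by two frightened agents it likes unequally.
print(contagion_step(0.1, {"B": 0.9, "C": 0.8}, {"B": 0.9, "C": 0.3}))  # ≈ 0.33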
102. One example : OSSE
Attitudes (Ochs et al., 2008)
Scenario
= set of events
Events
→ Emotions
→ Social relation
dynamics
Social roles
→ initial relation
103. Learn more !
Books
● Rosalind Picard, Affective Computing, MIT
Press
● Scherer, Bänziger, & Roesch (Eds.) A blueprint
for an affectively competent agent: Cross-
fertilization between Emotion Psychology,
Affective Neuroscience, and Affective
Computing. Oxford University Press
Next IVA conference → next September
humaine-news mailing list !
ACII in 2013
104. Acknowledgements
Jean-Paul Sansonnet (LIMSI) for his
presentation materials
All members of GT ACAI
105. Conclusion
Join the
GT ACAI!
http://acai.lip6.fr/