Social and Ethical reflections on Companion Robotics
1. “Do not call it a robot, call it a companion”
Social and Ethical reflections on Companion Robotics
5th Companion Robotics Institute (CRI) Workshop, 8th March 2011
AAL User-Centred Companion Robotics Experimentoria,
Supporting Socio-ethically Intelligent Assistive Technologies Adoption
Claire Huijnen, Smart Homes
CompanionAble Project: FP7 Grant Agreement Nr. 216487
2. Ethical Model
• The model provides insight into potential ethical issues in AAL
• Based on
• Ethical principles
• Potential risks that impede these ethical principles
• Risks are results from the SWAMI project (6th Framework Programme)
• Goal: to identify and analyse the social, economic, legal, technological and ethical issues related to AmI environments
• Different levels
• Apply Ethical Model
• Include ethical safeguards during development phase
3. Ethical model
On each level
4. Example
Level: User
Ethical principle: Autonomy might be diminished
Risk: Control/dependency
Questions:
Will people’s autonomy be diminished when the robot gives medicine reminders for them?
Will people rely on the robot in the long term? Like: “Oh, if I forget the medicine, it is no problem because the robot will tell me, so I do not have to think that much.”
Can the robot ask questions before reminding? For example: “There is something important you have to do today at this time: do you remember what it is?”
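The model's structure in this example (level → principle → risk → design questions) can be sketched as a small data structure. This is purely an illustration of the model's shape; the class and field names below are assumptions, not part of the project's actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class EthicalIssue:
    """One entry in the ethical model: a risk that impedes a principle at a given level."""
    level: str                # e.g. "User" (other levels could be carer, society, ...)
    principle: str            # the ethical principle at stake
    risk: str                 # the SWAMI-derived risk impeding that principle
    questions: list[str] = field(default_factory=list)  # design questions to raise

# The worked example from this slide, encoded as one entry:
medicine_reminders = EthicalIssue(
    level="User",
    principle="Autonomy might be diminished",
    risk="Control/dependency",
    questions=[
        "Will people's autonomy be diminished when the robot gives medicine reminders?",
        "Will people rely on the robot in the long term?",
        "Can the robot ask questions before reminding?",
    ],
)
```

Encoding entries this way would let a development team iterate over all issues per level when including ethical safeguards during the development phase.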
5. Scenarios CompanionAble
Short impression of scenarios of
EU FP7 Project CompanionAble
(video on next slide)
6. Scenarios Mobiserv
Short impression of scenarios of
EU FP7 Project Mobiserv
(video on next slide)
7. Privacy
Need for social situation awareness, assessment and differentiation
– Selective and appropriate reacting, prompting and reminding
– Neutral signalling option
Need for controlling degrees of the system’s alertness / presence
– Need for differentiation (‘not now’ versus ‘red alert’)
Need for profiling and personal adaptation
Need for transparency (understand behaviour, state, goal)
“The toilet and the bathroom are places where things go wrong quite often, so the system should be able to offer help here. I would not bother about privacy here.”
“The robot should only be used for the things you want it to do. When you have a handicap, you would need help everywhere. This could be anywhere, so I would not say that he cannot go somewhere very easily. The system has to be able to do its job.”
8. Privacy
“He cannot go to the bedroom? Well… I think that if he can go to the kitchen, and has functions that people would also accept in the bedroom and bathroom, then people also want support there.”
“Many people have quite some keys spread around to, for instance, 6 or 7 people in the neighborhood, for their safety. But all these people can just enter their home, so what is privacy?”
Trade-offs
– Need to be contextualised
– Differ in time, place, purpose, condition, social setting
“It is not a bad thing that someone from outside can have a look around in your home, as long as it is used for assistance.”
“When you get dependent, privacy gets less relevant; you want high-quality care.”
9. How to address people?
Pay attention to the system’s:
• Embodied interaction
• “Voice” and dialogue
• “Mood” or “vibe”
• Tone of voice
• Reminding or stimulating style
• Degree of pro-activeness / initiative
• “Character” of the system
• Level of surprise / predictability
“Some encouraging expressions (like ‘Fantastisch’) are exaggerated and inappropriate. I am not a child.”
“I do not need any pats on my back.”
“The voice goes down, does not sound happy.”
“In Friesland you would need a local dialect, otherwise people would not understand the robot, plus the robot would not understand the people.”
“The wording ‘I detected an emergency’ is terrible! Is this what you say to ordinary people?”
10. Control
“I don’t know what I am doing.”
“The system reduces my fear and anxiety and makes me feel more confident; it gives me the feeling that it would look after you and your health.”
Support versus stimulating own activity
Smart dialogue to trigger / stimulate people
“Do you want the robot to follow you all day long or not? It is like a cell phone; do you always have it turned on or not? This is something you decide for yourself.”
“If I have a problem I will turn it off and that’s it.”
11. Building up trust
Consultation and cooperation with professional carers builds trust
(pharmacy, physiotherapist, gerontologist, ergotherapist)
Understanding and expectation management (stereotypes)
Intelligent dialogue
Recognition of appropriate support
Positive reinforcements and encounters
“For specific nutrition or people with certain restrictions, it would be good when a professional enters the reminders and diets. The informal carer can update the professional.”
“This cannot be more distant.”
“Who will do the combination of the medicines and the filling of the bags? The pharmacy?”
“And he only gives the right dose of medicine? This cannot go wrong?”
“Can I trust that the right dose of medicine is given? No mistakes are possible?”
“But these are medical data, so they should be very precise. I would not trust it, and therefore assign less value to it.”
12. When Companion?
Effect of embodiment
Additional dimension (pressure, challenge, motivation)
Truly knows people’s preferences, when/how to react, what/how to present
Guiding and suggesting support
Keeps an eye on you
Need for personas of the companion system
(reserved <> retiring <> entertaining <> invitational <> pro-active, …)
“Stop using the word ‘robot’, call it a companion.”
“Good and safe that the robot keeps an eye on you. If you choose to have a robot, you have to be open to this. Some things you will allow, but some others you do not want to do or learn anymore at a certain age.”
“People will probably find a name for the robot: less distant, less technical, sounds better; short names, easy to remember.”
“But seen as a robot it is too cold, too hard for me. When I came here I thought it was a device that carried out tasks, but a buddy is better.”
“If it becomes a buddy then I want to use it definitely, but if it’s a robot, then no.”