Gianmarco Veruggio. Roboethics at Skolkovo Robotics
Roboethics: Philosophical,
Social and Ethical
Implications of Robotics
Gianmarco Veruggio
Director of Research, Italian National Research Council
Istituto di Elettronica e di Ingegneria dell'Informazione e delle Telecomunicazioni
Honorary President of Scuola di Robotica
From Industrial Robotics to Service Robotics
About 16,100 service robots for professional use were sold in 2012,
2% more than in 2011, reports the IFR Statistical Department in the
new study "World Robotics 2013 - Service Robots".
2
Service Robotics Overview
• Field robotics
• Professional cleaning
• Inspection and maintenance systems
• Construction and demolition
• Logistic systems
• Medical robotics
• Defense, rescue & security applications
• Underwater systems
• Mobile platforms in general use
• Robot arms in general use
• Public relation robots
• Humanoids
3
Advanced Robotics Technical Issues
New technical issues arising from:
• HW/SW complexity
• Autonomy
• Uncertainty, deriving from the unstructured and chaotic real environment
• Unpredictability of learning machines
• Traceability of evaluation/action procedures
• Identification of robots
• Cyber security
6
Advanced Robotics ELS issues
Ethical, Legal and Societal issues:
• Replacement of human beings (economic problems; human
unemployment; social instability);
• Digital divide;
• Generational divide;
• Lack of legislation;
• Privacy;
• Psychological problems (deviations in human emotions,
problems of attachment, disorganization in children, fears,
panic, confusion between real and artificial, feeling of
subordination towards robots)
7
Robotics: a new science?
Robotics is born from…
•Mechanics
•Electrical Engineering
•Electronics
•Automation
•Cybernetics
•Computer Science
•Artificial Intelligence
•Information Technology
…and it draws some
elements from:
•Physics/Math
•Logic/Linguistics
•Neuroscience/Psychology
•Biology/Physiology
•Anthropology/Philosophy
•Art/Industrial Design
Robotics Gestalt
The whole is greater than the sum of the parts!
8
Robots in Human History
Robots come from an ancient myth and vision: the word "automaton"
is the Latinization of the Greek αὐτόματον, "acting of one's own will".
This word was first used by Homer (8th century BC) to describe the
automatic opening of doors and the self-propelled movement of wheeled
tripods. He told of metallic statues made animate by the divine smith
Hephaistos, and of figures manufactured by the great Athenian craftsman
Daedalus.
9
The History of Automatons
In reality, Heron of Alexandria (c. 10 – 70 AD) was an ancient
Greek mathematician and engineer who wrote the book
"Automata", a description of machines that enabled wonders in
temples by mechanical or pneumatic means (e.g. the automatic
opening or closing of temple doors, statues that pour wine, etc.).
The story of automatons continued through the 19th century
(the period from 1860 to 1910 is known as "The Golden Age of
Automata").
The Writer by Pierre Jaquet-Droz
Tea Serving Doll by TAMAYA Shobei IX
10
The birth of Robots in Literature
The first automata to be called "robots" were the mechanical slaves in
the play R.U.R. (Rossum's Universal Robots) by Karel Čapek.
When the play premiered in 1921, it introduced the word into the
world's vocabulary.
A few years later, in 1927, Fritz Lang created the character of
Maria/Robotrix in his movie Metropolis.
11
Human tendency to Anthropomorphization
Giving human characteristics to animals, inanimate objects or
natural phenomena is a human trait called "anthropomorphizing".
The term ανθρωπομορφισμός (anthropomorphism) was coined by
the Greek philosopher Xenophanes (c. 570 – c. 475 BC) when
describing the similarity between religious believers and their gods
(that is, Greek gods were depicted with light skin and blue eyes,
while African gods had dark skin and brown eyes).
Anthropomorphism carries many important implications.
For example, thinking of a nonhuman entity in human ways renders
it worthy of moral care and consideration.
In addition, anthropomorphized entities become responsible for
their own actions — that is, they become deserving of punishment
and reward.
12
The Pinocchio Syndrome
I coined the term "Pinocchio Syndrome" to describe this trait of
human psychology applied to automata/robots, which are regarded
as sub-human beings that will evolve into humans.
The Adventures of Pinocchio
Carlo Collodi, 1883
A.I. Artificial Intelligence
Steven Spielberg, 2001
13
Pay Attention to the Flaws in Reasoning!
DOGS have four legs,
The THING that I see here has four legs,
therefore
The THING that I see here is a DOG!
?
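Viewed formally, this is the fallacy of affirming the consequent: a shared property does not establish membership in the category. A minimal first-order sketch (the symbols Dog, FourLegs and thing are illustrative only, not from the slide):

\[
\forall x\,\bigl(\mathrm{Dog}(x) \rightarrow \mathrm{FourLegs}(x)\bigr),
\qquad \mathrm{FourLegs}(\mathrm{thing})
\;\not\vdash\; \mathrm{Dog}(\mathrm{thing})
\]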
14
From Science Fiction…
Isaac Asimov wrote the famous Three Laws of Robotics (1942):
1. A robot may not injure a human being or, through inaction, allow
a human being to come to harm.
2. A robot must obey orders given it by human beings except where
such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection
does not conflict with the First or Second Law.
15
… to Reality!
The main application field of robotics today is defence: about
6,600 service robots in defence applications accounted for 40% of the
total number of service robots for professional use sold in 2011.
The value of defence robots can only be roughly estimated; it was
about US$ 748 million, 3% higher than in 2010.
According to the forecast, about 28,000 robots for
defence applications will be sold
in the period 2013-2016. They
are followed by milking robots,
with about 24,500 units. These
two service robot groups make
up 55% of the total forecast of
service robots. (IFR 2013)
16
Which way for Robotics?
"Would you tell me please, which way
I ought to go from here?" asked Alice.
"That depends a good deal on where
you want to get to," said the Cat.
17
Roboethics Definition
“Roboethics is an applied ethics whose objective is to
develop scientific/cultural/technical tools that can be shared
by different social groups and beliefs. These tools aim to
promote and encourage the development of Robotics for the
advancement of human society and individuals, and to help
prevent its misuse against humankind.” (Veruggio, 2002)
18
The Birth of Roboethics
The School of Robotics organized the
First International Symposium on
Roboethics, 30-31 January 2004,
Villa Nobel, Sanremo, Italy
Philosophers, jurists, sociologists,
anthropologists and moralists, together
with robotics scientists, were called upon
to contribute to laying the foundations of
Roboethics: the ethics of the design,
development and employment of
intelligent machines.
19
What is Roboethics
The first level is represented by the adopted ethical theories,
developed principally by the branch of philosophy called ethics or
morality, which studies human conduct, moral assessments and the
concepts of good and evil, right and wrong, justice and injustice.
This is the proper concept of "Roboethics", meaning applied
ethics that attempts to provide answers to new questions that are
generated by the progress of robotics. This level updates the
various views on concepts such as dignity and integrity of the
person and the fundamental rights of the individual, as well as the
social, psychological and legal aspects involved.
21
What is Robot Ethics
The second level, currently referred to as "Robot Ethics" or
"Machine Ethics", regards the code of conduct that designers
implement in the Artificial Intelligence of robots. This means a
sort of Artificial Ethics able to guarantee that autonomous robots
will exhibit ethically acceptable behavior. It is clear that the
guidelines defining what is ethically acceptable, and the means to
enforce them, are the product of the abovementioned field of Roboethics.
Robots are, in fact, machines: tools that are unaware of
the choices made by their human creators, who therefore bear
the moral responsibility for the actions, good or bad, of robots.
22
What is Robot’s Ethics
Finally, there is a third level, which we could perhaps define as
"Robot's Ethics", because it is the ethics born from the subjective
morality of a hypothetical robot that is equipped with a conscience
and the freedom to choose its own actions on the basis of a full
comprehension of their implications and consequences. Only in
this case may robots be deemed moral agents, and only then may
one speak of the responsibilities or rights of robots.
23
Roboethics Taxonomy
• Humanoids: Artificial Mind, Artificial Body
• Advanced production systems: Industrial robotics
• Adaptive robot servants and intelligent homes: Indoor Service Robots, Ubiquitous Robotics
• Network Robotics: Internet Robotics, Robot ecology
• Outdoor Robotics: Land, Sea, Air, Space
• Health Care and Life Quality: Surgical Robotics, Bio-Robotics, Assistive Technology
• Military Robotics: Intelligent Weapons, Robot Soldiers, Superhumans
• Edutainment: Educational Robots, Robot Toys, Entertainment, Robotic Art
24
Focus on Military Robotics
This field comprises all the devices resulting from the
development of traditional systems by means of robotics technology:
• Integrated Defense Systems: A.I. systems for intelligence and
surveillance, controlling weapons and aircraft capabilities.
• Unmanned Ground Vehicles (UGVs) and autonomous tanks: armored
vehicles carrying weapons and/or tactical payloads.
• Intelligent bombs and missiles.
• UAVs (Unmanned Aerial Vehicles): also referred to as autonomous
flying vehicles (AFVs) or drones; unmanned spy planes and
remotely piloted bombers.
• ASVs (Autonomous Surface Vessels): patrol boats.
• AUVs (Autonomous Underwater Vehicles): intelligent torpedoes
and autonomous submarines.
25
Robot Soldiers
Robot Soldiers: Eventually, humanoids may be employed to
substitute for humans in performing "sensitive" tasks and missions in
environments populated by humans. The main reason for using
humanoids is to permit a one-for-one substitution, without
modifying the environment, the human-to-human interaction
or the rules of engagement. This could be required in many
different scenarios where safeguarding human life is considered
a priority:
• Urban Terrain Combat
• Indoor security operations
• Patrol
• Surveillance
26
Superhuman
Superhuman: there are several projects aimed at developing a
superhuman soldier. Indeed, the human body cannot perform
tasks with the strength, speed and fatigue resistance of
machines.
The term "augmentation" denotes the possibility of
extending humans' existing capabilities through
wearable robot exoskeletons, creating superhuman
strength, speed and endurance.
• Artificial Sensor Systems
• Augmented Reality
• Exoskeletons
27
Benefits
The claimed benefits of military robots are:
a) Tactical/operational strength superiority;
b) Better performance of superhuman soldiers compared to human soldiers;
c) Limited loss of human lives in the robotized army;
d) Unemotional behavior, potentially more ethical than that of humans.
28
Issues
Main problems could arise from:
a) the inadequacy of robots in managing the unstructured complexity of a
hostile scenario;
b) the unpredictability of machine behaviour;
c) the increased risk of starting a video-game-like war, due to the
decreased perception of its deadly effects;
d) unpredictable side effects on the civilian population;
e) the place of humans in the control hierarchy and the transparency of robots;
f) psychological issues of humans in robotized environments
(mixed teams);
g) the accountability and responsibility gap;
h) the assignment of liability for misbehaviours or crimes.
29
Under Spotlight: USA Drones
These vehicles are known as "autonomous combat flying
vehicles" (ACFVs), or more commonly as "drones".
While such vehicles are autonomous robots as far as flying is
concerned (including take-off and landing), officially they can
fire lethal weapons only on human command.
At present (early 2013) there are an estimated 7,500 drones in the
US military arsenal; many are flown on secret missions by the
Central Intelligence Agency (CIA).
While 10 years ago the United States was the only country
possessing autonomous flying vehicles, CNN now estimates that
70 countries have AFV programs, although only about 15 of these have
military drones. (CNN 2012)
30
Drone ELS Issues
The ELS issues arising from the use of drones, in both the civil and
military fields, include the following:
• Lack of legislation
• Privacy
• Data Security
• Cyber security
• Terrorism
• Collateral Damages
• Push-Button War
• Undeclared Wars
31
Collateral Damages
Despite the increasing success of this technology, military
hierarchies are concerned about potential dangers:
• Drones sometimes fall accidentally, possibly harming people and
damaging property.
• Daily news from war theatres reports the unintended injury or death
of innocent non-combatants (usually known as "collateral damage").
• There is potential for friendly-fire casualties on crowded battlefields,
or casualties due to enemy hacking/hijacking.
32
Push-Button War
Drones flying over Afghanistan or various targets in Africa are
controlled from Creech Air Force Base (near Las Vegas) or from a
base in New Mexico, thousands of kilometres away from the vehicles
themselves.
The very fact that the human
controllers who release the weapons
are very far away, so that they see
the blood and destruction not directly
but only through the drone's cameras,
means that for some of them such
activities feel more like a video game
than the killing and destruction
of human beings.
33
Undeclared Wars
Drones are used to attack suspected terrorists in countries that
(officially) are not at war with the US. Hence, they are used in
undeclared wars. This may be a violation of international law,
and it certainly raises ethical issues.
Conversely, drones can quite easily be used by terrorists to
hit targets in any country of the world, bringing undeclared wars
everywhere.
34
Lack of International Conventions or Agreements
It is clear that military robots are here and they have changed the
nature of warfare dramatically.
However, there are currently no international treaties or
agreements governing their use, which raises serious ethical
questions.
Military robotics should be thoroughly examined by specialized
international organizations, as happens with every other type of military
technology, and regulated by international conventions or
agreements.
35
The Basic and Underlying Ethical Issue
Prior to discussing
"when, how, and where",
we should decide
"IF"
a fully autonomous robot
can be allowed to kill a
human.
36
Thank You!
Contact Information:
Gianmarco VERUGGIO
Consiglio Nazionale delle Ricerche
Istituto di Elettronica e di Ingegneria dell'Informazione e delle Telecomunicazioni
Via De Marini, 6 - 16149 Genova, Italia
Email gianmarco@veruggio.it
Tel. +(39) 010-6475616
Mob. +(39) 338-9431561
Fax +(39) 010-6475200
38