Presentation given at a seminar on "the impact of algorithms on fundamental rights", 22 March 2018, organized by the Dutch Ministry of the Interior and Kingdom Relations, Department of Constitutional Affairs. Jeroen van den Hoven is professor of ethics and technology at Delft University of Technology and scientific director of the Delft Design for Values Institute.
5. New Technology…Ethics
AI
Deep Learning
Robotics
Big Data
Quantum Computing
Cloud Computing
Blockchain
Internet of Things
Social media and online platforms
3D Printing
5G
▪Privacy
▪Accountability
▪Democracy
▪Property
▪Control
▪Safety
▪Security
▪Sustainability
▪Human dignity
▪Identity
▪Social cohesion
6. Converging stuff
▪ Digital Technologies are converging
▪ Ethical Issues are converging
▪ Data protection, equity, justice, autonomy,
accountability…
▪ Boundaries, criteria, definitions….
7. New functionality,
new vulnerabilities
▪ Constitutional Rights
▪ Privacy/person
▪ Freedom
▪ Equality/non discrimination
▪ Process/procedure/fairness
▪Cross-cutting considerations
9. Cross-cutting considerations
▪ Black Box Society and responsibility
▪ Big Nudging Society and autonomous choice
▪ Autonomous Systems and meaningful human
control
▪ Objectification and Privacy
▪ Algorithmic Citizenship and legal standing
▪ Computer-supported utilitarianism and human dignity
27. Algorithms
▪ Used to sift through Big Data and find
remarkable patterns
▪ Classify, categorize, label, characterize,
predict, decide, recognize, sense,
distinguish, represent, choose, target,
judge, sentence, refuse, nudge, coach,
profile, select…
28. Black Box Society: epistemic
insecurity
▪ Explainability
▪ Transparency
▪ Understanding
▪ Sense making
▪ Informed choice
▪ Decision making
▪ Autonomy
▪ Accountability
▪ Personal development
▪ New epistemic conditions will affect where you go, associate,
vote, think, ……
29. Example 2
CASS R. SUNSTEIN – WHY NUDGE?
“It is possible that
companies that
provide clear, simple
products would do
poorly in the
marketplace,
because they are not
taking advantage of
people's propensity
to blunder” (p. 11)
30. Big Data …Big Nudging
CASS SUNSTEIN
CHOICE
ARCHITECTURES
42. Big Data: Big Nudging or
brainwashing on steroids
▪ Manipulation
▪ Freedom and non-domination
▪ Arbitrary power
▪ Coercion
▪ Decision
▪ Intention
▪ Plan
▪ Sense of self and identity
46. AI and Lethal Autonomous
Weapon Systems
▪ Algorithm says: “You liked this target, you may also like that
target!”
▪ “Meaningful human control” (UN/CCW/IHL)
47. Meaningful human control: lost and
found
▪ Agency
▪ Control
▪ Autonomy
▪ Responsibility
▪ Liability
▪ Accountability
52. “Nothing ’bout Me”
▪ Lay my head on the surgeon's table
▪ Take my fingerprints if you are able
▪ Pick my brains, pick my pockets
▪ Steal my eyeballs and come back for the sockets
▪ Run every kind of test from a to z
▪ And you'll still know nothin' 'bout me
▪ Run my name through your computer
▪ Mention me in passing to your college tutor
▪ Check my records, check my facts
▪ Check if I paid my income tax
▪ Pore over everything in my c.v.
▪ But you'll still know nothin' 'bout me
▪ You'll still know nothin' 'bout me
▪ — Sting, “Nothing ’bout Me”
53. The person: moral autonomy
▪ Identity
▪ Objectification
▪ Classification
▪ Respect
▪ Dignity
▪ Privacy
▪ Personhood
▪ Moral autonomy
▪ The good life
54. DATA PROTECTION: Moral Reasons to
prevent
• Harm to people
• Exploitation
• Discrimination
• Manipulation
• Stigmatization
• Limitations of choice, freedom
• Loss of autonomy
• Lack of respect for persons
• Violation of human dignity
56. Summarizing: Power and
Vulnerability in a digital world
Knowledge (of the world, ourselves, and others)
Freedom, agency, control and efficacy (over actions in a
digital world)
Autonomy, choice, decisions, intentions and plans
Privacy, dignity, identity and moral autonomy
Citizenship, rights and entitlements
67. Eric Schmidt, CEO of Google
▪“We know who you are, where you
have been, and more or less what you
think”
68. Google Alphabet is everywhere
▪ 60,000 employees, annual revenue of 70 billion
▪ Advertising revenue
▪ Adwords
▪ Adsense
▪ Google Car
▪ Google Docs
▪ Google Glass
▪ YouTube
▪ Android
▪ Gmail
▪ Calendar
▪ Google Maps
▪ Google Play
▪ Google Hangouts
▪ Google Analytics
▪ Google Drive
▪ DoubleClick (number of attributes tracked per website visit > 100)
78. …Design turn in applied ethics…
Predicted long ago
(Cambridge UP, 2017)
79. Design for X
▪ Design for privacy
▪ Design for security
▪ Design for inclusion
▪ Design for sustainability
▪ Design for democracy
▪ Design for safety
▪ Design for transparency
▪ Design for accountability
▪ Design for responsibility
80. "I do not believe data
protection law is standing
in the way of your success"
"It's not privacy OR
innovation it's privacy AND
innovation"
"...there is a single
common inescapable
factor: Consumer Trust is
essential to achieving
growth"
ELIZABETH DENHAM – UK
INFORMATION COMMISSIONER
(Debuutrede 29 juli, 2016)