1. IN3 Research Seminar
Internet Interdisciplinary Institute - Universitat Oberta de Catalunya
Barcelona, 23 September 2015
Personal data for decisional purposes in the age of
analytics: from an individual to a collective dimension
of data protection
Alessandro Mantelero
Politecnico di Torino
Nexa Center for Internet and Society
Nanjing University of Information Science & Technology (NUIST)
2. Personal data for decisional purposes
Overview
I. Predictive knowledge and collective behaviour
II. Group privacy
III. A new dimension of protection: collective data protection
IV. The representation of collective interests
3. Predictive knowledge and
collective behaviour
Big data: a new paradigm
• predictive analysis: from causation to correlation
• ‘transformative’ use of data
Big data analytics make it possible to infer predictive
information from large volumes of data in order to acquire
further knowledge about individuals and groups, knowledge
that may be unrelated to the initial purposes of data
collection.
A new representation of our society
Analytics group people with the same qualitative attributes
and habits (e.g. low-income people, “working-class mom”,
“metro parents”) and predict the future behaviour of these
clusters of individuals.
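The kind of segmentation described above can be sketched in a few lines of Python (the names, attributes and the churn-risk rule are all hypothetical): individuals sharing the same qualitative attributes are grouped into a cluster, and a prediction made for the cluster is then applied to every member.

```python
from collections import defaultdict

# Hypothetical consumer records: (name, income_band, household_type)
people = [
    ("A", "low-income", "urban"),
    ("B", "low-income", "urban"),
    ("C", "mid-income", "suburban"),
    ("D", "low-income", "urban"),
]

# Cluster individuals by their shared qualitative attributes
clusters = defaultdict(list)
for name, income, area in people:
    clusters[(income, area)].append(name)

# A prediction computed for the cluster is applied to every member,
# regardless of how any single individual actually behaves.
predicted = {segment: "high churn risk" if segment[0] == "low-income"
             else "low churn risk"
             for segment in clusters}

for segment, members in clusters.items():
    print(segment, members, "->", predicted[segment])
```

The point of the sketch is the last step: the inference attaches to the cluster as a whole, which is why its effects are collective rather than purely individual.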
4. Predictive knowledge and
collective behaviour
Case I
A health insurance company extracts predictive
information about the risks associated with segments of its
clients on the basis of their primetime television usage,
propensity to buy general merchandise, ethnicity, geography,
or status as a mail-order buyer.
Case II
A credit company uses the “neighborhood’s general credit
score or range” (a score defined on the basis of aggregate
credit scores) to provide loans to the people living in a given
neighbourhood in ways that bear no relationship to their
personal conditions.
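Case II can be illustrated with a minimal sketch (the neighbourhood names, scores and threshold are invented): the lending decision depends on the neighbourhood average, not on the applicant's own score.

```python
# Hypothetical individual credit scores grouped by neighbourhood
scores = {
    "Elm Park": [780, 790, 800, 350],   # one low-score resident
    "Oak Hill": [620, 610, 630, 640],
}

THRESHOLD = 650  # invented cut-off for loan approval

# The lender decides on the neighbourhood average instead of the
# individual score.
for hood, residents in scores.items():
    avg = sum(residents) / len(residents)
    for personal in residents:
        decision = "approve" if avg >= THRESHOLD else "deny"
        # The decision bears no relationship to the personal score:
        # the 350-score resident of Elm Park is approved, while the
        # 640-score resident of Oak Hill is denied.
        print(hood, personal, decision)
```

Elm Park averages 680 (everyone approved, including the 350-score resident) and Oak Hill averages 625 (everyone denied, including the 640-score resident), which is exactly the mismatch between collective and personal conditions that the case describes.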
Case III
“PredPol” software helps police anticipate, prevent and respond
to crime more effectively, but may create “self-fulfilling cycles of bias”.
5. Predictive knowledge and
collective behaviour
A “categorical” approach
Predictions based on correlations do not only affect
individuals, who may act differently from the rest of the
cluster to which they have been assigned, but – due to the
collective dimension of clusters – also affect the whole
group and set it apart from the rest of society.
Do we need a new collective dimension of data protection?
“un nouveau régime de vérité” (Rouvroy)
“A map is not the territory” (Korzybski)
6. Group privacy
Privacy scholars have devoted comparatively little attention to group
privacy and collective interests in data processing.
Bloustein (group privacy)
“Group privacy is an extension of individual privacy […] The
interest protected by group privacy is the desire and need of
people to come together, to exchange information, share
feelings, make plans and act in concert to attain their
objectives”
Westin (organizational privacy)
“Privacy is a necessary element for the protection of
organizational autonomy, gathering of information and advice,
preparation of positions, internal decision making, inter-
organizational negotiations, and timing of disclosure”
7. Group privacy
Bygrave (data protection)
Group privacy refers to information that identifies and
describes the group (e.g. contact addresses, profits, and capital
turnover). Group privacy protects information referring to
collective entities and is a sort of extension of individual data
protection to these entities.
Theories about group privacy are mainly based on the model of
individual rights:
• Privacy and data protection relate to given individuals,
who are members of a group, or to the group itself as an
autonomous collective body.
• These theories are consistent with the theoretical studies on
group theory in the field of sociology (individualistic theory,
organic theory).
8. A new dimension of
protection
In the Big Data era, data gatherers:
• shape the population they intend to investigate
• collect information about various people who do not know the
other members of the group and are often unaware of the
consequences of being part of it (consumer profiling,
scoring solutions and predictive policing applications).
We are neither in the presence of forms of analysis that involve
only individuals, nor in the presence of groups in the traditional
sociological meaning of the term (lack of consciousness, lack of
interactions)
The new scale entails the recognition of another layer, represented
by the rights of groups to the protection of their collective
dimension of privacy and data.
9. A new dimension of
protection
Collective rights are not necessarily a large-scale
representation of individual rights and related issues.
Collective data protection concerns non-aggregative collective
interests, which are not the mere sum of many individual
interests.
The protection of groups from potential harms related to invasive
and discriminatory data processing is the most important
interest in this context.
The collective dimension of data processing is mainly focused on
the use of information, rather than on intimacy and data quality.
10. A new dimension of
protection
Discrimination:
- The unjust or prejudicial treatment of different categories of
people
- The recognition and understanding of the difference between
one thing and another
Cases in which big data analytics provide biased representations of
society:
- Involuntary forms of discrimination (StreetBump app to detect
potholes, Progressive case)
- Voluntary forms of discrimination (commercial group profiling,
predictive policing, credit scoring)
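The StreetBump example is an instance of sampling bias, which can be sketched as follows (the districts, pothole counts and smartphone rates are invented): two districts with identical road conditions generate very different report volumes, so allocating repairs by report count reproduces the underlying disparity.

```python
# Involuntary discrimination through biased sampling:
# pothole reports come only from residents with smartphones,
# even though both districts have the same number of potholes.
districts = {
    "wealthy": {"potholes": 100, "smartphone_rate": 0.9},
    "low_income": {"potholes": 100, "smartphone_rate": 0.3},
}

# Reports observed by the app, not potholes that actually exist
reports = {d: round(v["potholes"] * v["smartphone_rate"])
           for d, v in districts.items()}

# Allocating repairs proportionally to reports reproduces the bias
total = sum(reports.values())
repair_share = {d: n / total for d, n in reports.items()}

print(reports)        # {'wealthy': 90, 'low_income': 30}
print(repair_share)   # wealthy district gets 75% of repairs
```

No one intended to disadvantage the low-income district; the skew comes entirely from who is represented in the data, which is what makes this form of discrimination “involuntary”.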
11. The representation of
collective interests
Big data and collective interests
In the big data context, data subjects are not aware of the
identity of the other members of the group, have no relationship
with them and have a limited perception of collective issues.
Groups shaped by analytics have a variable geometry, and
clusters of individuals can be moved from one group to another.
The partially hidden nature of processes and their complexity
probably make it difficult to bring timely class actions.
Other cases of power imbalance:
- Workplace
- Consumer protection and environmental protection
12. The representation of
collective interests
Big data and power imbalance:
Lack of awareness of the implications of data processing.
It is difficult for data subjects to negotiate their information and
to take a position against the illegal processing of their data.
Entities that represent collective interests are less affected by
situations of power imbalance and also have a more complete
view of the impact of the specific policies and decisions adopted
by data gatherers.
13. The representation of
collective interests
A preventive approach:
- to understand how data processing affects collective interests
- to identify the potential stakeholders
- to tackle the risks of hidden forms of data processing
The risk assessment should adopt a multi-stakeholder approach
and evaluate not only the impact on data protection, but also
ethical and social impacts.
Entities representative of collective interests should be involved in
the processes of risk assessment (right to participate)
14. The representation of
collective interests
The selection of the independent authority responsible for the
protection of collective interests: a matter for policymakers to
decide.
Many countries already have independent bodies focused on social
surveillance and discrimination:
- Competences spread across various authorities
- Different approaches, resources, and remedies
- Lack of cooperation
The potential role of Data Protection Authorities
15. Main references
- Alan F. Westin, Privacy and Freedom (Atheneum 1970).
- Alessandro Mantelero, ‘The future of consumer data protection in the E.U.
Rethinking the “notice and consent” paradigm in the new era of predictive
analytics’ (2014) 30(6) Computer Law & Security Review 643-660.
- Antoinette Rouvroy, ‘Des données sans personne: le fétichisme de la donnée
à caractère personnel à l'épreuve de l'idéologie des Big Data’ (2014) 9
<http://works.bepress.com/antoinette_rouvroy/55> accessed 8 March 2015
- David Bollier, The Promise and Peril of Big Data (Aspen Institute,
Communications and Society Program 2010)
http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promi
se_and_Peril_of_Big_Data.pdf [accessed 27.02.14].
- Cynthia Dwork and Deirdre K. Mulligan, ‘It’s not Privacy and It’s not Fair’
(2013) 66 Stan. L. Rev. Online 35.
- danah boyd and Kate Crawford, ‘Critical Questions for Big Data:
Provocations for a Cultural, Technological, and Scholarly Phenomenon’
(2012) 15(5) Information, Communication, & Society 662-679.
- Danielle Keats Citron and Frank Pasquale, ‘The Scored Society: Due
Process For Automated Predictions’ (2014) 89 Wash. L. Rev. 1.
- Danielle Keats Citron, ‘Technological Due Process’ (2008) 85(6) Wash. U. L.
Rev. 1249, 1312.
16. Main references (cont.)
- David Wright, ‘A framework for the ethical impact assessment of information
technology’ (2011) 13 Ethics Inf. Technol. 199–226.
- Edward J. Bloustein, Individual and Group Privacy (Transaction Books 1978).
- Frank Pasquale, The Black Box Society. The Secret Algorithms That Control
Money and Information (Harvard University Press 2015).
- Fred H. Cate and Viktor Mayer‐Schönberger, ‘Data Use and Impact. Global
Workshop’ (The Center for Information Policy Research and The Center
for Applied Cybersecurity Research, Indiana University 2013) iii
http://cacr.iu.edu/sites/cacr.iu.edu/files/Use_Workshop_Report.pdf [accessed
27.02.14].
- FTC, Data Brokers: A Call for Transparency and Accountability (2014)
https://www.ftc.gov/system/files/documents/reports/data-
brokers-call-transparency-accountability-report-federal-trade-commission-
may-2014/140527databrokerreport.pdf [accessed 27.02.14].
- Ira S. Rubinstein, ‘Big Data: The End of Privacy or a New Beginning?’ (2013)
3 (2) International Data Privacy Law 74-87.
- Kate Crawford, ‘Algorithmic Illusions: Hidden Biases of Big Data’,
presentation at Strata 2013, https://www.youtube.com/watch?v=irP5RCdpilc
[accessed 15.03.15].
- Latanya Sweeney, ‘Discrimination in Online Ad Delivery’ (2013) 56(5)
Communications of the ACM 44-54.
17. Main references (cont.)
- Lee A. Bygrave, Data Protection Law. Approaching Its Rationale, Logic and
Limits (Kluwer Law International 2002).
- Mireille Hildebrandt and Serge Gutwirth (eds.), Profiling the European
Citizen. Cross-Disciplinary Perspective (Springer 2008).
- Omer Tene and Jules Polonetsky, ‘Privacy in the Age of Big Data. A Time for
Big Decisions’ (2012) 64 Stan. L. Rev. Online 63-69
http://www.stanfordlawreview.org/sites/default/files/online/topics/64-SLRO-
63_1.pdf [accessed 13.03.15].
- The White House, Executive Office of the President, ‘Big Data: Seizing
Opportunities, Preserving Values’ (2014)
http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_m
ay_1_2014.pdf [accessed 27.12.14].
- Viktor Mayer-Schönberger and Kenneth Cukier, Big Data. A Revolution That
Will Transform How We Live, Work and Think (John Murray 2013).