1. Is privacy a matter of
transparency and control?
Towards a Privacy Adaptation Procedure
2. Outline
1. Beyond transparency and control
Privacy calculus, more paradoxes, and bounded rationality
2. Privacy nudging
A solution inspired by decision sciences... with some flaws
3. Privacy Adaptation Procedure
Adaptive nudges based on a contextualized
understanding of users' privacy concerns
INFORMATION AND COMPUTER SCIENCES
4. The state of privacy, 2013
5. A model by Smith et al. 2011
Why aren't these more
strongly related?
Transparency
Control
6. Transparency and control
Transparency: informed consent. "companies should provide clear descriptions of [...] why they need the data, how they will use it"
Control: user empowerment. "companies should offer consumers clear and simple choices [...] about personal data collection, use, and disclosure"
7. Examples from your work
Na Wang, Heng Xu, and Jens Grossklags
"Our new designs encompass control and awareness as the essential dimensions of users' privacy concerns in the context of third-party apps on Facebook."
Jessica Vitak
"Users may also employ advanced privacy settings to segregate audiences, so they can still share relevant content with their various connections"
8. Examples from your work
Stacy Blasiola
"By drawing awareness to the issue, users will be better equipped to understand the vulnerabilities posed by third party applications"
Ralf De Wolf and Jo Pierson
"different audience management strategies [...] can be used as a framework for access control models and/or feedback and awareness tools"
9. The Transparency Paradox
Useful for concerned users,
but bad for others
Makes them more fearful
Any mention of privacy,
whether it is favorable or not,
triggers privacy concerns
11. The Control Paradox
Decisions are too numerous
Most Facebook users don't know the implications of their own privacy settings!
Decisions are difficult
Uncertain and delayed
outcomes
Control gives a false sense of
security
12. Beyond control
Eden Litt
"while many sites give users a variety of buttons and dashboards to help them technologically enforce their privacy, these features are only useful if users are aware that they exist, know where to find them, and use them effectively"
Ralf De Wolf and Jo Pierson
"in an online environment managing privacy becomes a time-consuming chore"
13. Example: Facebook
"bewildering tangle of options" (New York Times, 2010)
"labyrinthian" controls (U.S. Consumer Magazine, 2012)
14. Example: Knijnenburg et al.
Introducing an "extreme" sharing option
[Figure: location-sharing options plotted on benefits vs. privacy axes: Nothing, City, Block]
Add the option Exact
Expected: some will choose Exact instead of Block
Unexpected: sharing increases across the board!
bit.ly/chi2013privacy
15. Bounded rationality
People's decisions are inconsistent and seemingly irrational:
- Framing effects
- Default effects
- Order effects
Transparency:
Information overload
Control:
Choice overload
18. Example: Lai and Hui
Figure 4 (Lai and Hui): subjects were assigned one of the following conditions on the registration page; percentages are mean participation levels per condition.
A. "Please send me Vortrex Newsletters and information." 25%
B. "Please do not send me Vortrex Newsletters and information." 37%
C. "Please send me Vortrex Newsletters and information." 53%
D. "Please do not send me Vortrex Newsletters and information." 0%
19. Summary of part 1
We need to move beyond
control and transparency
Rational privacy decision-
making is bounded
Transparency and control
increase choice difficulty
21. A new model
Jessica Vitak
"it is likely that users employ a number of strategies when making decisions regarding what and with whom to share content online"
These strategies are not rational, therefore:
- People do not always choose what is best for them
- There is significant leeway to influence people's decisions
- Being objectively neutral is impossible
23. A new model
[Model diagram] The privacy calculus (risks/costs vs. benefits) and decision strategies drive behavioral reactions (including disclosures). Nudges intervene at two points: Default Nudges (order and value) and Justification Nudges.
24. A new model
Default value: relieves users of the burden of making decisions
- Path of least resistance
- Implicit normative cue (what I should do)
- Endowment effect (what I have is worth more than what I don't have)
Justification: a succinct reason to disclose (or not disclose) a piece of information
- Makes it easier to rationalize the decision
- Minimizes the potential regret of choosing the wrong option
27. Example: Lai and Hui
Figure 4 (Lai and Hui): subjects were assigned one of the following conditions on the registration page; percentages are mean participation levels per condition.
A. "Please send me Vortrex Newsletters and information." 25%
B. "Please do not send me Vortrex Newsletters and information." 37%
C. "Please send me Vortrex Newsletters and information." 53%
D. "Please do not send me Vortrex Newsletters and information." 0%
28. Example: Brown & Krishna
[Figure: percentage-point increase in choosing "High" compared to baseline]
A high default works, but causes reactance when people are aware of the motives behind it.
29. Example: Knijnenburg & Kobsa
5 justification types
None
Useful for you
Number of others
Useful for others
Explanation
bit.ly/tiis2013
31. Example: Knijnenburg & Kobsa
[Figure: disclosure behavior (demographics and context disclosure, context first vs. demographics first) and satisfaction with the system, per justification type: none, "useful for you", "# of others", "useful for others", "explanation"]
Anticipated satisfaction with the system (intention to use): 6 items, e.g. "I would recommend the system to others"
Satisfaction is lower for any justification!
bit.ly/tiis2013
32. Problems with Privacy Nudging
What should be the purpose of the nudge?
"More information = better, e.g. for personalization"
Techniques to increase disclosure cause reactance in the more privacy-minded users
"Privacy is an absolute right"
More difficult for less privacy-minded users to enjoy the benefits that disclosure would provide
33. Problems with Privacy Nudging
Smith, Goldstein & Johnson:
"What is best for consumers depends upon characteristics of the consumer: An outcome that maximizes consumer welfare may be suboptimal for some consumers in a context where there is heterogeneity in preferences."
34. Summary of part 2
Nudges work
Defaults and justifications can influence users' decisions
decisions
But we cannot nudge
everyone the same way!
Users differ in their
disclosure preferences
Nudges should respect
these differences
35. Privacy Adaptation Procedure
Adaptive nudges based on a contextualized
understanding of users' privacy concerns
36. Contextualized preferences
Sameer Patil et al.
"One of the factors contributing to this 'privacy paradox' is the decoupling of the circumstances in which privacy-affecting behaviors occur from the time at which privacy concerns are expressed"
Sam McNeilly, Luke Hutton, and Tristan Henderson
"Participants were willing to share different types of information in different ways"
37. Contextualized preferences
Pamela Wisniewski and Heather Richter Lipford
"By operationalizing SNS desired privacy level at a more granular level, we were able to unpack, if not disprove, aspects of the privacy paradox"
Sam McNeilly, Luke Hutton, and Tristan Henderson
"privacy settings are fairly robust to capturing people's contextual norms over time"
38. Contextualized preferences
Contextualize the privacy decision: different users, different contexts.
The optimal justification and default may depend on:
- type of info (what)
- user characteristics (who)
- recipient (to whom)
- etc.
39. Example: Knijnenburg et al.
"What?" = Four dimensions of data types:
- Facebook activity: Wall (1), Status updates (2), Shared links (3), Notes (4), Photos (5)
- Location: Hometown (6), Location, city (7), Location, state/province (8)
- Contact info: Residence, street address (9), Phone number (11), Email address (12)
- Life/interests: Religious views (13), Interests, favorite movies, etc. (14), Facebook groups (15)
bit.ly/privdim
40. Example: Knijnenburg et al.
"Who?" = Five disclosure profiles
- 159 participants tend to share little information overall (LowD)
- 26 participants tend to share activities and interests (Act+IntD)
- 50 participants tend to share location and interests (Loc+IntD)
- 65 participants tend to share everything but contact info (Hi-ConD)
- 59 participants tend to share everything
bit.ly/privdim
41. Example: Knijnenburg et al.
Detect class membership
bit.ly/privdim
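The class-membership detection step can be sketched as a nearest-centroid assignment: match a user's observed disclosure rates against each profile's typical rates and pick the closest. This is a minimal illustration, not the method from the paper: the per-dimension rates below are invented, and "HiD" is a made-up label for the share-everything profile.

```python
# Hypothetical sketch of detecting a user's disclosure profile.
# Centroids are illustrative disclosure rates per dimension
# (activity, location, contact, interests), NOT estimates from the paper.
PROFILES = {
    "LowD":     {"activity": 0.1, "location": 0.1, "contact": 0.1, "interests": 0.1},
    "Act+IntD": {"activity": 0.9, "location": 0.2, "contact": 0.1, "interests": 0.8},
    "Loc+IntD": {"activity": 0.2, "location": 0.9, "contact": 0.1, "interests": 0.8},
    "Hi-ConD":  {"activity": 0.9, "location": 0.9, "contact": 0.1, "interests": 0.9},
    "HiD":      {"activity": 0.9, "location": 0.9, "contact": 0.9, "interests": 0.9},
}

def detect_profile(observed):
    """Return the profile whose centroid has the smallest squared
    distance to the user's observed disclosure rates."""
    def distance(centroid):
        return sum((observed[d] - centroid[d]) ** 2 for d in centroid)
    return min(PROFILES, key=lambda name: distance(PROFILES[name]))

user = {"activity": 0.8, "location": 0.3, "contact": 0.0, "interests": 0.7}
print(detect_profile(user))  # -> Act+IntD
```

In practice the paper's approach fits a latent-class model to observed disclosures; the nearest-centroid rule above is only meant to show how a handful of early decisions can place a user into one of the five profiles.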
42. Example: Knijnenburg & Kobsa
[Figure: example users] "I care about the benefits" / "I do whatever others do"
43. Example: Knijnenburg & Kobsa
[Excerpt from the paper:] Requesting context data first leads to less threat and more trust. Figure 4 compares for each group the best strategy against all other strategies; strategies that perform significantly worse than the best strategy are labeled with a p-value. Since it is best to ask demographics first to increase demographics disclosure, and context first to increase context disclosure, increasing total disclosure asks for a compromise: first choose a preferred request order, and then select a justification.
Table 2: Best strategies to achieve high overall disclosures.
- Males with low disclosure tendency. Context first: the "useful for you" justification gives the highest demographics disclosure. Demographics first: providing no justification gives the highest context disclosure.
- Females with low disclosure tendency. Context first: providing no justification gives the highest demographics disclosure. Demographics first: the "explanation" justification keeps context disclosure on par.
- Males with high disclosure tendency. Context first: the "useful for others" justification keeps demographics disclosure almost on par. Demographics first: the "useful for you" justification keeps context disclosure on par.
- Females with high disclosure tendency. Context first: providing no justification gives a high demographics disclosure. Demographics first: the "useful for you" justification gives the highest context disclosure.
Table 3: Best strategies to achieve high user satisfaction.
- Males with low disclosure tendency: demographics first with "useful for you".
- Males with high disclosure tendency: the "useful for you" justification in any order.
- Females with low disclosure tendency: context first with "useful for you".
- Females with high disclosure tendency: context first with no justification; "useful for you" is second best.
bit.ly/iui2013
44. The Privacy Adaptation Procedure
• Determine the item-, user-, and recipient-type
• Select the default and justification that fit best for this context
p(share) = α + β_itemtype + β_usertype + β_recipienttype
INPUT: {user, item, recipient} → OUTPUT: {defaults, justification}
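The selection step of p(share) = α + β_itemtype + β_usertype + β_recipienttype can be sketched as follows. This is a hypothetical sketch, not the authors' fitted model: the coefficients, type labels, and threshold rule are invented, and the additive effects are assumed to live on the log-odds scale so the result stays a valid probability.

```python
# Hypothetical sketch of the Privacy Adaptation Procedure's decision rule:
# estimate p(share) from additive item-, user-, and recipient-type effects,
# then pick an initial default for this context. All values are illustrative.
import math

ALPHA = 0.0  # intercept (log-odds)
BETA_ITEM = {"location": -0.5, "interests": 0.8}
BETA_USER = {"LowD": -1.2, "HiD": 1.0}
BETA_RECIPIENT = {"friends": 0.6, "apps": -0.9}

def p_share(item, user, recipient):
    """Additive model on the log-odds scale, squashed to a probability."""
    logit = ALPHA + BETA_ITEM[item] + BETA_USER[user] + BETA_RECIPIENT[recipient]
    return 1 / (1 + math.exp(-logit))

def adapt(item, user, recipient, threshold=0.5):
    """Map the estimated sharing probability to an initial default."""
    p = p_share(item, user, recipient)
    default = "share" if p >= threshold else "withhold"
    return {"p_share": round(p, 2), "default": default}

print(adapt("interests", "HiD", "friends"))  # high p(share) -> default "share"
print(adapt("location", "LowD", "apps"))     # low p(share) -> default "withhold"
```

A real deployment would fit the coefficients to observed disclosure data and would also select among justification types per context (as in Tables 2 and 3 above), rather than only setting a binary default.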
45. The Privacy Adaptation Procedure
Practical use:
- Automatic initial defaults in line with the user's "disclosure profile"
- Personalized disclosure justifications
Relieves some of the burden of the privacy decision:
- The right privacy-related information
- The right amount of control
"Realistic empowerment"
46. Summary of part 3
Smith, Goldstein & Johnson:
"the idea of an adaptive default preserves considerable consumer autonomy [...] and strikes a balance between providing more choice and providing the right choices."
47. Final summary
1. Beyond transparency and control
Rational privacy decision-making is bounded, and
transparency and control only increase choice difficulty
2. Privacy nudging
Needs to move beyond the one-size-fits-all approach
3. Privacy Adaptation Procedure
The optimal balance between nudges and control