Can we heal the disinformation media ecology?
1. CAN WE HEAL THE
‘DISINFORMATION MEDIA ECOLOGY’?
MULTI-STAKEHOLDER SOLUTIONS & PROBLEMS
Professor Vian Bakir
Bangor University, Wales, UK
Desinformation vs demokratin – vad kan vi göra? [Disinformation vs democracy – what can we do?]
Kalmar, Sweden 20-22 November
2. CAN WE HEAL THE
‘DISINFORMATION MEDIA ECOLOGY’?
Disinformation
• False information spread deliberately to deceive.
• From the Russian ‘dezinformatsiya’.
• Disinformation is a significant pollutant of the media ecology.
• It could fatally taint the entire ecosystem.
Fake news. Post-truth. Truthiness. Propaganda. Spin. Deception. Lies. Misdirection. Omission. Exaggeration. Public relations. Poor editorial judgement. Sock puppets. Bots. Echo chambers. Filter bubbles. Advertising. Click-bait. Misinformation. Rumours.
3. CAN WE HEAL THE
‘DISINFORMATION MEDIA ECOLOGY’?
Audience views on ‘fake news’ - US, UK, Spain & Finland - focus group & survey data (Nielsen & Graves 2017)
• Finds discontent with the wider information landscape, incl. news media, politicians, platform companies
4. Q. HOW TO HEAL THE DISINFORMATION MEDIA ECOLOGY?
A. VIA A MULTI-STAKEHOLDER RESPONSE
• ‘All stakeholders – including intermediaries, media outlets, civil society and academia – should be supported in developing participatory and transparent initiatives for creating a better understanding of the impact of disinformation and propaganda on democracy, freedom of expression, journalism and civic space, as well as appropriate responses to these phenomena.’
(United Nations, 2017, Joint Declaration on Freedom of Expression and ‘Fake News’, Principle 6)
6. • 23 oral evidence sessions
• >170 written submissions
• Evidence from 73 witnesses
• Asked >4,350 questions at hearings
Our submissions:
• 2017. Summary and Analysis of All Written Submissions on How to Combat Fake News (Up to April 2017)
• 2017. Fake News: Media Economics and Emotional Button-Pushing
• 2017. Fake News: A Framework for Detecting and Avoiding Propaganda
Inquiry’s reports:
• Digital, Culture, Media and Sport Committee. 2018. Disinformation and ‘fake news’: Interim Report. Jul. House of Commons 363.
• Digital, Culture, Media and Sport Committee. 2019. Disinformation and ‘fake news’: Final Report. 18 Feb. House of Commons 1791.
7. SOLVING THE COMPLEX WEB OF FAKE NEWS
• Education
• Media organisations
• Digital intermediaries, e.g. Google, Facebook
• Advertisers
• Professional persuaders and PR
• Intelligence agencies
Vian Bakir & Andrew McStay, Summary and Analysis of All Written Submissions on How to Combat Fake News (Up to April 2017)
8. Education
• increase people’s media/digital literacy so they can recognise fake news
Media organisations
• promote a pluralistic media economy so quality news outlets can flourish
• encourage journalists to tell the truth
Digital intermediaries, e.g. Google, Facebook
• promote real news & downgrade fake news sites
• be transparent about their algorithms
• be better regulated
Advertisers
• consider the health of the media landscape given the Google/Facebook duopoly in the digital ad market
• ensure behavioural ad systems don’t incentivise fake news creation
Professional persuaders and PR
• avoid deception
Intelligence agencies
• give GCHQ a leading role in tackling fake news instigated by other nations.
9. Education should promote media/digital literacy
• to foster critical thinking on democratic processes, the digital environment & news
• Work out:
• how to beat confirmation bias (where we seek out & notice info that conforms to our pre-existing view)
• how to be skeptical of information that produces emotional responses
Wardle, C. & Derakhshan, H. 2017. Information Disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe report DGI(2017)09
Voting studies & emotion:
• Lu & Myrick 2016 - incivility
• Valentino et al. 2009 - anger
• Brader 2006 - enthusiasm, fear
• Valentino et al. 2008 - fear
10. Education should promote media/digital literacy
In July 2018, the Fake News Inquiry’s Interim Report recommended:
• 246. … Government put forward proposals in its White Paper for an educational levy to be raised by social media companies, to finance a comprehensive educational framework (developed by charities and non-governmental organisations) and based online. Digital literacy should be the fourth pillar of education, alongside reading, writing and maths.
• 247. There should be a unified public awareness initiative, supported by the Departments for DCMS, Health, and Education, with additional information and guidance from the Information Commissioner’s Office and the Electoral Commission, and funded in part by the tech company levy. Such an initiative would set the context of social media content, explain to people what their rights over their data are, within the context of current legislation, and set out ways in which people can interact with political campaigning on social media. This initiative should be a rolling programme, and not one that occurs only before general elections or referenda.
(DCMS 2018: 63)
11. Media organisations should … promote a pluralistic media economy so quality news outlets can flourish
How?
• Maintain high standards of professional accuracy & fact-checking (ITN, BBC, Society of Editors, Press Association)
• Strengthen fact-generation & fact-checking by increasing news resources via:
• higher staffing levels (National Union of Journalists)
• employment conditions that promote quality reporting & collaborative journalism (BBC)
• training journalists to handle data (Royal Statistical Society)
• Regulation
• treat established titles as community assets (National Union of Journalists)
• prevent further concentration of media ownership (National Union of Journalists)
• establish funding arrangements to ensure the BBC’s future (National Union of Journalists)
• fast redress for fake financial news, as it can cause instant financial damage (Gavin
12. Media organisations should … encourage journalists to tell the truth
How?
• Journalists should be more transparent about their sources, e.g. think tanks and their sometimes biased research (Martin Moore (King’s College London), Tobacco Control Research Group (University of Bath))
• Train journalists to better recognise (and avoid) propaganda (David Miller (University of Bath) and his colleagues)
First Draft’s verification toolkits: https://firstdraftnews.org/training/verification/
13. Digital intermediaries should … promote real news & downgrade fake news sites
E.g. by fact-checking/verification, flagging fake content.
BUT flagging stories as false may not improve people’s stock of correct knowledge because:
1. Confirmation bias
• ‘Backfire effect’ - conservatives became more likely to believe Iraq had WMD after reading retractions clarifying that no WMDs existed (Nyhan & Reifler 2010)
• Effects of debunking are weaker when audiences generate reasons to support the initial misinformation (Chan et al. 2017)
2. People who consume fake news don’t then consume fact-checks
• US study (Guess et al. 2018)
3. ‘Implied truth’ effect - false headlines that fail to get tagged are considered validated & so are seen as more accurate (Pennycook, Bear, Collins & Rand, in press)
14. Digital intermediaries should … be transparent about their algorithms
• to ensure that viral fake content is not presented as news (ITN, News Media Alliance).
• BUT these algorithms are proprietary and lucrative, & unlikely to be revealed or compromised by the digital intermediaries.
Engagement = $$$$$$$$$$
15. Digital intermediaries should … be better regulated
• E.g. surcharge internet service providers to create a local news fund from which might be bred hyper-local news providers (National Union of Journalists).
• Make search engines liable if they continue hosting damaging stories that are manifestly & provably false (The Campaign for Responsible Financial Journalism).
BUT
• their transnational power makes them difficult to regulate (especially for small countries)
• they are keen not to be seen as media companies (rather than technology companies).
16. Advertisers should … consider the health of the media landscape with the Google/Facebook duopoly in the digital ad market
The Cairncross Review (2019) asked:
• Is the market in which publishers now operate a fair one, or has the rapid growth of the big online platforms - especially Google and Facebook - created distortions that justify government intervention?
• The platforms take a large share of the market for advertising
• They also provide routes that many people use to find news online
17. Advertisers should … ensure behavioural ad systems don’t incentivise fake news creation
Digital intermediaries should self-police their behavioural & programmatic advertising networks, & identify & cut off advertisers that support fake news sites (ITN, Press Association, Public Relations and Communications Association, Google, Internet Advertising Bureau)
18. Professional persuaders and PR should … avoid deception
The public wants those in public life to be honest and tell the truth (Committee on Standards in Public Life)
Politicians’ use of social media removes the checks & balances of traditional media (Society of Editors)
Dilemmas:
- regulation of political campaigning (censorship) v. protecting freedom of political speech
- what constitutes proportionate censorship in the coming age of automated propaganda?
19. Intelligence agencies should … have a leading role in tackling fake news instigated by other nations
BUT
• secrecy
• governments in power may have a different view of the ‘national interest’ to those seeking truth
• they may suppress critical reports from their intelligence agencies or intelligence oversight committees.
• E.g. Obama knew that Putin was interfering in the 2016 US campaign while the campaigns were ongoing, but kept silent (Michael Isikoff, 2018, Russian Roulette: The Inside Story of Putin’s War on America and the Election of Donald Trump)
• Johnson is suppressing a Nov 2019 Intelligence & Security Committee report on Russian influence in politics and public life until after the General Election.
20. CAN WE HEAL THE
‘DISINFORMATION MEDIA ECOLOGY’?
Can they? Will they? And in time?
All stakeholders must pull together:
Education. Media organisations. Digital intermediaries. Advertisers. Professional persuaders. Intelligence agencies.
Not easy!
21. CAN WE HEAL THE
‘DISINFORMATION MEDIA ECOLOGY’?
Each type of disinformation may need a bespoke solution.
On digital political disinformation, we need an ethical code of conduct for transparent, explainable, civil & informative digital political campaigns (Bakir & McStay 2019)
22. REFERENCES
• Bakir, V. & McStay, A. 2019. Against Opacity, Outrage & Deception: Towards an ethical code of conduct for transparent, explainable, civil & informative digital political campaigns. Written submission (invited) to House of Lords Select Committee on Democracy & Digital Technologies, 13 Sep.
• Bakir, V. & McStay, A. 2017. Summary and Analysis of All Written Submissions on How to Combat Fake News (Up to April 2017). http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/digital-culture-media-and-sport-committee/fake-news/written/71533.html
• Brader, T. 2006. Campaigning for Hearts and Minds. Chicago: University of Chicago Press.
• Chan, M.S., Jones, C.R., Jamieson, K.H. & Albarracín, D. 2017. Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 1-16.
• Digital, Culture, Media and Sport Committee. 2018. Disinformation and ‘fake news’: Interim Report. 24 July. House of Commons 363. Available at: https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/363.pdf
• Guess, A., Nyhan, B. & Reifler, J. 2018. Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign. Retrieved from https://www.dartmouth.edu/~nyhan/fake-news-2016.pdf
• Isikoff, M. 2018. Russian Roulette: The Inside Story of Putin’s War on America and the Election of Donald Trump.
• Lu, Y. & Myrick, J.G. 2016. Cross-cutting exposure on Facebook and political participation: Unravelling the effects of emotional responses and online incivility. Journal of Media Psychology: Theories, Methods, and Applications, 28(3), 100-110.
23. REFERENCES
• Nielsen, R.K. & Graves, L. 2017. “News you don’t believe”: Audience perspectives on fake news. Reuters Institute Factsheet, Oct. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2017-10/Nielsen%26Graves_factsheet_1710v3_FINAL_download.pdf
• Nyhan, B. & Reifler, J. 2010. When corrections fail: The persistence of political misperceptions. Political Behavior, 32: 303-330.
• Pennycook, G., Bear, A., Collins, E.T. & Rand, D.G. In press. The implied truth effect: Attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Management Science.
• Starbird, K. 2017. Examining the Alternative Media Ecosystem through the Production of Alternative Narratives of Mass Shooting Events on Twitter. Association for the Advancement of Artificial Intelligence. http://faculty.washington.edu/kstarbi/Alt_Narratives_ICWSM17-CameraReady.pdf
• Valentino, N., Gregorowicz, K. & Groenendyk, E. 2009. Efficacy, emotions and the habit of participation. Political Behavior, 31: 307-330.
• Valentino, N., Hutchings, V., Banks, A. & Davis, A. 2008. Is a worried citizen a good citizen? Emotions, political information seeking, and learning via the Internet. Political Psychology, 29: 247-273.