4. SOCIAL ENGINEERING AT SCALE
Facebook Group     Shares        Interactions
Blacktivists       103,767,792   6,182,835
Txrebels           102,950,151   3,453,143
MuslimAmerica      71,355,895    2,128,875
Patriototus        51,139,860    4,438,745
Secured.Borders    5,600,136     1,592,771
Lgbtun             5,187,494     1,262,386
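The figures above can be compared directly. A quick Python sketch (numbers copied from the table as shown) computes a rough interactions-per-share ratio for each page:

```python
# Share and interaction counts as reported on the slide.
groups = {
    "Blacktivists":    (103_767_792, 6_182_835),
    "Txrebels":        (102_950_151, 3_453_143),
    "MuslimAmerica":   (71_355_895, 2_128_875),
    "Patriototus":     (51_139_860, 4_438_745),
    "Secured.Borders": (5_600_136, 1_592_771),
    "Lgbtun":          (5_187_494, 1_262_386),
}

for name, (shares, interactions) in groups.items():
    # Interactions per share, a crude engagement ratio.
    print(f"{name:16} {interactions / shares:.4f}")
```

Note the smaller pages (Secured.Borders, Lgbtun) show far higher engagement per share than the largest ones.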
5. INTENT TO DECEIVE
Force adversary to make decision or take action based on information that I:
• Hide
• Give
• Change (or change the context on)
• Deny/degrade
• Destroy
Enable my decisions based upon knowing yours
“Operations to convey selected information and indicators to audiences to
influence their emotions, motives, and objective reasoning, and ultimately the
behavior of governments, organizations, groups, and individuals”
9. INSTRUMENTS OF NATIONAL POWER
…and how to influence other nation-states.
Diplomatic Informational Military Economic
Resources available in pursuit of national objectives…
10. NATION-STATE MISINFORMATION
From      To
Brazil    Brazil
China     China, Taiwan, US
Iran      India, Pakistan
Russia    Armenia, France, Germany, Netherlands, Philippines, Serbia, UK, USA, Ukraine, World
Saudi     Qatar
Unknown   France, Germany, USA
20. ADDING MISINFORMATION TO INFOSEC
“Prevention of damage to, protection of, and restoration of computers,
electronic communications systems, electronic communications services, wire
communication, and electronic communication, including information contained
therein, to ensure its availability, integrity, authentication, confidentiality, and
nonrepudiation” - NSPD-54
24. THERE’S NO COMMON LANGUAGE
“We use misinformation attack (and misinformation campaign) to refer to the
deliberate promotion of false, misleading or mis-attributed information. Whilst
these attacks occur in many venues (print, radio, etc), we focus on the creation,
propagation and consumption of misinformation online. We are especially
interested in misinformation designed to change beliefs in a large number of
people.”
30. AND STARTED MAPPING MISINFORMATION ONTO IT
Initial Access: Account takeover, Create fake group, Parody account, Deep cover
Create Artefacts: Steal existing artefacts, Deepfake, Buy friends
Insert Theme: Create fake emergency, Create fake argument
Amplify Message: Repeat messaging with bots
Command and Control: Create fake real-life events
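One way to work with the mapping above is as a plain lookup table. A minimal Python sketch, with column assignments as read off the slide's grid (the full framework contains many more techniques per phase):

```python
# Phase-to-technique mapping from the slide, as a lookup table.
KILL_CHAIN = {
    "Initial Access": ["Account takeover", "Create fake group",
                       "Parody account", "Deep cover"],
    "Create Artefacts": ["Steal existing artefacts", "Deepfake", "Buy friends"],
    "Insert Theme": ["Create fake emergency", "Create fake argument"],
    "Amplify Message": ["Repeat messaging with bots"],
    "Command and Control": ["Create fake real-life events"],
}

def phase_of(technique):
    """Reverse lookup: return the phase a technique belongs to, or None."""
    for phase, techniques in KILL_CHAIN.items():
        if technique in techniques:
            return phase
    return None
```

The reverse lookup is what an analyst needs in practice: given an observed behaviour, locate it in the kill chain to predict what comes next.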
31. POPULATING THE FRAMEWORK
• Campaigns
• e.g. Internet Research Agency, 2016 US elections
• Incidents
• e.g. Columbian Chemicals
• Failed attempts
• e.g. Russia - France campaigns
33. HISTORICAL CATALOG: DATASHEET
• Summary: Early Russian (IRA) “fake news”
stories. Completely fabricated; very short lifespan.
• Actor: probably IRA (source: Recorded Future)
• Timeframe: Sept 11 2014 (1 day)
• Presumed goals: test deployment
• Artefacts: text messages, images, video
• Related attacks: all were well-produced
fake news stories, each promoted on Twitter to
influencers through a single dominant hashtag
(e.g. #BPoilspilltsunami, #shockingmurderinatlanta)
• Method:
1. Create messages. e.g. “A powerful explosion heard from
miles away happened at a chemical plant in Centerville,
Louisiana #ColumbianChemicals”
2. Post messages from fake Twitter accounts; include handles
of local and global influencers (journalists, media,
politicians, e.g. @senjeffmerkley)
3. Amplify by repeating messages on Twitter via fake
accounts
• Result: limited traction
• Counters: None seen. Fake stories were debunked very
quickly.
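The datasheet fields above lend themselves to a simple record type. A hypothetical Python sketch; the class name and field names follow the slide's headings, not any published schema:

```python
from dataclasses import dataclass, field

# Hypothetical record type mirroring the historical-catalog datasheet.
@dataclass
class IncidentDatasheet:
    summary: str
    actor: str
    timeframe: str
    presumed_goals: str
    artefacts: list = field(default_factory=list)
    method: list = field(default_factory=list)
    result: str = ""
    counters: str = ""

# The Columbian Chemicals entry, values paraphrased from the slide.
columbian_chemicals = IncidentDatasheet(
    summary="Early IRA 'fake news' stories; fabricated, very short lifespan.",
    actor="probably IRA (source: Recorded Future)",
    timeframe="Sept 11 2014 (1 day)",
    presumed_goals="test deployment",
    artefacts=["text messages", "images", "video"],
    method=["create messages",
            "post from fake accounts, tagging influencers",
            "amplify via repeated posts from fake accounts"],
    result="limited traction",
    counters="none seen; fake stories debunked very quickly",
)
```

A uniform record type is what makes a catalog of incidents comparable and machine-searchable.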
34. FEEDS INTO TECHNIQUES LIST
• Behavior: two groups meeting in same place at
same time
• Intended effect: IRL tension / conflict
• Requirements: access to groups, group trust
• Detection:
• Handling:
• Examples:
Title
Description
Short_Description
Intended_Effect
Behavior
Resources
Victim_Targeting
Exploit_Targets
Related_TTPs
Kill_Chain_Phases
Information_Source
Kill_Chains
Handling
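The slide's example can be expressed as one record keyed by the STIX-style field names listed above. A sketch with values paraphrased from the slide; the title is an assumption, and fields the slide left blank stay empty:

```python
# Example technique record keyed by a subset of the STIX-style fields above.
# "Title" is an assumed label for the behaviour described on the slide.
technique_record = {
    "Title": "Create fake real-life events",          # assumption
    "Short_Description": "Two rival groups drawn to the same place "
                         "at the same time",
    "Behavior": "Two groups meeting in same place at same time",
    "Intended_Effect": "IRL tension / conflict",
    "Resources": "Access to groups, group trust",
    "Detection": "",   # left blank on the slide
    "Handling": "",    # left blank on the slide
    "Examples": [],
}
```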
37. INCIDENT ANALYSIS
Top-down (strategic): info ops
❏ What are misinformation creators
likely to do? What, where, when,
how, who, why?
❏ What do we expect to see?
❏ What responses and impediments
to responses were there?
Bottom-up (tactical): data science
❏ Unusual hashtag, trend, topic,
platform activity?
❏ Content from ‘known’ trollbots,
8/4chan, r/thedonald,
RussiaToday etc?
❏ What are trackers getting excited
about today?
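The first bottom-up check, spotting unusual hashtag activity, can be sketched as a simple z-score test against a recent baseline. The counts below are invented for illustration:

```python
import statistics

def unusual_hashtags(history, current, z_threshold=3.0):
    """Flag hashtags whose latest count is far above their baseline.

    history: {tag: [recent hourly counts]}, current: {tag: latest count}.
    """
    flagged = []
    for tag, counts in history.items():
        mean = statistics.mean(counts)
        stdev = statistics.stdev(counts) or 1.0  # guard against zero spread
        z = (current.get(tag, 0) - mean) / stdev
        if z > z_threshold:
            flagged.append((tag, round(z, 1)))
    return flagged

# Invented counts: one quiet hashtag suddenly spiking, one steady one.
history = {"#ColumbianChemicals": [2, 1, 3, 2, 2], "#weather": [50, 48, 52, 49, 51]}
current = {"#ColumbianChemicals": 400, "#weather": 53}
print(unusual_hashtags(history, current))
```

Real deployments would use per-platform baselines and more robust statistics, but the shape of the check is the same.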
40. DISTORTION TECHNIQUES
• Distort facts: match intended outcome
• Exaggerate: rhetoric & misrepresent facts
• Generate: realistic false artifacts
• Mismatch: links, images, and claims to
change context of information
41. DISTRACTION TECHNIQUES
• String along: respond to anyone who engages to
waste time
• Play dumb: pretend to be naive, gullible, stupid
• Redirect: draw engagement to your thread
• Dilute: add other accounts to dilute threads
• Threadjack: change narrative in existing thread
42. DIVISION TECHNIQUES
• Provoke: create conflicts and confusion among community
members
• Dehumanize: demean and denigrate target group
• Hate speech: attack protected characteristics or classes
• Play victim: claim victim status
• Dog-whistle: use coded language to indicate insider status
• Hit and run: attack and delete after short time interval
• Call to arms: make open calls for action
43. DISMAY TECHNIQUES
• Ad hominem: make personal attacks, insults
& accusations
• Assign threats: name and personalize enemy
• Good old-fashioned tradecraft
44. DISMISSAL TECHNIQUES
• Last word: respond to hostile commenters
then block them so they can’t reply
• Brigading: coordinate mass attacks or
reporting of targeted accounts or tweets
• Shit list: add target account(s) to insultingly
named list(s)
57. COMPONENTWISE UNDERSTANDING AND RESPONSE
• A lingua franca across communities
• Defend/countermove against reused techniques, identify gaps in attacks
• Assess defence tools & techniques
• Plan for large-scale adaptive threats (hello, Machine Learning!)
• Build an alert structure (e.g. ISAC, US-CERT, Interpol)
63. THANK YOU
Sara “SJ” Terp
Bodacea Light Industries
sarajterp@gmail.com
@bodaceacat
CDR Pablo C. Breuer
U.S. Special Operations Command / SOFWERX
Pablo.Breuer@sofwerx.org
@Ngree_H0bit
64. COMMUNITY
• Parody-based counter-campaigns (e.g. riffs on “Q”)
• SEO-hack misinformation sites
• Dogpile onto misinformation hashtags
• Divert followers (typosquat trolls, spoof messaging etc)
• Identify and engage with affected individuals
• Educate, verify, bring into the light
65. OFFENSE: POTENTIALS FOR NEXT
• Algorithms + humans attack algorithms + humans
• Shift from trolls to ‘nudging’ existing human communities
(‘useful idiots’)
• Subtle attacks, e.g. ‘low-and-slows’, ‘pop-ups’, etc
• Massively multi-channel attacks
• More commercial targets
• A well-established part of hybrid warfare
66. DEFENCE: POTENTIALS FOR NEXT
• Strategic and tactical collaboration
• Trusted third-party sharing on fake news sites / botnets
• Misinformation version of ATT&CK, SANS20 frameworks
• Algorithms + humans counter algorithms + humans
• Thinking the unthinkable
• “Countermeasures and self-defense actions”