ISBN 978-1-4549-0827-2
“EVERY ANIMATE AND INANIMATE OBJECT ON EARTH
WILL SOON BE GENERATING DATA, INCLUDING OUR
HOMES, OUR CARS, AND YES, EVEN OUR BODIES.” — Anthony D. Williams
CREATED BY RICK SMOLAN AND JENNIFER ERWITT
Data Driven by Jonathan Harris
We inhabit an interesting time in history. A small number of people (no more than a few
hundred, really more like a few dozen), mainly living in cities such as San Francisco and
New York, mainly male, and mainly between the ages of 22 and 35, are having a hugely
outsized effect on the rest of our species.
Through the software they design and deliver, these engineers are transforming the daily
lives of hundreds of millions of people. Previously, this kind of mass transformation of
human behavior was the sole domain of war, famine, disease, and religion, but now it
happens more quietly, through the software we use every day.
In a sense, software can be thought of as a new kind of medicine. But rather than acting
upon a single human body, software acts on the behavioral patterns of entire societies.
The designers of these applications call themselves software engineers, but they are
really more like social engineers—yet very few of them realize that this is what they are
doing, and even fewer consider the ethical implications of that kind of power.
On a small scale, the effects of software are benign. But at large companies, institutions,
and agencies with hundreds of millions of users, something so apparently small as the
choice of what should be a default setting has an immediate impact on the daily behavior
patterns of a large percentage of the planet’s population.
In its capacity to transform the behavior of people, software is like a new kind of drug. As
there are many kinds of drugs (caffeine, Tylenol, Viagra, heroin, crack), so are there
many kinds of software, feeding different urges and creating different outcomes. In
design reviews at Facebook, designers are asked, “Where is the serotonin in this
design?”—which means “How will this new feature release bonding hormones in the
brains of our users to keep them coming back for more?”
All technology extends and amplifies some pre-existing human urge or condition: a
hammer extends the hand, a pencil extends the mind, a piano extends the voice.
Technologies become viral when they amplify something that is already in us, but
blocked. When a technology eliminates a major blockage, the uptake can be explosive.
Facebook gained 500 million users in less than five years by finding a basic human
blockage (our need to share and connect) and offering a way around it—as a surgeon
might extract a clot to restore the flow of blood.
There are many kinds of urges. There are the seven deadly sins (lust, greed, envy, sloth,
gluttony, pride, and wrath). There are the urges to find meaning, joy, wonder, and
happiness. There are the urges to explore, to improve, to learn, to gain wisdom, to teach.
There are the urges to feel loved, to connect, to feel useful, to nurture, to help, to belong.
Each urge, when extended, creates a different kind of outcome and a different kind of
person. So when millions of people have a given urge extended, it creates a different
kind of world. That’s why the great philosophers and moral teachers have always
encouraged us to choose our urges wisely.
A lot of software is designed to be addictive. In Silicon Valley, the addictivity of a given
piece of software is considered an asset. Companies strive to make their products “viral”
and “sticky” so that “users keep coming back” to “get their daily fix.” This sounds a lot
like dealing drugs. It might be good for business, but is it good for people?
On the web, there are two main kinds of companies: marketplaces and attention
economies. Both are defined by their different degrees of blockage reduction and
addictivity. And both kinds of companies fulfill urges that are already in us, but the way
that they answer those urges is different.
Marketplaces operate by connecting one group of people to another group of people and
allowing them to conduct a transaction, of which the marketplace takes a cut. Etsy
connects buyers to sellers; Kickstarter connects creators to backers; Airbnb connects
travelers to hosts; OkCupid connects daters to daters. Marketplace companies build
tools to solve problems that exist in the world. At their best, they operate like healers—
mixing up medicine to answer a need.
Attention economies, by comparison, operate by persuading users to spend large
amounts of time online, clicking many things and viewing many ads. These companies
often masquerade as “communication tools” that help people “connect.” But in attention
economies, most of the “connecting” happens alone, while you’re staring at a screen.
Attention economy companies operate less like healers and more like dealers—creating
addictive experiences to keep people hooked.
Marketplaces aim to eliminate urges by feeding them quickly (find a date, book a room,
and so on), while attention economies aim to keep the urges going forever (continuous
updates, another cool video, more new messages, and the like).
There is an ancient pact between tools and their users, which says that tools should be
used by their users, and not the other way around. Good tools should help their users
accomplish a task by satisfying some pre-existing urge and then getting out of the way.
Attention economies, at their most addictive, violate this pact.
Like good medicine, good tools should appear briefly when you need them, and then
disappear, leaving you free to get on with your life.
On the web, where people have learned not to value things directly, the most common
business model is to make a product, give it away for free, attain tremendous scale, and
then, once you have a lot of users, turn those users into the product by selling their
attention to advertisers and their personal information to marketing departments.
This is a dangerous deal—not necessarily in economic terms, but in human terms—
because, once the user has become the product, the user is no longer treated as an
individual but as a commodity, and not even a precious commodity but one insignificant
data point among many—a rounding error, meaningful only in aggregate. Thinking of
humans this way produces sociopathic behavior: rational in economic terms, but very
bad in human terms.
Why does it matter how software companies behave? There are several reasons:
First, because of network effects. If many people use a given piece of software, it
becomes more and more likely that you will want to use it, too. You may think that you
are not being coerced, and that you are free to decide, but as a citizen of a global
community, you will want to use the tools and platforms that allow you to connect with
the rest of your tribe and your species.
Second, software is the staging ground for the future. The stakes are low right now, but
they’re about to get higher.
At this moment of transition, we’re straddling the rare evolutionary threshold between
two scales of existence. Evolution at the individual level is about to be transcended by
another kind of evolution at the species level. The Internet is helping us see that what we
really are, in addition to our individual selves, is a network of individual cells composing a
larger human organism. We act with individual agency, but our choices and actions (and
possibly even our thoughts and our feelings) have a very real impact on the broader
world in which we exist.
Through the Internet, we are developing a species-level nervous system, capable of
transmitting thoughts, ideas, and information. The resulting meta-organism—this “global”
human being—is also beginning to exhibit physiological reactions and even “higher”
human traits like empathy and compassion.
We’ve glimpsed this at scale only a few times, and only briefly. For instance, when a group of
young Occupy Wall Street protesters was pepper-sprayed by police at UC Davis, within
a few minutes, millions of people around the world had seen the video. And many of
them felt not only moral outrage but also a kind of physical nausea—a visceral sense of
pain and disgust.
That last part is new. It’s as if those millions of viewers shared a simultaneous and
collective wince, as though the nervous systems of those millions of people were
temporarily connected to the nervous systems of the pepper-sprayed protesters, causing
them to share their pain. It was only a glimmer, and it lasted only briefly, but it could be
prophetic of what is to come.
Given all of this, it is wise to see software not only as a product or service, but also as
the staging ground for humanity’s future, affording us the time and space to get our
ethics right, before the stakes are raised. Because soon, the technology is likely to enter
our physical bodies (through pacemakers, biometric monitors, cancer-fighting nanobots,
brain-based Wi-Fi connections, and so on) and will then be much harder to turn off.
The way this will probably happen is that some guy will start a company, and his small
design team will make certain choices around things like default settings, and they will
build their product and release it into the world, and early adopters will adopt it, and then
ordinary folks will try it too, and soon thereafter the physical bodies of millions of people
will forever be augmented by the flippant choices made on a Tuesday afternoon in a little
sunny room in Palo Alto, California. At that point, technology really will be a drug—and
we will want its design to be ethically sound.