Book of the Week: Ten Arguments for Deleting Your Social Media Accounts Right Now

12 Jun

Ahead of our event with Jaron Lanier at Marx Memorial Library this Friday, we share an extract from the tech philosopher’s excellent new polemic Ten Arguments for Deleting Your Social Media Accounts Right Now. In it, he outlines the ways in which social media platforms surveil and manipulate users, and makes the case for liberating yourself from the Internet’s addictive hold. Zadie Smith called it “a blisteringly good, urgent, essential read.”

You may have heard the mournful confessions from the founders of social media empires, which I prefer to call “behavior modification empires.”

Here’s Sean Parker, the first president of Facebook:

We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. . . . It’s a social-validation feedback loop . . . exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. . . . The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway . . . it literally changes your relationship with society, with each other. . . . It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.

Here’s Chamath Palihapitiya, former vice president of user growth at Facebook:

The short-term, dopamine-driven feedback loops we’ve created are destroying how society works. . . . No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem—this is not about Russian ads. This is a global problem. . . . I feel tremendous guilt. I think we all knew in the back of our minds—even though we feigned this whole line of, like, there probably aren’t any bad unintended consequences. I think in the back, deep, deep recesses of our minds, we kind of knew something bad could happen. . . . So we are in a really bad state of affairs right now, in my opinion. It is eroding the core foundation of how people behave by and between each other. And I don’t have a good solution. My solution is I just don’t use these tools anymore. I haven’t for years.

Better late than never. Plenty of critics like me have been warning that bad stuff was happening for a while now, but to hear this from the people who did the stuff is progress, a step forward.

For years, I had to endure quite painful criticism from friends in Silicon Valley because I was perceived as a traitor for criticizing what we were doing. Lately I have the opposite problem. I argue that Silicon Valley people are for the most part decent, and I ask that we not be villainized; I take a lot of fresh heat for that. Whether I’ve been too hard or too soft on my community is hard to know.

The more important question now is whether anyone’s criticism will matter. It’s undeniably out in the open that a bad technology is doing us harm, but will we (will you, meaning you) be able to resist and help steer the world to a better place?

Companies like Facebook, Google, and Twitter are finally trying to fix some of the massive problems they created, albeit in a piecemeal way. Is it because they are being pressured or because they feel that it’s the right thing to do? Probably a little of both.

The companies are changing policies, hiring humans to monitor what’s going on, and hiring data scientists to come up with algorithms to avoid the worst failings. Facebook’s old mantra was “Move fast and break things,” and now they’re coming up with better mantras and picking up a few pieces from a shattered world and gluing them together.

This book will argue that the companies on their own can’t do enough to glue the world back together.

Because people in Silicon Valley are expressing regrets, you might think that now you just need to wait for us to fix the problem. That’s not how things work. If you aren’t part of the solution, there will be no solution.

This first argument will introduce a few key concepts behind the design of addictive and manipulative network services. Awareness is the first step to freedom.

CARROT AND SHTICK
Parker says Facebook intentionally got people addicted, while Palihapitiya is saying something about the negative effects on relationships and society. What is the connection between these two mea culpas?

The core process that allows social media to make money and that also does the damage to society is behavior modification. Behavior modification entails methodical techniques that change behavioral patterns in animals and people. It can be used to treat addictions, but it can also be used to create them.

The damage to society comes because addiction makes people crazy. The addict gradually loses touch with the real world and real people. When many people are addicted to manipulative schemes, the world gets dark and crazy.

Addiction is a neurological process that we don’t understand completely. The neurotransmitter dopamine plays a role in pleasure and is thought to be central to the mechanism of behavior change in response to getting rewards. That is why Parker brings it up.

Behavior modification, especially the modern kind implemented with gadgets like smartphones, is a statistical effect, meaning it’s real but not comprehensively reliable; over a population, the effect is more or less predictable, but for each individual it’s impossible to say. To a degree, you’re an animal in a behaviorist’s experimental cage. But the fact that something is fuzzy or approximate does not make it unreal.

Originally, food treats were the most common reward used in behaviorist experiments, though the practice goes back to ancient times. Every animal trainer uses them, slipping a little treat to a dog after it has performed a trick. Many parents of young children do it, too.

One of the first behaviorists, Ivan Pavlov, famously demonstrated that he didn’t need to use real food. He would ring a bell when a dog was fed, and eventually the dog would salivate upon hearing the bell alone.

Using symbols instead of real rewards has become an essential trick in the behavior modification toolbox. For instance, a smartphone game like Candy Crush uses shiny images of candy instead of real candy to become addictive. Other addictive video games might use shiny images of coins or other treasure.

Addictive pleasure and reward patterns in the brain—the “little dopamine hit” cited by Sean Parker—are part of the basis of social media addiction, but not the whole story, because social media also uses punishment and negative reinforcement.

Various kinds of punishment have been used in behaviorist labs; electric shocks were popular for a while. But just as with rewards, it’s not necessary for punishments to be real and physical. Sometimes experiments deny a subject points or tokens.

You are getting the equivalent of both treats and electric shocks when you use social media.

Most users of social media have experienced catfishing (which cats hate), senseless rejection, being belittled or ignored, outright sadism, or all of the above, and worse. Just as the carrot and stick work together, unpleasant feedback can play as much of a role in addiction and sneaky behavior modification as the pleasant kind.

Ten Arguments For Deleting Your Social Media Accounts Right Now by Jaron Lanier (Bodley Head, £9.99) is out now. Jaron Lanier will be discussing his book with the Guardian’s John Harris and Idler editor Tom Hodgkinson at the Marx Memorial Library this Friday 15 June. Book your place here. Click here to be the first to read our Book of the Week extracts.