Social media algorithms prey on users' vulnerabilities to drive engagement, author says
The book, by New York Times reporter Max Fisher, is called "The Chaos Machine."
Social media companies have deliberately manipulated the desires and fears of their users to drive engagement metrics, creating an addiction, according to the author.
In his new book, "The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World," New York Times reporter Max Fisher explores how the major social media and technology companies have managed to gain so much power at the expense of their users.
He spoke with "ABC News Prime" about the dilemmas he explores in his book, conversations with industry insiders about the products they've helped to create, and whether or not he'd let his child use social media.
PRIME: Congratulations, Max, and thanks so much for joining us.
FISHER: Thank you for having me.
PRIME: So you talk about how social media platforms have really spent a lot of time focusing on making sure that we stay wired and connected in an effort to keep making a lot of money. Explain the algorithm basically behind that.
FISHER: So when you open up a social media platform, what you think you're seeing are posts, thoughts and sentiment from people in your community, from your friends, and you think when you interact with them, when you post something and get a response, what you're seeing is the feedback from your community and what they like and don't like. And that is not the case.
What you are actually seeing, what you actually are experiencing are emotions and sentiments and interactions that have been predetermined and pre-selected, often personalized just for you, by these incredibly sophisticated artificial intelligence systems that govern the platforms. They have determined the precise sorts of emotions, interactions and sequence of sentiments that will get you not just to spend more time browsing and scrolling on social platforms, but to get engaged yourself and to solicit specific reactions from you. Because we're talking about billions of people, the overwhelming majority of Americans, for instance, that has profound consequences for the way our society works and for our politics.
PRIME: You use the word consequences a few times there. I'm really curious what you see as social media's real-world consequences.
FISHER: There's this one experiment that I write about in the book where these researchers took two really big groups of people over four weeks. To one group they said, "just live your life as normal," and to the other they said, "deactivate your Facebook account, take it off your phone." And the consequences were staggering. One thing they found is that people who deactivated Facebook reported higher levels of happiness and life satisfaction, equivalent to about a third of the effect of going to therapy…it's certainly a lot cheaper than going to therapy.
It also suggests that people weren't using social media because it made them happy; in fact, they were using it because they were addicted to it, had a hard time turning it off and needed this experiment to force them to turn it off. And another change they found was that people who deactivated it became significantly less polarized [in] the way that they saw the news, [in] the way that they saw other people in their community.
PRIME: Do you have a social media account?
FISHER: You know this is the thing that is tough about social media. It is so dominant in our world, in the way that we consume information, in the way that we interact with people in our lives and our family and friends, that you kind of have to be on it. You probably have to have a smartphone, you probably have to be on social media to some extent. But the number one thing I think you can do is to understand what it's doing to you, understand its effects, understand the way that it distorts what it shows you and the way that people in your community seem to be acting.
It's designed to be engaging, but the types of interactions that are engaging, that really activate certain chemicals in your brain and make you want to spend more time on it, are fear; moral outrage, which is by far the most engaging sentiment; and any sense of hostility towards people who are not in your social in-group.
PRIME: I just want to take a look at the subtitle because you say "The Inside Story of How Social Media Rewired Our Minds and Our World." Is that accurate? Has social media really rewired our brains?
FISHER: They have indeed found that your actual brain chemistry is changed as a result of social media use. There are a lot of things in our lives that change our brain chemistry, and they're called drugs. And that can be caffeine, that might be alcohol, it may be recreational drugs, it might be cigarettes. Social media functions in very much the same way. It's explicitly designed like that: the people who designed the platforms knowingly used slot machines, dopamine delivery, these addictive, physically addictive features, to get people to spend more time on there. And the result is that it also changes your behavior and changes the way that you think in all sorts of ways that were not intentional on the part of the platforms but are certainly consequential.
PRIME: You also talk about the 2020 election, Jan. 6 insurrection, that there was so much misinformation out there and that social media companies did very little to try to tamp that down. Do you feel like the genie is out of the bottle at this point, or are they able to control misinformation?
FISHER: So, it's funny. There are a lot of people who work at the big social media companies whose job is to reduce misinformation, reduce extremism on platforms, reduce recruitment for extremist far-right terrorist groups, but they are fighting a losing and, in many senses, unwinnable battle. Not because there's something about social media that means that misinformation and hate are always going to be on there, but because these platforms are deliberately designed to ramp up engagement in the most ruthless possible ways these companies can come up with.
So it's out of the bottle in the sense that you can't clean it up as long as the companies are doing that, but it's also, at least in theory, relatively easy to fix, because all the companies have to do is turn off these engagement-maximizing features, and a lot of this problem goes away. But they're not going to do that.
PRIME: Based on the people that you interviewed who are both still inside the system and who've left, is there a sense that you can kind of turn this around and use social media as a force for good?
FISHER: So yeah, a lot of these people who I've talked to, some of them are dissidents in Silicon Valley or people who were whistleblowers, some of them researchers outside of Silicon Valley, a lot of them are still true believers in the theoretical potential of a more neutral social media: one that does not have these engagement-maximizing features can be, and sometimes really is, a really dramatic and major force for good in the world. But the problem is that these engagement-maximizing features are just overpowering that good and creating a lot of harm in the world.
PRIME: Last quick, quick question. Would you let a child of yours have social media?
FISHER: Oh, my God. No, I wouldn't let myself have social media if I could get myself off of it. The thing is that it's not just that there's a lot of harmful things in social media, but young kids and adolescents especially have a very exaggerated social need and that means they spend a lot more time on social media. They are some of the best customers of these platforms, in fact. And it means that the effects, the things that affect you and me, affect them much more drastically.
PRIME: Max Fisher, we thank you so much. And to our viewers, you can purchase "The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World" anywhere books are sold.