
The Social Dilemma (2020) – Transcript

This documentary-drama hybrid explores the dangerous human impact of social networking, with tech experts sounding the alarm on their own creations.

The Social Dilemma is a 2020 American docudrama film directed by Jeff Orlowski and written by Orlowski, Davis Coombe, and Vickie Curtis. The film explores the rise of social media and the damage it has caused to society, focusing on its exploitation of its users for financial gain through surveillance capitalism and data mining, how its design is meant to nurture an addiction, its use in politics, its effect on mental health (including the mental health of adolescents and rising teen suicide rates), and its role in spreading conspiracy theories such as Pizzagate and aiding groups such as flat-earthers.

The film features interviews with many former employees, executives, and other professionals from top tech companies and social media platforms, who provide their first-hand experiences of working in and around the tech industry. Interviewees state that social media platforms and big tech companies have been instrumental in providing positive change for society; they also note that such platforms have caused problematic social, political, and cultural consequences. These interviews are presented alongside dramatizations of a teenager’s social media addiction and a primer on how a social media algorithm powered by artificial intelligence may work.

* * *

“Nothing vast enters the life of mortals without a curse.”
—Sophocles

[eerie instrumental music playing]

[interviewer] Why don’t you go ahead? Sit down and see if you can get comfy.

You good? All right.

[Tristan Harris] Yeah. [exhales]

[interviewer] Um…

[cell phone vibrates]

[crew member] Take one, marker.

[interviewer] Wanna start by introducing yourself?

[crew member coughs]

[Bailey Richardson] Hello, world. Bailey. Take three.

[interviewer] You good?

[Joe Toscano] This is the worst part, man. [chuckling] I don’t like this.

[Sandy Parakilas] I worked at Facebook in 2011 and 2012.

[Bailey] I was one of the really early employees at Instagram.

[Guillaume Chaslot] I worked at, uh, Google, uh, YouTube.

[Lynn Fox] Apple, Google, Twitter, Palm.

[Aza Raskin] I helped start Mozilla Labs and switched over to the Firefox side.

[interviewer] Are we rolling? Everybody?

[crew members reply]

[interviewer] Great.

[Alex Roetter] I worked at Twitter. My last job there was the senior vice president of engineering.

[Tim Kendall] I was the president of Pinterest. [sips] Before that, um, I was the… the director of monetization at Facebook for five years.

[Jeff Seibert] While at Twitter, I spent a number of years running their developer platform, and then I became head of consumer product.

[Justin Rosenstein] I was the coinventor of Google Drive, Gmail Chat, Facebook Pages, and the Facebook like button.

[Joe] Yeah. This is… This is why I spent, like, eight months talking back and forth with lawyers. This freaks me out.

[Alex] When I was there, I always felt like, fundamentally, it was a force for good. I don’t know if I feel that way anymore.

[Joe] I left Google in June 2017, uh, due to ethical concerns. And… And not just at Google but within the industry at large.

[Justin] I’m very concerned. I’m very concerned.

[Tim] It’s easy today to lose sight of the fact that these tools actually have created some wonderful things in the world. They’ve reunited lost family members. They’ve found organ donors. I mean, there were meaningful, systemic changes happening around the world because of these platforms that were positive! I think we were naive about the flip side of that coin.

[Alex] Yeah, these things, you release them, and they take on a life of their own. And how they’re used is pretty different than how you expected.

[Jeff] Nobody, I deeply believe, ever intended any of these consequences.

[Lynn] There’s no one bad guy. No. Absolutely not.

[interviewer] So, then, what’s the… what’s the problem?

[interviewer] Is there a problem, and what is the problem?

[Jeff] [swallows]

[Justin] [clicks tongue] Yeah, it is hard to give a single, succinct… I’m trying to touch on many different problems.

[interviewer] What is the problem?

[Tristan] [clicks tongue, chuckles]


A NETFLIX ORIGINAL DOCUMENTARY

[birds singing]

[dog barking in distance]

[reporter 1] Despite facing mounting criticism, the so-called Big Tech names are getting bigger.

The entire tech industry is under a new level of scrutiny.

And a new study sheds light on the link between mental health and social media use.

[on TV] Here to talk about the latest research…

[Tucker Carlson] …is going on that gets no coverage at all.

Tens of millions of Americans are hopelessly addicted to their electronic devices.

[reporter 2] It’s exacerbated by the fact that you can literally isolate yourself now in a bubble, thanks to our technology.

Fake news is becoming more advanced and threatening societies around the world.

We weren’t expecting any of this when we created Twitter over 12 years ago.

White House officials say they have no reason to believe the Russian cyberattacks will stop.

YouTube is being forced to concentrate on cleansing the site.

[reporter 3] TikTok, if you talk to any tween out there…

[on TV] …there’s no chance they’ll delete this thing…

Hey, Isla, can you get the table ready, please?

[reporter 4] There’s a question about whether social media is making your child depressed.

[mom] Isla, can you set the table, please?

[reporter 5] These cosmetic procedures are becoming so popular with teens, plastic surgeons have coined a new syndrome for it, “Snapchat dysmorphia,” with young patients wanting surgery so they can look more like they do in filtered selfies.

Still don’t see why you let her have that thing.

What was I supposed to do? I mean, every other kid in her class had one.

She’s only 11.

Cass, no one’s forcing you to get one. You can stay disconnected as long as you want.

Hey, I’m connected without a cell phone, okay? I’m on the Internet right now. Also, that isn’t even actual connection. It’s just a load of sh–

Surveillance capitalism has come to shape our politics and culture in ways many people don’t perceive.

[reporter 6] ISIS inspired followers online, and now white supremacists are doing the same.

Recently in India, Internet lynch mobs have killed a dozen people, including these five…

[reporter 7] It’s not just fake news; it’s fake news with consequences.

[reporter 8] How do you handle an epidemic in the age of fake news?

Can you get the coronavirus by eating Chinese food?

We have gone from the information age into the disinformation age.

Our democracy is under assault.

[man 4] What I said was, “I think the tools that have been created today are starting to erode the social fabric of how society works.”


[eerie instrumental music continues]

/the social dilemma_

[music fades]

[indistinct chatter]

[crew member] Fine.

[stage manager] Aza does welcoming remarks. We play the video. And then, “Ladies and gentlemen, Tristan Harris.”

[Tristan Harris] Right.

[stage manager] Great.

[Tristan] So, I come up, and… basically say, “Thank you all for coming.” Um… So, today, I wanna talk about a new agenda for technology. And why we wanna do that is because if you ask people, “What’s wrong in the tech industry right now?” there’s a cacophony of grievances and scandals, and “They stole our data.” And there’s tech addiction. And there’s fake news. And there’s polarization and some elections that are getting hacked. But is there something that is beneath all these problems that’s causing all these things to happen at once?

[stage manager speaking indistinctly]

[Tristan] Does this feel good?

[Randy Fernando] Very good. Yeah.

[Tristan] Um… [sighs] I’m just trying to… Like, I want people to see… Like, there’s a problem happening in the tech industry, and it doesn’t have a name, and it has to do with one source, like, one…

[eerie instrumental music playing]

[Tristan] When you look around you, it feels like the world is going crazy. You have to ask yourself, like, “Is this normal? Or have we all fallen under some kind of spell?”

[Tristan] I wish more people could understand how this works because it shouldn’t be something that only the tech industry knows. It should be something that everybody knows.

[backpack zips]

[Tristan] [softly] Bye.

[guard] Here you go, sir.

[employee] Hello!

[Tristan] Hi. Tristan. Nice to meet you.

It’s Tris-tan, right?

[Tristan] Yes.

Awesome. Cool.

[presenter] Tristan Harris is a former design ethicist for Google and has been called the closest thing Silicon Valley has to a conscience.

[reporter] He’s asking tech to bring what he calls “ethical design” to its products.

[Anderson Cooper] It’s rare for a tech insider to be so blunt, but Tristan Harris believes someone needs to be.

[Tristan] When I was at Google, I was on the Gmail team, and I just started getting burnt out ’cause we’d had so many conversations about… you know, what the inbox should look like and what color it should be, and… And I, you know, felt personally addicted to e-mail, and I found it fascinating there was no one at Gmail working on making it less addictive. And I was like, “Is anybody else thinking about this? I haven’t heard anybody talk about this.” And I was feeling this frustration… [sighs] …with the tech industry, overall, that we’d kind of, like, lost our way.

[ominous instrumental music playing]

[message alerts chiming]

[Tristan] You know, I really struggled to try and figure out how, from the inside, we could change it.

[energetic piano music playing]

[Tristan] And that was when I decided to make a presentation, kind of a call to arms. Every day, I went home and I worked on it for a couple hours every single night.

[typing]

[Tristan] It basically just said, you know, never before in history have 50 designers—20- to 35-year-old white guys in California—made decisions that would have an impact on two billion people. Two billion people will have thoughts that they didn’t intend to have because a designer at Google said, “This is how notifications work on that screen that you wake up to in the morning.” And we have a moral responsibility, as Google, for solving this problem. And I sent this presentation to about 15, 20 of my closest colleagues at Google, and I was very nervous about it. I wasn’t sure how it was gonna land. When I went to work the next day, most of the laptops had the presentation open. Later that day, there was, like, 400 simultaneous viewers, so it just kept growing and growing. I got e-mails from all around the company. I mean, people in every department saying, “I totally agree.” “I see this affecting my kids.” “I see this affecting the people around me.” “We have to do something about this.” It felt like I was sort of launching a revolution or something like that. Later, I found out Larry Page had been notified about this presentation in three separate meetings that day.

[indistinct chatter]

[Tristan] And so, it created this kind of cultural moment that Google needed to take seriously.

[whooshing]

[Tristan] And then… nothing.

[whooshing fades]

[message alerts chiming]


[Tim Kendall] Everyone in 2006… including all of us at Facebook, just had total admiration for Google and what Google had built, which was this incredibly useful service that did, far as we could tell, lots of goodness for the world, and they built this parallel money machine. We had such envy for that, and it seemed so elegant to us… and so perfect. Facebook had been around for about two years, um, and I was hired to come in and figure out what the business model was gonna be for the company. I was the director of monetization. The point was, like, “You’re the person who’s gonna figure out how this thing monetizes.” And there were a lot of people who did a lot of the work, but I was clearly one of the people who was pointing towards… “Well, we have to make money, A… and I think this advertising model is probably the most elegant way.”


[bright instrumental music playing]

Uh-oh. What’s this video Mom just sent us?

Oh, that’s from a talk show, but that’s pretty good. Guy’s kind of a genius. He’s talking all about deleting social media, which you gotta do.

I might have to start blocking her e-mails. I don’t even know what she’s talking about, man. She’s worse than I am.

No, she only uses it for recipes.

Right, and work.

And workout videos.

[guy] And to check up on us.

And everyone else she’s ever met in her entire life.

[Whoopi Goldberg, The View] If you are scrolling through your social media feed while you’re watchin’ us, you need to put the damn phone down and listen up ’cause our next guest has written an incredible book about how much it’s wrecking our lives. Please welcome author of Ten Arguments for Deleting Your Social Media Accounts Right Now…

[Sunny Hostin] Uh-huh.

[Whoopi Goldberg] …Jaron Lanier.

[cohosts speaking indistinctly]

[Jaron Lanier] Companies like Google and Facebook are some of the wealthiest and most successful of all time. Uh, they have relatively few employees. They just have this giant computer that rakes in money, right? Uh… Now, what are they being paid for? [chuckles] That’s a really important question.

[Roger McNamee] So, I’ve been an investor in technology for 35 years. The first 50 years of Silicon Valley, the industry made products– hardware, software– sold ’em to customers. Nice, simple business. For the last ten years, the biggest companies in Silicon Valley have been in the business of selling their users.

[Aza Raskin] It’s a little even trite to say now, but… because we don’t pay for the products that we use, advertisers pay for the products that we use. Advertisers are the customers. We’re the thing being sold.

[Tristan] The classic saying is: “If you’re not paying for the product, then you are the product.”

A lot of people think, you know, “Oh, well, Google’s just a search box, and Facebook’s just a place to see what my friends are doing and see their photos.” But what they don’t realize is they’re competing for your attention.

So, you know, Facebook, Snapchat, Twitter, Instagram, YouTube, companies like this, their business model is to keep people engaged on the screen.

[Tim] Let’s figure out how to get as much of this person’s attention as we possibly can. How much time can we get you to spend? How much of your life can we get you to give to us?

[Justin Rosenstein] When you think about how some of these companies work, it starts to make sense. There are all these services on the Internet that we think of as free, but they’re not free. They’re paid for by advertisers. Why do advertisers pay those companies? They pay in exchange for showing their ads to us. We’re the product. Our attention is the product being sold to advertisers.

[Jaron] That’s a little too simplistic. It’s the gradual, slight, imperceptible change in your own behavior and perception that is the product. And that is the product. It’s the only possible product. There’s nothing else on the table that could possibly be called the product. That’s the only thing there is for them to make money from. Changing what you do, how you think, who you are. It’s a gradual change. It’s slight. If you can go to somebody and you say, “Give me $10 million, and I will change the world one percent in the direction you want it to change…” It’s the world! That can be incredible, and that’s worth a lot of money.

[Shoshana Zuboff, PhD] This is what every business has always dreamt of: to have a guarantee that if it places an ad, it will be successful. That’s their business. They sell certainty. In order to be successful in that business, you have to have great predictions. Great predictions begin with one imperative: you need a lot of data.

[Tristan] Many people call this surveillance capitalism, capitalism profiting off of the infinite tracking of everywhere everyone goes by large technology companies whose business model is to make sure that advertisers are as successful as possible.

[Shoshana] This is a new kind of marketplace now. It’s a marketplace that never existed before. And it’s a marketplace that trades exclusively in human futures. Just like there are markets that trade in pork belly futures or oil futures. We now have markets that trade in human futures at scale, and those markets have produced the trillions of dollars that have made the Internet companies the richest companies in the history of humanity.

[indistinct chatter]

[Jeff Seibert] What I want people to know is that everything they’re doing online is being watched, is being tracked, is being measured. Every single action you take is carefully monitored and recorded. Exactly what image you stop and look at, for how long you look at it. Oh, yeah, seriously, for how long you look at it.

[monitors beeping]

[Tristan] They know when people are lonely. They know when people are depressed. They know when people are looking at photos of your ex-romantic partners. They know what you’re doing late at night. They know the entire thing. Whether you’re an introvert or an extrovert, or what kind of neuroses you have, what your personality type is like.

[Shoshana] They have more information about us than has ever been imagined in human history. It is unprecedented.

[Sandy Parakilas] And so, all of this data that we’re… that we’re just pouring out all the time is being fed into these systems that have almost no human supervision and that are making better and better and better and better predictions about what we’re gonna do and… and who we are.

[indistinct chatter]

[Aza] People have the misconception it’s our data being sold. It’s not in Facebook’s business interest to give up the data. What do they do with that data?

[console whirring]

[Aza] They build models that predict our actions, and whoever has the best model wins.

[AI] His scrolling speed is slowing. Nearing the end of his average session length. Decreasing ad load. Pull back on friends and family.

[Tristan] On the other side of the screen, it’s almost as if they had this avatar voodoo doll-like model of us.

All of the things we’ve ever done, all the clicks we’ve ever made, all the videos we’ve watched, all the likes, that all gets brought back into building a more and more accurate model. The model, once you have it, you can predict the kinds of things that person does.
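To make the “model of us” idea above concrete, here is a toy logistic-regression sketch that predicts whether a user will watch the next video from a handful of behavioral signals. Everything in it is invented for illustration: the features, the training data, and the numbers. Real systems train on vastly more signal, but the shape of the prediction loop is the same.

```python
import math, random

# A toy behavioral model: given a few signals about the moment and the
# content, predict whether the user watches the next video.
# All features, data, and numbers here are invented.

def sigmoid(z: float) -> float:
    return 1 / (1 + math.exp(-z))

# Each row: (late_night, outrage_bait, from_friend) -> watched (1) or not (0)
history = [
    ((1, 1, 0), 1), ((0, 0, 1), 1), ((1, 0, 0), 0),
    ((0, 1, 0), 1), ((1, 1, 1), 1), ((0, 0, 0), 0),
] * 50

w, b = [0.0, 0.0, 0.0], 0.0
for _ in range(200):  # plain stochastic gradient descent
    for x, y in random.sample(history, len(history)):
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        w = [wi - 0.1 * (p - y) * xi for wi, xi in zip(w, x)]
        b -= 0.1 * (p - y)

candidate = (1, 1, 0)  # late at night, outrage bait, not from a friend
p = sigmoid(sum(wi * xi for wi, xi in zip(w, candidate)) + b)
print(f"P(watch) = {p:.2f}")  # the better this prediction, the better the model
```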

[AI] Right, let me just test.

[Tristan] Where you’ll go. I can predict what kind of videos will keep you watching. I can predict what kinds of emotions tend to trigger you.

[Blue AI] Yes, perfect. The most epic fails of the year.

[crowd groans on video]

[whooshes]

[AI] Perfect. That worked. Following with another video. Beautiful. Let’s squeeze in a sneaker ad before it starts.

[Tristan] At a lot of technology companies, there’s three main goals. There’s the engagement goal: to drive up your usage, to keep you scrolling. There’s the growth goal: to keep you coming back and inviting as many friends and getting them to invite more friends. And then there’s the advertising goal: to make sure that, as all that’s happening, we’re making as much money as possible from advertising.

[console beeps]

Each of these goals are powered by algorithms whose job is to figure out what to show you to keep those numbers going up.
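As a minimal sketch of how one ranking function might serve all three goals at once, consider the toy ranker below. The weights, field names, and candidate posts are all invented for illustration; the point is only that engagement, growth, and advertising can be blended into a single score that decides what appears next.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    p_engage: float    # predicted chance the user keeps scrolling/interacting
    p_invite: float    # predicted chance the post triggers an invite or share
    ad_revenue: float  # expected revenue if shown (0.0 for organic posts)

def score(c: Candidate, w_engage=1.0, w_growth=0.5, w_ads=0.8) -> float:
    # The "dials" mentioned just below would amount to tuning these weights.
    return w_engage * c.p_engage + w_growth * c.p_invite + w_ads * c.ad_revenue

def rank_feed(candidates: list[Candidate]) -> list[Candidate]:
    return sorted(candidates, key=score, reverse=True)

feed = rank_feed([
    Candidate("friend_photo", 0.6, 0.1, 0.00),
    Candidate("sneaker_ad",   0.2, 0.0, 0.03),
    Candidate("viral_video",  0.8, 0.3, 0.00),
])
print([c.post_id for c in feed])
```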

[Tim Kendall] We often talked about, at Facebook, this idea of being able to just dial that as needed. And, you know, we talked about having Mark have those dials. “Hey, I want more users in Korea today.” “Turn the dial.” “Let’s dial up the ads a little bit.” “Dial up monetization, just slightly.” And so, that happ– I mean, at all of these companies, there is that level of precision.

Dude, how– I don’t know how I didn’t get carded.

That ref just, like, sucked or something.

You got literally all the way…

That’s Rebecca. Go talk to her.

I know who it is.

Dude, yo, go talk to her.

[guy] I’m workin’ on it.

[AI] His calendar says he’s on a break right now. We should be live.

[AI] [sighs] Want me to nudge him?

[AI] Yeah, nudge away.

[console beeps]

[AI] “Your friend Tyler just joined. Say hi with a wave.”

[Engagement AI] Come on, Ben. Send a wave. [sighs]

You’re not… Go talk to her, dude.

[phone vibrates, chimes]

[Ben sighs]

[cell phone chimes]

[console beeps]

[AI] New link! All right, we’re on. [exhales]

[AI] Follow that up with a post from User 079044238820, Rebecca.

[AI] Good idea. GPS coordinates indicate that they’re in close proximity.

[Yellow AI] He’s primed for an ad. Auction time. Sold! To Deep Fade hair wax. We had 468 interested bidders. We sold Ben at 3.262 cents for an impression.
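A hedged sketch of the auction this scene dramatizes: hundreds of advertisers bid on a single impression and the highest bidder wins. The transcript doesn’t say which auction design is used, so the sealed-bid second-price rule below, like the bids themselves, is an assumption for illustration.

```python
import random

def run_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price). The winner pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[0][0], ranked[1][1]

# 468 interested bidders, as in the scene; bids in cents per impression.
bids = {f"advertiser_{i}": random.uniform(0.5, 3.2) for i in range(467)}
bids["deep_fade_hair_wax"] = 3.5  # invented bid so the scene's winner wins
winner, price = run_auction(bids)
print(f"Sold! {winner} buys the impression at {price:.3f} cents")
```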

[melancholy piano music playing]

[Ben sighs]

[Jaron] We’ve created a world in which online connection has become primary, especially for younger generations. And yet, in that world, any time two people connect, the only way it’s financed is through a sneaky third person who’s paying to manipulate those two people. So, we’ve created an entire global generation of people who are raised within a context where the very meaning of communication, the very meaning of culture, is manipulation. We’ve put deceit and sneakiness at the absolute center of everything we do.


“Any sufficiently advanced technology is indistinguishable from magic.”
—Arthur C. Clarke

[interviewer] Grab the…

[Tristan] Okay. Where’s it help to hold it?

[interviewer] Great.

[Tristan] Here?

[interviewer] Yeah.

[Tristan] How does this come across on camera if I were to do, like, this move–

[interviewer] We can–

[blows] Like that?

[interviewer laughs] What?

[Tristan] Yeah.

[interviewer] Do that again.

[Tristan] Exactly. Yeah. [blows] Yeah. No, it’s probably not… Like… yeah. I mean, this one is less…

[interviewer laughs] Larissa’s, like, actually freaking out over here.

[Tristan] Is that good?

[instrumental music playing]

[Tristan] I was, like, five years old when I learned how to do magic. And I could fool adults, fully-grown adults with, like, PhDs. Magicians were almost like the first neuroscientists and psychologists. Like, they were the ones who first understood how people’s minds work. They just, in real time, are testing lots and lots of stuff on people. A magician understands something, some part of your mind that we’re not aware of. That’s what makes the illusion work. Doctors, lawyers, people who know how to build 747s or nuclear missiles, they don’t know more about how their own mind is vulnerable. That’s a separate discipline. And it’s a discipline that applies to all human beings.

[Stanford University]

[Tristan] From that perspective, you can have a very different understanding of what technology is doing. When I was at the Stanford Persuasive Technology Lab, this is what we learned. How could you use everything we know about the psychology of what persuades people and build that into technology?

[teacher] Now, many of you in the audience are geniuses already. I think that’s true, but my goal is to turn you into a behavior-change genius.

[Sandy] There are many prominent Silicon Valley figures who went through that class– key growth figures at Facebook and Uber and… and other companies– and learned how to make technology more persuasive, Tristan being one.

[Tristan] Persuasive technology is just sort of design intentionally applied to the extreme, where we really want to modify someone’s behavior. We want them to take this action. We want them to keep doing this with their finger.

[Joe Toscano] You pull down and you refresh, it’s gonna be a new thing at the top. Pull down and refresh again, it’s new. Every single time. Which, in psychology, we call a positive intermittent reinforcement.
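The pull-to-refresh mechanic Joe describes maps onto a classic variable (intermittent) reward schedule. The toy simulation below, with an invented 30% payoff rate, shows the core of it: each refresh may or may not deliver something new, and that unpredictability is what makes the loop compulsive.

```python
import random

def refresh(p_new_content: float = 0.3) -> bool:
    """One pull-to-refresh: sometimes something new appears, sometimes not."""
    return random.random() < p_new_content

pulls = 20
hits = sum(refresh() for _ in range(pulls))
print(f"{hits}/{pulls} refreshes paid off")
# The unpredictability is the point: variable schedules produce far more
# persistent checking behavior than any fixed schedule would.
```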

[Tristan] You don’t know when you’re gonna get it or if you’re gonna get something, which operates just like the slot machines in Vegas. It’s not enough that you use the product consciously, I wanna dig down deeper into the brain stem and implant, inside of you, an unconscious habit so that you are being programmed at a deeper level. You don’t even realize it.

[teacher] A man, James Marshall…

[Tristan] Every time you see it there on the counter, and you just look at it, and you know if you reach over, it just might have something for you, so you play that slot machine to see what you got, right? That’s not by accident. That’s a design technique.

[teacher] He brings a golden nugget to an officer in the army in San Francisco. Mind you, the… the population of San Francisco was only…

[Jeff] Another example is photo tagging.

[teacher] The secret didn’t last.

[phone vibrates]

[Jeff] So, if you get an e-mail that says your friend just tagged you in a photo, of course you’re going to click on that e-mail and look at the photo. It’s not something you can just decide to ignore. This is deep-seated, like, human personality that they’re tapping into. What you should be asking yourself is: “Why doesn’t that e-mail contain the photo in it? It would be a lot easier to see the photo.”

[Tristan] When Facebook found that feature, they just dialed the hell out of that because they said, “This is gonna be a great way to grow activity. Let’s just get people tagging each other in photos all day long.”

[upbeat techno music playing]

[cell phone chimes]

[AI] He commented.

[Growth AI] Nice.

[AI] Okay, Rebecca received it, and she is responding.

[AI] All right, let Ben know that she’s typing so we don’t lose him.

[AI] Activating ellipsis.

[teacher continues speaking indistinctly]

[tense instrumental music playing]

[AI] Great, she posted.

[AI] He’s commenting on her comment about his comment on her post.

[AI] Hold on, he stopped typing.

[AI] Let’s autofill.

[AI] Emojis. He loves emojis.

[AI] He went with fire.

[clicks tongue, sighs]

[AI] I was rootin’ for eggplant.

[Tristan] There’s an entire discipline and field called “growth hacking.” Teams of engineers whose job is to hack people’s psychology so they can get more growth. They can get more user sign-ups, more engagement. They can get you to invite more people.

[Chamath Palihapitiya] After all the testing, all the iterating, all of this stuff, you know the single biggest thing we realized? Get any individual to seven friends in ten days. That was it.

[Sandy] Chamath was the head of growth at Facebook early on, and he’s very well known in the tech industry for pioneering a lot of the growth tactics that were used to grow Facebook at incredible speed. And those growth tactics have then become the standard playbook for Silicon Valley. They were used at Uber and at a bunch of other companies. One of the things that he pioneered was the use of scientific A/B testing of small feature changes. Companies like Google and Facebook would roll out lots of little, tiny experiments that they were constantly doing on users. And over time, by running these constant experiments, you… you develop the most optimal way to get users to do what you want them to do. It’s… It’s manipulation.
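Sandy’s description of constant experimentation corresponds to standard A/B testing. The sketch below shows the loop in miniature, under invented sign-up rates and a hypothetical assignment rule: randomize users into arms, measure the target metric, keep the winner.

```python
import random

def assign(user_id: int) -> str:
    return "variant" if user_id % 2 else "control"

true_rate = {"control": 0.10, "variant": 0.12}  # variant nudges sign-ups up
signups = {"control": 0, "variant": 0}
counts = {"control": 0, "variant": 0}

for user_id in range(100_000):
    arm = assign(user_id)
    counts[arm] += 1
    if random.random() < true_rate[arm]:
        signups[arm] += 1

for arm in ("control", "variant"):
    print(f"{arm}: {signups[arm] / counts[arm]:.2%} sign-up rate")
# Repeat this loop thousands of times across every feature, and the product
# converges on whatever most effectively moves the chosen metric.
```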

[interviewer] Uh, you’re making me feel like a lab rat.

[Sandy] You are a lab rat. We’re all lab rats. And it’s not like we’re lab rats for developing a cure for cancer. It’s not like they’re trying to benefit us. Right? We’re just zombies, and they want us to look at more ads so they can make more money.

[Shoshana] Facebook conducted what they called “massive-scale contagion experiments.”

[AI] Okay.

[Shoshana] How do we use subliminal cues on the Facebook pages to get more people to go vote in the midterm elections? And they discovered that they were able to do that. One thing they concluded is that we now know we can affect real-world behavior and emotions without ever triggering the user’s awareness. They are completely clueless.

[Tristan] We’re pointing these engines of AI back at ourselves to reverse-engineer what elicits responses from us. Almost like you’re stimulating nerve cells on a spider to see what causes its legs to respond. So, it really is this kind of prison experiment where we’re just, you know, roping people into the matrix, and we’re just harvesting all this money and… and data from all their activity to profit from. And we’re not even aware that it’s happening.

[Chamath] So, we want to psychologically figure out how to manipulate you as fast as possible and then give you back that dopamine hit. We did that brilliantly at Facebook. Instagram has done it. WhatsApp has done it. You know, Snapchat has done it. Twitter has done it.

[Sean Parker] I mean, it’s exactly the kind of thing that a… that a hacker like myself would come up with because you’re exploiting a vulnerability in… in human psychology. [chuckles] And I just… I think that we… you know, the inventors, creators… uh, you know, and it’s me, it’s Mark, it’s the… you know, Kevin Systrom at Instagram… It’s all of these people… um, understood this consciously, and we did it anyway.

[Tristan] No one got upset when bicycles showed up. Right? Like, if everyone’s starting to go around on bicycles, no one said, “Oh, my God, we’ve just ruined society. [chuckles] Like, bicycles are affecting people. They’re pulling people away from their kids. They’re ruining the fabric of democracy. People can’t tell what’s true.” Like, we never said any of that stuff about a bicycle. If something is a tool, it genuinely is just sitting there, waiting patiently. If something is not a tool, it’s demanding things from you. It’s seducing you. It’s manipulating you. It wants things from you. And we’ve moved away from having a tools-based technology environment to an addiction- and manipulation-based technology environment. That’s what’s changed. Social media isn’t a tool that’s just waiting to be used. It has its own goals, and it has its own means of pursuing them by using your psychology against you.

[ominous instrumental music playing]

“There are only two industries that call their customers ‘users’:
illegal drugs and software.”
—Edward Tufte

[Tim] Rewind a few years ago, I was the… I was the president of Pinterest. I was coming home, and I couldn’t get off my phone once I got home, despite having two young kids who needed my love and attention. I was in the pantry, you know, typing away on an e-mail or sometimes looking at Pinterest. I thought, “God, this is classic irony. I am going to work during the day and building something that then I am falling prey to.” And I couldn’t… I mean, some of those moments, I couldn’t help myself.

[notification chimes]

[woman gasps]

[Aza Raskin] The one that I’m… I’m most prone to is Twitter. Uh, used to be Reddit. I actually had to write myself software to break my addiction to reading Reddit.

[notifications chime]

[slot machines whir]

[Tristan] I’m probably most addicted to my e-mail. I mean, really. I mean, I… I feel it.

[notifications chime]

[woman gasps]

[electricity crackles]

[Tim] Well, I mean, it’s sort– it’s interesting that knowing what was going on behind the curtain, I still wasn’t able to control my usage. So, that’s a little scary.

[Jeff] Even knowing how these tricks work, I’m still susceptible to them. I’ll still pick up the phone, and 20 minutes will disappear.

[notifications chime]

[fluid rushes]

[woman gasps]

[Roger] Do you check your smartphone before you pee in the morning or while you’re peeing in the morning? ‘Cause those are the only two choices.

[Tim] I tried through willpower, just pure willpower… “I’ll put down my phone, I’ll leave my phone in the car when I get home.” I think I told myself a thousand times, a thousand different days, “I am not gonna bring my phone to the bedroom,” and then 9:00 p.m. rolls around. “Well, I wanna bring my phone in the bedroom.” [takes a deep breath] And so, that was sort of… Willpower was kind of attempt one, and then attempt two was, you know, brute force.

[announcer] Introducing the Kitchen Safe. The Kitchen Safe is a revolutionary, new, time-locking container that helps you fight temptation. All David has to do is place those temptations in the Kitchen Safe. Next, he rotates the dial to set the timer. And, finally, he presses the dial to activate the lock. The Kitchen Safe is great…

We have that, don’t we?

…video games, credit cards, and cell phones.

Yeah, we do.

[announcer] Once the Kitchen Safe is locked, it cannot be opened until the timer reaches zero.

[Dr. Anna Lembke] So, here’s the thing. Social media is a drug. I mean, we have a basic biological imperative to connect with other people. That directly affects the release of dopamine in the reward pathway. Millions of years of evolution, um, are behind that system to get us to come together and live in communities, to find mates, to propagate our species. So, there’s no doubt that a vehicle like social media, which optimizes this connection between people, is going to have the potential for addiction.

Mmm! [laughs]

Dad, stop!

I have, like, 1,000 more snips to send before dinner.

[dad] Snips?

I don’t know what a snip is.

Mm, that smells good, baby.

All right. Thank you. I was, um, thinking we could use all five senses to enjoy our dinner tonight. So, I decided that we’re not gonna have any cell phones at the table tonight. So, turn ’em in.

Really?

[mom] Yep.

All right.

Thank you. Ben?

Okay.

Mom, the phone pirate. [scoffs]

Got it.

Mom!

So, they will be safe in here until after dinner… and everyone can just chill out.

[safe whirs]

Okay?

[Cass sighs]

[notification chimes]

Can I just see who it is?

No.

Just gonna go get another fork.

Thank you.

Honey, you can’t open that.

I locked it for an hour, so just leave it alone.

So, what should we talk about?

Well, we could talk about the, uh, Extreme Center wackos I drove by today.

[mom] Please, Frank.

What?

[mom] I don’t wanna talk about politics.

What’s wrong with the Extreme Center?

See? He doesn’t even get it.

It depends on who you ask.

It’s like asking, “What’s wrong with propaganda?”

[safe smashes]

[mom and Frank scream]

[Frank] Isla!

Oh, my God.

[sighs] Do you want me to…

[mom] Yeah.

[Anna] I… I’m worried about my kids. And if you have kids, I’m worried about your kids. Armed with all the knowledge that I have and all of the experience, I am fighting my kids about the time that they spend on phones and on the computer. I will say to my son, “How many hours do you think you’re spending on your phone?” He’ll be like, “It’s, like, half an hour. It’s half an hour, tops.”

[Anna’s Son, James Lembke, Age 15] I’d say upwards of an hour, hour and a half.

[Mary] I looked at his screen time report a couple weeks ago. Three hours and 45 minutes.

[James] That… I don’t think that’s… No. Per day, on average?

[Mary] Yeah.

[James] Should I go get it right now?

[Anna] There’s not a day that goes by that I don’t remind my kids about the pleasure-pain balance, about dopamine deficit states, about the risk of addiction.

[Mary] Moment of truth. Two hours, 50 minutes per day. Let’s see.

[James] Actually, I’ve been using a lot today.

[Mary] Last seven days.

[James] That’s probably why.

[Mary] Instagram, six hours, 13 minutes. Okay, so my Instagram’s worse.

My screen’s completely shattered. Thanks, Cass.

What do you mean, “Thanks, Cass”?

You keep freaking Mom out about our phones when it’s not really a problem.

We don’t need our phones to eat dinner!

I get what you’re saying. It’s just not that big a deal. It’s not.

If it’s not that big a deal, don’t use it for a week.

[Ben sighs]

Yeah. Yeah, actually, if you can put that thing away for, like, a whole week… I will buy you a new screen.

Like, starting now?

[mom] Starting now.

Okay. You got a deal.

[mom] Okay.

Okay, you gotta leave it here, though, buddy.

All right, I’m plugging it in.

Let the record show… I’m backing away.

Okay.

You’re on the clock.

[Ben] One week.

Oh, my…

Think he can do it?

I don’t know. We’ll see. Just eat, okay?

Good family dinner!


[Tristan] These technology products were not designed by child psychologists who are trying to protect and nurture children. They were just designing to make these algorithms that were really good at recommending the next video to you or really good at getting you to take a photo with a filter on it.

[cell phone chimes]

[Tristan] It’s not just that it’s controlling where they spend their attention. Especially social media starts to dig deeper and deeper down into the brain stem and take over kids’ sense of self-worth and identity.

[notifications chiming]

[Tristan] We evolved to care about whether other people in our tribe… think well of us or not ’cause it matters. But were we evolved to be aware of what 10,000 people think of us? We were not evolved to have social approval being dosed to us every five minutes. That was not at all what we were built to experience.

[Chamath] We curate our lives around this perceived sense of perfection because we get rewarded in these short-term signals– hearts, likes, thumbs-up– and we conflate that with value, and we conflate it with truth. And instead, what it really is is fake, brittle popularity… that’s short-term and that leaves you even more, and admit it, vacant and empty than before you did it. Because then it forces you into this vicious cycle where you’re like, “What’s the next thing I need to do now? ‘Cause I need it back.” Think about that compounded by two billion people, and then think about how people react then to the perceptions of others. It’s just a… It’s really bad. It’s really, really bad.

[Jonathan Haidt, PhD] There has been a gigantic increase in depression and anxiety for American teenagers which began right around… between 2011 and 2013. The number of teenage girls out of 100,000 in this country who were admitted to a hospital every year because they cut themselves or otherwise harmed themselves, that number was pretty stable until around 2010, 2011, and then it begins going way up.

It’s up 62 percent for older teen girls. It’s up 189 percent for the preteen girls. That’s nearly triple. Even more horrifying, we see the same pattern with suicide. The older teen girls, 15 to 19 years old, they’re up 70 percent, compared to the first decade of this century. The preteen girls, who have very low rates to begin with, they are up 151 percent. And that pattern points to social media. Gen Z, the kids born after 1996 or so, those kids are the first generation in history that got on social media in middle school.

[thunder rumbling in distance]

[Jonathan] How do they spend their time? They come home from school, and they’re on their devices. A whole generation is more anxious, more fragile, more depressed.

[thunder rumbles]

[Isla gasps]

[Jonathan] They’re much less comfortable taking risks. The rates at which they get driver’s licenses have been dropping. The number who have ever gone out on a date or had any kind of romantic interaction is dropping rapidly. This is a real change in a generation. And remember, for every one of these, for every hospital admission, there’s a family that is traumatized and horrified.

“My God, what is happening to our kids?”

[Isla sighs]

[Tim] It’s plain as day to me. These services are killing people… and causing people to kill themselves.

[Tristan] I don’t know any parent who says, “Yeah, I really want my kids to be growing up feeling manipulated by tech designers, uh, manipulating their attention, making it impossible to do their homework, making them compare themselves to unrealistic standards of beauty.” Like, no one wants that. [chuckles] No one does. We… We used to have these protections. When children watched Saturday morning cartoons, we cared about protecting children. We would say, “You can’t advertise to these age children in these ways.” But then you take YouTube for Kids, and it gobbles up that entire portion of the attention economy, and now all kids are exposed to YouTube for Kids. And all those protections and all those regulations are gone.

[tense instrumental music playing]

[Tristan] We’re training and conditioning a whole new generation of people… that when we are uncomfortable or lonely or uncertain or afraid, we have a digital pacifier for ourselves that is kind of atrophying our own ability to deal with that.


[Chicago Anti-Trust Tech Conference]

[Tristan] Photoshop didn’t have 1,000 engineers on the other side of the screen, using notifications, using your friends, using AI to predict what’s gonna perfectly addict you, or hook you, or manipulate you, or allow advertisers to test 60,000 variations of text or colors to figure out what’s the perfect manipulation of your mind. This is a totally new species of power and influence.

I… I would say, again, the methods used to play on people’s ability to be addicted or to be influenced may be different this time, and they probably are different. They were different when newspapers came in and the printing press came in, and they were different when television came in, and you had three major networks and…

[Tristan] At the time.

At the time. That’s what I’m saying. But I’m saying the idea that there’s a new level and that new level has happened so many times before. I mean, this is just the latest new level that we’ve seen.

[Tristan] There’s this narrative that, you know, “We’ll just adapt to it. We’ll learn how to live with these devices, just like we’ve learned how to live with everything else.” And what this misses is there’s something distinctly new here.

[Randima (Randy) Fernando] Perhaps the most dangerous piece of all this is the fact that it’s driven by technology that’s advancing exponentially. Roughly, if you say from, like, the 1960s to today, processing power has gone up about a trillion times. Nothing else that we have has improved at anything near that rate. Like, cars are, you know, roughly twice as fast. And almost everything else is negligible. And perhaps most importantly, our human– our physiology, our brains have evolved not at all.

[Tristan] Human beings, at a mind and body and sort of physical level, are not gonna fundamentally change.

[indistinct chatter]

[chuckling] I know, but they…

[continues speaking indistinctly]

[camera shutter clicks]

[Tristan] We can do genetic engineering and develop new kinds of human beings, but realistically speaking, you’re living inside of hardware, a brain, that was, like, millions of years old, and then there’s this screen, and then on the opposite side of the screen, there’s these thousands of engineers and supercomputers that have goals that are different than your goals, and so, who’s gonna win in that game? Who’s gonna win?

[Yellow AI] How are we losing?

I don’t know.

Where is he? This is not normal.

[Green AI] Did I overwhelm him with friends and family content?

Probably.

Well, maybe it was all the ads.

No. Something’s very wrong.

Let’s switch to resurrection mode.

[Tristan] When you think of AI, you know, an AI’s gonna ruin the world, and you see, like, a Terminator, and you see Arnold Schwarzenegger.

[Terminator] I’ll be back.

[Tristan] You see drones, and you think, like, “Oh, we’re gonna kill people with AI.” And what people miss is that AI already runs today’s world right now.

[Justin Rosenstein] Even talking about “an AI” is just a metaphor. At these companies like… like Google, there’s just massive, massive rooms, some of them underground, some of them underwater, of just computers. Tons and tons of computers, as far as the eye can see. They’re deeply interconnected with each other and running extremely complicated programs, sending information back and forth between each other all the time. And they’ll be running many different programs, many different products on those same machines. Some of those things could be described as simple algorithms, some could be described as algorithms that are so complicated, you would call them intelligence.

[crew member sighs]

[Cathy O’Neil, PhD] I like to say that algorithms are opinions embedded in code… and that algorithms are not objective. Algorithms are optimized to some definition of success. So, if you can imagine, if a… if a commercial enterprise builds an algorithm to their definition of success, it’s a commercial interest. It’s usually profit.

[Jeff Seibert] You are giving the computer the goal state, “I want this outcome,” and then the computer itself is learning how to do it. That’s where the term “machine learning” comes from. And so, every day, it gets slightly better at picking the right posts in the right order so that you spend longer and longer in that product. And no one really understands what they’re doing in order to achieve that goal.
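Jeff’s point, that you state the goal and the machine finds its own way there, can be illustrated with a toy epsilon-greedy bandit. Everything below is invented: the content categories, the payoffs, the exploration rate. Note that nothing in the code tells the optimizer to favor outrage content; it simply discovers that outrage maximizes the metric it was given.

```python
import random

categories = ["friends", "news", "outrage", "ads"]
true_minutes = {"friends": 2.0, "news": 1.5, "outrage": 4.0, "ads": 0.5}

estimates = {c: 0.0 for c in categories}
shows = {c: 0 for c in categories}

for _ in range(10_000):
    if random.random() < 0.1:                     # explore occasionally
        choice = random.choice(categories)
    else:                                         # otherwise exploit the best
        choice = max(estimates, key=estimates.get)
    minutes = random.gauss(true_minutes[choice], 0.5)  # noisy session length
    shows[choice] += 1
    estimates[choice] += (minutes - estimates[choice]) / shows[choice]

print(max(estimates, key=estimates.get))  # converges on "outrage"
```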

[Bailey Richardson] The algorithm has a mind of its own, so even though a person writes it, it’s written in a way that you kind of build the machine, and then the machine changes itself.

[Sandy Parakilas] There’s only a handful of people at these companies, at Facebook and Twitter and other companies… There’s only a few people who understand how those systems work, and even they don’t necessarily fully understand what’s gonna happen with a particular piece of content. So, as humans, we’ve almost lost control over these systems. Because they’re controlling, you know, the information that we see, they’re controlling us more than we’re controlling them.

[console whirs]

[Growth AI] Cross-referencing him against comparables in his geographic zone. His psychometric doppelgangers. There are 13,694 people behaving just like him in his region.
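A hedged sketch of the “psychometric doppelganger” idea in this scene: represent each user as a behavior vector, rank everyone else by similarity, and try on Ben whatever re-engaged his nearest neighbors. The users, features, and numbers below are all invented.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Features: [sports, politics, memes, late-night usage], each 0..1
users = {
    "ben":    [0.9, 0.2, 0.7, 0.8],
    "user_a": [0.8, 0.3, 0.6, 0.9],
    "user_b": [0.1, 0.9, 0.2, 0.1],
    "user_c": [0.9, 0.1, 0.8, 0.7],
}

ben = users["ben"]
doppelgangers = sorted(
    (name for name in users if name != "ben"),
    key=lambda name: cosine(ben, users[name]),
    reverse=True,
)
print(doppelgangers[:2])  # whatever is trending with them gets tried on Ben
```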

[Blue AI] What’s trending with them?

[Yellow AI] We need something actually good for a proper resurrection, given that the typical stuff isn’t working. Not even that cute girl from school.

[Blue AI] My analysis shows that going political with Extreme Center content has a 62.3 percent chance of long-term engagement.

[Yellow AI] That’s not bad.

[Blue AI] [sighs] It’s not good enough to lead with.

[Growth AI] Okay, okay, so we’ve tried notifying him about tagged photos, invitations, current events, even a direct message from Rebecca. But what about User 01265923010?

[Blue AI] Yeah, Ben loved all of her posts. For months and, like, literally all of them, and then nothing.

[Growth AI] I calculate a 92.3 percent chance of resurrection with a notification about Ana.

[Blue AI] And her new friend.

[eerie instrumental music playing]

[cell phone vibrates]

[Ben] Oh, you gotta be kiddin’ me. Uh… [sighs] Okay. What?

[fanfare plays, fireworks pop]

[Blue AI] [claps] Bam! We’re back!

[Yellow AI] Let’s get back to making money, boys.

[Growth AI] Yes, and connecting Ben with the entire world.

[Blue AI] I’m giving him access to all the information he might like.

[Growth AI] Hey, do you guys ever wonder if, you know, like, the feed is good for Ben?

[Blue AI] No.

[Yellow AI] No. [chuckles slightly]

[Growth AI] [chuckles softly]

[“I Put a Spell on You” playing]

♪ I put a spell on you ♪

♪ ‘Cause you’re mine ♪

[vocalizing] ♪ Ah! ♪

♪ You better stop the things you do ♪

♪ I ain’t lyin’ ♪

♪ No, I ain’t lyin’ ♪

♪ You know I can’t stand it ♪

♪ You’re runnin’ around ♪

♪ You know better, Daddy ♪

♪ I can’t stand it
‘Cause you put me down ♪

♪ Yeah, yeah ♪

♪ I put a spell on you ♪

♪ Because you’re mine ♪

♪ You’re mine ♪

[Roger] So, imagine you’re on Facebook… and you’re effectively playing against this artificial intelligence that knows everything about you, can anticipate your next move, and you know literally nothing about it, except that there are cat videos and birthdays on it. That’s not a fair fight.

[Cass] Ben and Jerry, it’s time to go, bud! [sighs] Ben? [knocks lightly on door] Ben.

[Ben] Mm.

[Cass] Come on. School time. [claps] Let’s go.

[Ben sighs]


[Center for Humane Technology]

[excited chatter]

[tech] How you doing today?

[Tristan] Oh, I’m… I’m nervous.

[tech] Are ya?

[Tristan] Yeah. [chuckles]

[Tristan] We were all looking for the moment when technology would overwhelm human strengths and intelligence. When is it gonna cross the singularity, replace our jobs, be smarter than humans? But there’s this much earlier moment… when technology exceeds and overwhelms human weaknesses. This point being crossed is at the root of addiction, polarization, radicalization, outrage-ification, vanity-ification, the entire thing. This is overpowering human nature, and this is checkmate on humanity.

[Cass] [sighs deeply]

[door opens]

[Ben] I’m sorry. [sighs]

[seat belt clicks]

[engine starts]

[Jaron] One of the ways I try to get people to understand just how wrong feeds from places like Facebook are is to think about the Wikipedia. When you go to a page, you’re seeing the same thing as other people. So, it’s one of the few things online that we at least hold in common. Now, just imagine for a second that Wikipedia said, “We’re gonna give each person a different customized definition, and we’re gonna be paid by people for that.” So, Wikipedia would be spying on you. Wikipedia would calculate, “What’s the thing I can do to get this person to change a little bit on behalf of some commercial interest?” Right? And then it would change the entry. Can you imagine that? Well, you should be able to, ’cause that’s exactly what’s happening on Facebook. It’s exactly what’s happening in your YouTube feed.

[Justin] When you go to Google and type in “Climate change is,” you’re going to see different results depending on where you live. In certain cities, you’re gonna see it autocomplete with “climate change is a hoax.” In other cases, you’re gonna see “climate change is causing the destruction of nature.” And that’s a function not of what the truth is about climate change, but about where you happen to be Googling from and the particular things Google knows about your interests.

[Tristan] Even two friends who are so close to each other, who have almost the exact same set of friends, they think, you know, “I’m going to news feeds on Facebook. I’ll see the exact same set of updates.” But it’s not like that at all. They see completely different worlds because they’re based on these computers calculating what’s perfect for each of them.

[whistling over monitor]

[Roger] The way to think about it is it’s 2.7 billion Truman Shows. Each person has their own reality, with their own… facts.

[Clip from The Truman Show]
Why do you think that, uh, Truman has never come close to discovering the true nature of his world until now?
We accept the reality of the world with which we’re presented. It’s as simple as that.

[Roger] Over time, you have the false sense that everyone agrees with you, because everyone in your news feed sounds just like you. And that once you’re in that state, it turns out you’re easily manipulated, the same way you would be manipulated by a magician. A magician shows you a card trick and says, “Pick a card, any card.” What you don’t realize was that they’ve done a set-up, so you pick the card they want you to pick. And that’s how Facebook works. Facebook sits there and says, “Hey, you pick your friends. You pick the links that you follow.” But that’s all nonsense. It’s just like the magician. Facebook is in charge of your news feed.

[Rashida Richardson] We all simply are operating on a different set of facts. When that happens at scale, you’re no longer able to reckon with or even consume information that contradicts with that world view that you’ve created. That means we aren’t actually being objective, constructive individuals. [chuckles]

[crowd chanting] Open up your eyes, don’t believe the lies! Open up…

[Justin] And then you look over at the other side, and you start to think, “How can those people be so stupid? Look at all of this information that I’m constantly seeing. How are they not seeing that same information?” And the answer is, “They’re not seeing that same information.”

[crowd continues chanting] Open up your eyes, don’t believe the lies!

[shouting indistinctly]

[interviewer] What are Republicans like?

People that don’t have a clue.

The Democrat Party is a crime syndicate, not a real political party.

A huge new Pew Research Center study of 10,000 American adults finds us more divided than ever, with personal and political polarization at a 20-year high.

[pundit] You have more than a third of Republicans saying the Democratic Party is a threat to the nation, more than a quarter of Democrats saying the same thing about the Republicans.

[Justin] So many of the problems that we’re discussing, like, around political polarization exist in spades on cable television. The media has this exact same problem, where their business model, by and large, is that they’re selling our attention to advertisers. And the Internet is just a new, even more efficient way to do that.

[Guillaume Chaslot] At YouTube, I was working on YouTube recommendations. It worries me that an algorithm that I worked on is actually increasing polarization in society. But from the point of view of watch time, this polarization is extremely efficient at keeping people online.

[vlogger] The only reason these teachers are teaching this stuff is ’cause they’re getting paid to. It’s absolutely absurd.

[Cass] Hey, Benji. No soccer practice today?

[Ben] Oh, there is. I’m just catching up on some news stuff.

[vlogger] Do research. Anything that sways from the Extreme Center–

[Cass] Wouldn’t exactly call the stuff that you’re watching news.

[Ben] You’re always talking about how messed up everything is. So are they.

[Cass] But that stuff is just propaganda.

[vlogger] Neither is true. It’s all about what makes sense.

[Cass] Ben, I’m serious. That stuff is bad for you. You should go to soccer practice.

[Ben] Mm.

[Cass sighs]

[vlogger] I share this stuff because I care. I care that you are being misled, and it’s not okay. All right?

[Guillaume] People think the algorithm is designed to give them what they really want, only it’s not. The algorithm is actually trying to find a few rabbit holes that are very powerful, trying to find which rabbit hole is the closest to your interest. And then if you start watching one of those videos, then it will recommend it over and over again.

[Tristan] It’s not like anybody wants this to happen. It’s just that this is what the recommendation system is doing. So much so that Kyrie Irving, the famous basketball player, uh, said he believed the Earth was flat, and he apologized later because he blamed it on a YouTube rabbit hole.

[Kyrie Irving] You know, like, you click the YouTube link and it goes, like, how deep the rabbit hole goes.

[Tristan] When he later came on to NPR to say, “I’m sorry for believing this. I didn’t want to mislead people,” a bunch of students in a classroom were interviewed saying, “The round-Earthers got to him.”

[audience chuckles]

[Guillaume] The flat-Earth conspiracy theory was recommended hundreds of millions of times by the algorithm. It’s easy to think that it’s just a few stupid people who get convinced, but the algorithm is getting smarter and smarter every day. So, today, they are convincing the people that the Earth is flat, but tomorrow, they will be convincing you of something that’s false.

[reporter] On November 7th, the hashtag “Pizzagate” was born.

[Renée DiResta] Pizzagate… [clicks tongue] Oh, boy. Uh… [laughs] I still am not 100 percent sure how this originally came about, but the idea that ordering a pizza meant ordering a trafficked person. As the groups got bigger on Facebook, Facebook’s recommendation engine started suggesting to regular users that they join Pizzagate groups. So, if a user was, for example, anti-vaccine or believed in chemtrails or had indicated to Facebook’s algorithms in some way that they were prone to belief in conspiracy theories, Facebook’s recommendation engine would serve them Pizzagate groups. Eventually, this culminated in a man showing up with a gun, deciding that he was gonna go liberate the children from the basement of the pizza place that did not have a basement.

[officer 1] What were you doing?

[man] Making sure there was nothing there.

[officer 1] Regarding?

[man] Pedophile ring.

[officer 1] What?

[man] Pedophile ring.

[officer 2] He’s talking about Pizzagate.

[Renée] This is an example of a conspiracy theory that was propagated across all social networks. The social network’s own recommendation engine is voluntarily serving this up to people who had never searched for the term “Pizzagate” in their life.

[Tristan] There’s a study, an MIT study, that fake news on Twitter spreads six times faster than true news. What is that world gonna look like when one has a six-times advantage to the other one?
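Some back-of-envelope arithmetic on the study Tristan cites (Vosoughi, Roy, and Aral, Science, 2018, which found false stories reached 1,500 people about six times faster than true ones). Treating both cascades as exponential, the 60-hour figure for a true story below is an invented placeholder; only the 6x ratio comes from the study.

```python
import math

reach = 1500
hours_true = 60.0
hours_false = hours_true / 6  # the six-times speed advantage

for label, hours in [("true", hours_true), ("false", hours_false)]:
    rate = math.log(reach) / hours  # model: reach(t) = exp(rate * t)
    print(f"{label:5s} story: reaches {reach} people in {hours:4.0f}h "
          f"(growth rate {rate:.3f}/h)")
# In a feed ranked by recency and engagement, the faster cascade is what
# users actually see; the slower, truer one arrives after attention moves on.
```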

[Aza Raskin] You can imagine these things are sort of like… they… they tilt the floor of… of human behavior. They make some behavior harder and some easier. And you’re always free to walk up the hill, but fewer people do, and so, at scale, at society’s scale, you really are just tilting the floor and changing what billions of people think and do.

[Sandy] We’ve created a system that biases towards false information. Not because we want to, but because false information makes the companies more money than the truth. The truth is boring.

[Tristan] It’s a disinformation-for-profit business model. You make money the more you allow unregulated messages to reach anyone for the best price.

[vlogger] Because climate change? Yeah. It’s a hoax. Yeah, it’s real. That’s the point. The more they talk about it and the more they divide us, the more they have the power, the more…

[Tristan] Facebook has trillions of these news feed posts. They can’t know what’s real or what’s true… which is why this conversation is so critical right now.

[reporter 1] It’s not just COVID-19 that’s spreading fast. There’s a flow of misinformation online about the virus.

[reporter 2] The notion drinking water will flush coronavirus from your system is one of several myths about the virus circulating on social media.

[automated voice] The government planned this event, created the virus, and had a simulation of how the countries would react.

Coronavirus is a… a hoax.

[man] SARS, coronavirus. And look at when it was made. 2018.

I think the US government started this shit.

Nobody is sick. Nobody is sick. Nobody knows anybody who’s sick.

Maybe the government is using the coronavirus as an excuse to get everyone to stay inside because something else is happening.

Coronavirus is not killing people, it’s the 5G radiation that they’re pumping out.

[crowd shouting]

[Tristan] We’re being bombarded with rumors. People are blowing up actual physical cell phone towers. We see Russia and China spreading rumors and conspiracy theories.

[reporter 3] This morning, panic and protest in Ukraine as…

[Tristan] People have no idea what’s true, and now it’s a matter of life and death.

[woman] Those sources that are spreading coronavirus misinformation have amassed something like 52 million engagements.

You’re saying that silver solution would be effective.

Well, let’s say it hasn’t been tested on this strain of the coronavirus, but…

[Tristan] What we’re seeing with COVID is just an extreme version of what’s happening across our information ecosystem. Social media amplifies exponential gossip and exponential hearsay to the point that we don’t know what’s true, no matter what issue we care about.

[teacher] He discovers this.

[continues lecturing indistinctly]

[Rebecca whispers] Ben.

[Rebecca] Are you still on the team?

[Ben] Mm-hmm.

[Rebecca] Okay, well, I’m gonna get a snack before practice if you… wanna come.

[Ben] Hm?

[Rebecca] You know, never mind.

[footsteps fading]

[vlogger] Nine out of ten people are dissatisfied right now. The EC [Extreme Center] is like any political movement in history, when you think about it. We are standing up, and we are… we are standing up to this noise. You are my people. I trust you guys.

[Blue AI] The Extreme Center content is brilliant.

[Green AI] He absolutely loves it.

[Yellow AI] Running an auction. 840 bidders. He sold for 4.35 cents to a weapons manufacturer.

[Blue AI] Let’s promote some of these events. Upcoming rallies in his geographic zone later this week.

[Green AI] I’ve got a new vlogger lined up, too.

[chuckles]

[vlogger] And… and, honestly, I’m telling you, I’m willing to do whatever it takes. And I mean whatever. Subscribe…

[Cass] Ben?

[vlogger] …and also come back because I’m telling you, yo…

[knocking on door]

[vlogger] …I got some real big things comin’. Some real big things.

[Roger] One of the problems with Facebook is that, as a tool of persuasion, it may be the greatest thing ever created. Now, imagine what that means in the hands of a dictator or an authoritarian. If you want to control the population of your country, there has never been a tool as effective as Facebook.

[Cynthia M. Wong] Some of the most troubling implications of governments and other bad actors weaponizing social media, um, is that it has led to real, offline harm. I think the most prominent example that’s gotten a lot of press is what’s happened in Myanmar. In Myanmar, when people think of the Internet, what they are thinking about is Facebook. And what often happens is when people buy their cell phone, the cell phone shop owner will actually preload Facebook on there for them and open an account for them. And so when people get their phone, the first thing they open and the only thing they know how to open is Facebook.

[CBSN News] Well, a new bombshell investigation exposes Facebook’s growing struggle to tackle hate speech in Myanmar.

[crowd shouting]

[Cynthia] Facebook really gave the military and other bad actors a new way to manipulate public opinion and to help incite violence against the Rohingya Muslims that included mass killings, burning of entire villages, mass rape, and other serious crimes against humanity that have now led to 700,000 Rohingya Muslims having to flee the country.

[Renée Diresta] It’s not that highly motivated propagandists haven’t existed before. It’s that the platforms make it possible to spread manipulative narratives with phenomenal ease, and without very much money.

[Tristan] If I want to manipulate an election, I can now go into a conspiracy theory group on Facebook, and I can find 100 people who believe that the Earth is completely flat and think it’s all this conspiracy theory that we landed on the moon, and I can tell Facebook, “Give me 1,000 users who look like that.” Facebook will happily send me thousands of users that look like them that I can now hit with more conspiracy theories.
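
The "users who look like that" step Tristan describes is, mechanically, a similarity search over user feature vectors. The sketch below is a generic nearest-neighbor illustration with invented data and sizes, not Facebook's actual Lookalike Audiences implementation.

```python
# Generic "lookalike" targeting: find the users nearest to the centroid
# of a seed group. Feature vectors and population sizes are made up.
import numpy as np

def lookalike_audience(seed_users, all_users, n=1000):
    """Return indices of the n users most similar to the seed group."""
    centroid = seed_users.mean(axis=0)
    distances = np.linalg.norm(all_users - centroid, axis=1)
    return np.argsort(distances)[:n]

rng = np.random.default_rng(1)
seed = rng.normal(size=(100, 16))            # 100 members of one group
population = rng.normal(size=(100_000, 16))  # everyone else
audience = lookalike_audience(seed, population)  # 1,000 similar users
```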

[button clicks]

[Yellow AI] Sold for 3.4 cents an impression.

[Blue AI] New EC video to promote.

[Advertising/Yellow AI] Another ad teed up.
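
The per-impression prices in these scenes come from real-time auctions. Ad exchanges commonly run variants of a second-price auction, in which the highest bidder wins but pays the runner-up's bid; the sketch below is a minimal, hypothetical version with invented bidders and amounts, not any platform's actual exchange.

```python
# Minimal second-price auction for one ad impression. Bidders and bids
# are invented; real exchanges add price floors, fees, and targeting.
def run_auction(bids):
    """bids: bidder -> bid in cents. Highest bid wins at the second price."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"weapons_mfr": 4.40, "shoe_brand": 4.35, "vlogger_promo": 2.10}
winner, price = run_auction(bids)
print(f"{winner} wins the impression at {price} cents")  # pays 4.35
```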

[Justin] Algorithms and manipulative politicians are becoming so expert at learning how to trigger us, getting so good at creating fake news that we absorb as if it were reality, and confusing us into believing those lies. It’s as though we have less and less control over who we are and what we believe.

[ominous instrumental music playing]

[vlogger] …so they can pick sides. There’s lies here, and there’s lies over there. So they can keep the power, so they can control everything.

[police siren blaring]

[vlogger] They can control our minds, so that they can keep their secrets.

[crowd chanting]

[Tristan] Imagine a world where no one believes anything true. Everyone believes the government’s lying to them. Everything is a conspiracy theory. “I shouldn’t trust anyone. I hate the other side.” That’s where all this is heading.

[News] The political earthquakes in Europe continue to rumble. This time, in Italy and Spain.

[reporter] Overall, Europe’s traditional, centrist coalition lost its majority while far right and far left populist parties made gains.

[man shouts]

[crowd chanting]

[Police woman] Back up.

[radio beeps]

Okay, let’s go.

[police siren wailing]

[reporter] These accounts were deliberately, specifically attempting to sow political discord in Hong Kong.

[crowd shouting]

[sighs]

[Cass] All right, Ben.

[car doors lock]

[CNN] What does it look like to be a country whose entire diet is Facebook and social media?

[Maria A. Ressa] Democracy crumbled quickly. Six months.

[reporter 1] After that chaos in Chicago, violent clashes between protesters and supporters…

[reporter 2] Democracy is facing a crisis of confidence.

[Renée] What we’re seeing is a global assault on democracy.

[crowd shouting]

[Renée] Most of the countries that are targeted are countries that run democratic elections.

[Tristan] This is happening at scale. By state actors, by people with millions of dollars saying, “I wanna destabilize Kenya. I wanna destabilize Cameroon. Oh, Angola? That only costs this much.”

[reporter] An extraordinary election took place Sunday in Brazil.

[News] With a campaign that’s been powered by social media.

[crowd chanting in Portuguese]

[Tristan] We in the tech industry have created the tools to destabilize and erode the fabric of society in every country, all at once, everywhere.

[Joe Toscano] You have this in Germany, Spain, France, Brazil, Australia. Some of the most “developed nations” in the world are now imploding on each other, and what do they have in common?

[interviewer] Knowing what you know now, do you believe Facebook impacted the results of the 2016 election?

[Mark Zuckerberg] Oh, that’s… that is hard. You know, it’s… the… the reality is, well, there were so many different forces at play.

[CBSN News] Representatives from Facebook, Twitter, and Google are back on Capitol Hill for a second day of testimony about Russia’s interference in the 2016 election.

[Roger] The manipulation by third parties is not a hack. Right? The Russians didn’t hack Facebook. What they did was they used the tools that Facebook created for legitimate advertisers and legitimate users, and they applied it to a nefarious purpose.

[Tristan] It’s like remote-control warfare. One country can manipulate another one without actually invading its physical borders.

[reporter 1] We’re seeing violent images. It appears to be a dumpster being pushed around…

[Tristan] But it wasn’t about who you wanted to vote for. It was about sowing total chaos and division in society.

[reporter 2] Now, this was in Huntington Beach. A march…

[Tristan] It’s about making two sides who couldn’t hear each other anymore, who didn’t want to hear each other anymore, who didn’t trust each other anymore.

[reporter 3] This is a city where hatred was laid bare and transformed into racial violence.

[crowd shouting]

[indistinct shouting]

[men grunting]

[police siren blaring]

[Cass] Ben!

[Ben] Cassandra! Cass!

[Cass] Ben!

[officer 1] Come here! Come here! Arms up. Arms up. Get down on your knees. Now, down.

[crowd continues shouting]

[officer 2] Calm–

[Cass] Ben!

[officer 2] Hey! Hands up! Turn around. On the ground. On the ground!

[crowd echoing]

[melancholy piano music playing]

[siren continues wailing]

[Tristan] Do we want this system for sale to the highest bidder? For democracy to be completely for sale, where you can reach any mind you want, target a lie to that specific population, and create culture wars? Do we want that?

[Marco Rubio] We are a nation of people… that no longer speak to each other. We are a nation of people who have stopped being friends with people because of who they voted for in the last election. We are a nation of people who have isolated ourselves to only watch channels that tell us that we’re right.

[Jeff Flake] My message here today is that tribalism is ruining us. It is tearing our country apart. It is no way for sane adults to act.

[Roger] If everyone’s entitled to their own facts, there’s really no need for compromise, no need for people to come together. In fact, there’s really no need for people to interact. We need to have… some shared understanding of reality. Otherwise, we aren’t a country.

[Mark Zuckerberg] So, uh, long-term, the solution here is to build more AI tools that find patterns of people using the services that no real person would do.

[Cathy O’Neil] We are allowing the technologists to frame this as a problem that they’re equipped to solve. That is… That’s a lie. People talk about AI as if it will know truth. AI’s not gonna solve these problems. AI cannot solve the problem of fake news. Google doesn’t have the option of saying, “Oh, is this conspiracy? Is this truth?” Because they don’t know what truth is. They don’t have a… They don’t have a proxy for truth that’s better than a click.

[Tristan] If we don’t agree on what is true or that there is such a thing as truth, we’re toast. This is the problem beneath other problems because if we can’t agree on what’s true, then we can’t navigate out of any of our problems.

[ominous instrumental music playing]

[console droning]

[Growth AI] We should suggest Flat Earth Football Club.

[Engagement AI] Don’t show him sports updates. He doesn’t engage.

[AIs speaking indistinctly]

[music swells]

[Jaron] A lot of people in Silicon Valley subscribe to some kind of theory that we’re building some global super brain, and all of our users are just interchangeable little neurons, no one of which is important. And it subjugates people into this weird role where you’re just, like, this little computing element that we’re programming through our behavior manipulation for the service of this giant brain, and you don’t matter. You’re not gonna get paid. You’re not gonna get acknowledged. You don’t have self-determination. We’ll sneakily just manipulate you because you’re a computing node, so we need to program you ’cause that’s what you do with computing nodes.

[reflective instrumental music playing]

[Tristan] Oh, man. [sighs]

[Tristan] When you think about technology and it being an existential threat, you know, that’s a big claim, and… it’s easy to then, in your mind, think, “Okay, so, there I am with the phone… scrolling, clicking, using it. Like, where’s the existential threat? Okay, there’s the supercomputer. The other side of the screen, pointed at my brain, got me to watch one more video. Where’s the existential threat?”

[indistinct chatter]

[Tristan] It’s not about the technology being the existential threat. It’s the technology’s ability to bring out the worst in society… [chuckles] …and the worst in society being the existential threat. If technology creates… mass chaos, outrage, incivility, lack of trust in each other, loneliness, alienation, more polarization, more election hacking, more populism, more distraction and inability to focus on the real issues… that’s just society. [scoffs] And now society is incapable of healing itself and just devolving into a kind of chaos.

[Tristan] This affects everyone, even if you don’t use these products. These things have become digital Frankensteins that are terraforming the world in their image, whether it’s the mental health of children or our politics and our political discourse, without taking responsibility for taking over the public square. So, again, it comes back to–

[Dan Sullivan] And who do you think’s responsible?

[Tristan] I think we have to have the platforms be responsible: when they take over election advertising, they’re responsible for protecting elections. When they take over the mental health of kids or Saturday morning, they’re responsible for protecting Saturday morning.

[Tristan] The race to keep people’s attention isn’t going away. Our technology’s gonna become more integrated into our lives, not less. The AIs are gonna get better at predicting what keeps us on the screen, not worse at predicting what keeps us on the screen.

[Jon Tester] I… I am 62 years old, getting older every minute, the more this conversation goes on…

[crowd chuckles]

…but… but I will tell you that, um… I’m probably gonna be dead and gone, and I’ll probably be thankful for it, when all this shit comes to fruition. Because… Because I think that this scares me to death. Do… Do you… Do you see it the same way? Or am I overreacting to a situation that I don’t know enough about?

[interviewer] What are you most worried about?

[Tim] [sighs] I think, in the… in the shortest time horizon… civil war.

[Jaron] If we go down the current status quo for, let’s say, another 20 years… we probably destroy our civilization through willful ignorance. We probably fail to meet the challenge of climate change. We probably degrade the world’s democracies so that they fall into some sort of bizarre autocratic dysfunction. We probably ruin the global economy. Uh, we probably, um, don’t survive. You know, I… I really do view it as existential.

[helicopter blades whirring]

[Tristan] Is this the last generation of people that are gonna know what it was like before this illusion took place? Like, how do you wake up from the matrix when you don’t know you’re in the matrix?

[ominous instrumental music playing]

“Whether it is to be utopia or oblivion will be a touch-and-go relay race right up to the final moment…”
—Buckminster Fuller

[Tristan] A lot of what we’re saying sounds like it’s just this… one-sided doom and gloom. Like, “Oh, my God, technology’s just ruining the world and it’s ruining kids,” and it’s like… “No.” [chuckles] It’s confusing because it’s simultaneous utopia… and dystopia. Like, I could hit a button on my phone, and a car shows up in 30 seconds, and I can go exactly where I need to go. That is magic. That’s amazing.

[Justin] When we were making the like button, our entire motivation was, “Can we spread positivity and love in the world?” The idea that, fast-forward to today, teens would be getting depressed when they don’t have enough likes, or that it could be leading to political polarization, was nowhere on our radar.

[Joe] I don’t think these guys set out to be evil. It’s just the business model that has a problem.

[Alex Roetter] You could shut down the service and destroy whatever it is, $20 billion of shareholder value, and get sued and… But you can’t, in practice, put the genie back in the bottle. You can make some tweaks, but at the end of the day, you’ve gotta grow revenue and usage, quarter over quarter. It’s… The bigger it gets, the harder it is for anyone to change.

[Tristan] What I see is a bunch of people who are trapped by a business model, an economic incentive, and shareholder pressure that makes it almost impossible to do something else.

[Sandy] I think we need to accept that it’s okay for companies to be focused on making money. What’s not okay is when there’s no regulation, no rules, and no competition, and the companies are acting as sort of de facto governments. And then they’re saying, “Well, we can regulate ourselves.” I mean, that’s just a lie. That’s just ridiculous.

[Jaron] Financial incentives kind of run the world, so any solution to this problem has to realign the financial incentives.

[Joe] There’s no fiscal reason for these companies to change. And that is why I think we need regulation.

[Sandy] The phone company has tons of sensitive data about you, and we have a lot of laws that make sure they don’t do the wrong things. We have almost no laws around digital privacy, for example.

[Joe] We could tax data collection and processing the same way that you, for example, pay your water bill by monitoring the amount of water that you use. You tax these companies on the data assets that they have. It gives them a fiscal reason to not acquire every piece of data on the planet.
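
Joe's metered-tax idea reduces to one line of arithmetic: bill on volume collected, the way a water meter bills on volume used. The rate and figures in the sketch below are invented purely to show the shape of the incentive, not any proposed legislation.

```python
# Toy metered data tax: cost scales with how much is collected, giving
# a direct fiscal reason to collect less. All numbers are hypothetical.
TAX_PER_GB = 0.05  # dollars per gigabyte collected per year

def annual_data_tax(gb_per_user, users):
    return TAX_PER_GB * gb_per_user * users

# Collecting 2 GB per user across 2 billion users now has a price tag:
print(f"${annual_data_tax(2.0, 2_000_000_000):,.0f} per year")
```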

[Roger] The law runs way behind on these things, but what I know is the current situation exists not for the protection of users, but for the protection of the rights and privileges of these gigantic, incredibly wealthy companies. Are we always gonna defer to the richest, most powerful people? Or are we ever gonna say, “You know, there are times when there is a national interest. There are times when the interests of people, of users, is actually more important than the profits of somebody who’s already a billionaire”?

[Shoshana] These markets undermine democracy, and they undermine freedom, and they should be outlawed. This is not a radical proposal. There are other markets that we outlaw. We outlaw markets in human organs. We outlaw markets in human slaves. Because they have inevitable destructive consequences.

[Justin] We live in a world in which a tree is worth more, financially, dead than alive, in a world in which a whale is worth more dead than alive. For so long as our economy works in that way and corporations go unregulated, they’re going to continue to destroy trees, to kill whales, to mine the earth, and to continue to pull oil out of the ground, even though we know it is destroying the planet and we know that it’s going to leave a worse world for future generations. This is short-term thinking based on this religion of profit at all costs, as if somehow, magically, each corporation acting in its selfish interest is going to produce the best result. This has been affecting the environment for a long time. What’s frightening, and what hopefully is the last straw that will make us wake up as a civilization to how flawed this theory has been in the first place is to see that now we’re the tree, we’re the whale. Our attention can be mined. We are more profitable to a corporation if we’re spending time staring at a screen, staring at an ad, than if we’re spending that time living our life in a rich way. And so, we’re seeing the results of that. We’re seeing corporations using powerful artificial intelligence to outsmart us and figure out how to pull our attention toward the things they want us to look at, rather than the things that are most consistent with our goals and our values and our lives.

[static crackles]

[crowd cheering]

[Steve Jobs] What a computer is to me, is it’s the most remarkable tool that we’ve ever come up with. And it’s the equivalent of a bicycle for our minds.

[Aza] The idea of humane technology, that’s where Silicon Valley got its start. And we’ve lost sight of it because it became the cool thing to do, as opposed to the right thing to do.

[Bailey Richardson] The Internet was, like, a weird, wacky place. It was experimental. Creative things happened on the Internet, and certainly, they do still, but, like, it just feels like this, like, giant mall. [chuckles] You know, it’s just like, “God, there’s gotta be… there’s gotta be more to it than that.”

[man typing]

[Bailey] I guess I’m just an optimist. ‘Cause I think we can change what social media looks like and means.

[Justin] The way the technology works is not a law of physics. It is not set in stone. These are choices that human beings like myself have been making. And human beings can change those technologies.

[Tristan] And the question now is whether or not we’re willing to admit that those bad outcomes are coming directly as a product of our work. It’s that we built these things, and we have a responsibility to change it.

[static crackling]

[Tristan] The attention extraction model is not how we want to treat human beings.

[distorted] Is it just me or…

[distorted] Poor sucker.

[Tristan] The fabric of a healthy society depends on us getting off this corrosive business model.

[console beeps]

[gentle instrumental music playing]

[console whirs, grows quiet]

[Tristan] We can demand that these products be designed humanely. We can demand to not be treated as an extractable resource. The intention could be: “How do we make the world better?”

[Jaron] Throughout history, every single time something’s gotten better, it’s because somebody has come along to say, “This is stupid. We can do better.” [laughs] Like, it’s the critics that drive improvement. It’s the critics who are the true optimists.

[AI] [sighs] Hello.

[Tristan] [sighs] Um… I mean, it seems kind of crazy, right? It’s like the fundamental way that this stuff is designed… isn’t going in a good direction. [chuckles] Like, the entire thing. So, it sounds crazy to say we need to change all that, but that’s what we need to do.

[interviewer] Think we’re gonna get there?

[Tristan] We have to.

[tense instrumental music playing]

[interviewer] Um, it seems like you’re very optimistic.

[Justin] Is that how I sound?

[crew laughs]

[interviewer] Yeah, I mean…

[Justin] I can’t believe you keep saying that, because I’m like, “Really? I feel like we’re headed toward dystopia. I feel like we’re on the fast track to dystopia, and it’s gonna take a miracle to get us out of it.” And that miracle is, of course, collective will.

[Anna] I am optimistic that we’re going to figure it out, but I think it’s gonna take a long time. Because not everybody recognizes that this is a problem.

[Bailey] I think one of the big failures in technology today is a real failure of leadership, of, like, people coming out and having these open conversations about things that… not just what went well, but what isn’t perfect so that someone can come in and build something new.

[Tristan] At the end of the day, you know, this machine isn’t gonna turn around until there’s massive public pressure.

[Justin] By having these conversations and… and voicing your opinion, in some cases through these very technologies, we can start to change the tide. We can start to change the conversation.

[Jaron] It might sound strange, but it’s my world. It’s my community. I don’t hate them. I don’t wanna do any harm to Google or Facebook. I just want to reform them so they don’t destroy the world. You know?

[Justin] I’ve uninstalled a ton of apps from my phone that I felt were just wasting my time. All the social media apps, all the news apps, and I’ve turned off notifications on anything that was vibrating my leg with information that wasn’t timely and important to me right now. It’s for the same reason I don’t keep cookies in my pocket.

[Sandy] Reduce the number of notifications you get.

[Aza] Turn off notifications.

[Tristan] Turning off all notifications.

[Guillaume] I’m not using Google anymore, I’m using Qwant, which doesn’t store your search history.

[Jaron] Never accept a video recommended to you on YouTube. Always choose. That’s another way to fight.

[Guillaume] There are tons of Chrome extensions that remove recommendations.

[interviewer] You’re recommending something to undo what you made.

[Guillaume] [laughing] Yep.

[Renée] Before you share, fact-check, consider the source, do that extra Google. If it seems like it’s something designed to really push your emotional buttons, like, it probably is.

[Justin] Essentially, you vote with your clicks. If you click on clickbait, you’re creating a financial incentive that perpetuates this existing system.

[Cathy] Make sure that you get lots of different kinds of information in your own life. I follow people on Twitter that I disagree with because I want to be exposed to different points of view.

[Tristan] Notice that many people in the tech industry don’t give these devices to their own children.

[Alex] My kids don’t use social media at all.

[interviewer] Is that a rule, or is that a…

[Alex] That’s a rule.

[Tim] We are zealots about it. We’re… We’re crazy. And we don’t let our kids have really any screen time.

[Jonathan Haidt] I’ve worked out what I think are three simple rules, um, that make life a lot easier for families and that are justified by the research. So, the first rule is all devices out of the bedroom at a fixed time every night. Whatever the time is, half an hour before bedtime, all devices out. The second rule is no social media until high school. Personally, I think the age should be 16. Middle school’s hard enough. Keep it out until high school. And the third rule is work out a time budget with your kid. And if you talk with them and say, “Well, how many hours a day do you wanna spend on your device? What do you think is a good amount?” they’ll often say something pretty reasonable.

[Jaron] Well, look, I know perfectly well that I’m not gonna get everybody to delete their social media accounts, but I think I can get a few. And just getting a few people to delete their accounts matters a lot, and the reason why is that that creates the space for a conversation because I want there to be enough people out in the society who are free of the manipulation engines to have a societal conversation that isn’t bounded by the manipulation engines. So, do it! Get out of the system. Yeah, delete. Get off the stupid stuff. The world’s beautiful. Look. Look, it’s great out there. [laughs]

[birds singing]

[children playing and shouting]


 
