Facebook Whistleblower Wants More Government Regulation
An insider blows the whistle on Facebook. She says the company is putting profit over people. She wants Congress to pass laws regulating Facebook. And Facebook...agrees? What’s going on? Welcome to America Uncovered, I’m Chris Chappell. This episode is sponsored by Surfshark. Whenever you go online, especially to look into controversial topics, you should be using a VPN like Surfshark to protect your identity. Facebook has put profit over the wellbeing of society and we’re all doomed unless Congress fixes it. OK, that’s a bit of a simplification, but that’s the gist of what Facebook whistleblower Frances Haugen told US Senators last week. Like most people on Facebook, she was overly long-winded. No Aunt Arlene, I do not need an hour-by-hour breakdown of your vacation in Des Moines! Unfortunately, constant oversharing was not one of the complaints Haugen had about Facebook. Haugen was hired by Facebook in 2019 to be a member of its civic integrity team, which studied how Facebook’s algorithm increased misinformation or how the platform was used by bad actors. And I’m not just talking about bad actors like Rob Schneider. The mantra of Haugen’s team was “Serve the people’s interests first, not Facebook’s.” Haugen claims, however, that that ethos was not shared by the rest of the company. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.” Those profits, she says, are the result of an ad model where the more time users spend on Facebook, the more money it makes. She says this engagement-at-all-costs funding model has prevented Facebook management from making changes it knows are right. If the more time someone spends on Facebook, the more money it makes, then Aunt Arlene has made that company a fortune. Haugen said she left Facebook after her team was dissolved, a claim I’ll talk about later. And when she did, she took a ton of internal documents. 
She initially leaked them anonymously to the Wall Street Journal, which used them to publish a series of articles called The Facebook Files.
The Wall Street Journal concluded that Facebook “knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.” Haugen also shared the documents with the Securities and Exchange Commission, alleging that Facebook misled its investors, a claim I’ll also talk about later. And she shared them with CBS before revealing her identity on 60 Minutes last week. But if she *really* wanted to stick it to Facebook, she should have shared all of these documents on Twitter. Go ahead, Frances. Be petty. That’s what Twitter’s for! So here are Haugen’s three main critiques of her former employer: The first is that its photo-sharing app Instagram causes harm to teenagers, especially teenage girls. The second is that Facebook isn’t doing enough to combat misinformation and violence on its platform. And the third is that Facebook should be transparent about the research it’s doing, and it should open its data up to outside researchers. And the fourth is that pregnancy announcements get way too many likes, especially when half these people will wind up being terrible parents… okay, that’s actually *my* critique. But it’s worth addressing. So let’s start with Instagram allegedly being harmful to teenagers. There are a few leaked internal studies that have led to headlines like this. Senator Marsha Blackburn opened a hearing about protecting kids online with this shocking statistic. “66% of teen girls on Instagram and 40% of teen boys experience negative social comparisons. This is Facebook’s research.” That is shocking. Mainly because they’d feel way better about themselves if they learned to use the Clarendon filter. Look how gorgeous I look. Well… gorgeous-er. The statistics Senator Blackburn quoted come from this leaked internal slide from Facebook. If you look closely at this study though, you see that only 25 people participated.
15 people participated in focus groups, and 10 people participated in an online diary study. Then researchers also interviewed seven people out of the 10 who did the online diaries. And of those 25, not all of them were teens because the age range was 13 to 21. So in total, they based this study on fewer people than the number of kids on most high school football teams. But way more than the number of kids that came to my high school birthday parties. Now those numbers might be fine for Facebook’s internal market research. But it’s hardly a rigorous scientific study. Also keep in mind, these were self-reported results with no control group, which was a problem with this internal Facebook study as well. This study found that “Approximately 30% of teen girls felt that Instagram made dissatisfaction with their body worse.” Facebook even says in the notes on this survey that “The methodology is not fit to provide statistical estimates for the correlation between Instagram and mental health.” One psychology professor says that based on what was leaked, no one actually knows how Instagram affects teen mental health. To be fair, teens are so hormonal, no one knows how *anything* will affect their mental health. I once cried my eyes out to Avril Lavigne’s “Sk8er Boi.” As a teen, that is. And now Facebook is saying that their leaked research is being mischaracterized. Reporter: “I have a question about that. Do you deny that teenage girls are not having a good experience on Instagram?” Monika Bickert: “The majority of young people on Instagram are having a good experience and that is borne out by the documents that were stolen, including the Instagram youth survey of about 40 Instagram users. These were teens who were already struggling.” “What that survey says is that the majority of teens, who were surveyed, it’s a small number, who are struggling with these issues, it made things better or didn’t have a material change.
” Yeah, 40 users isn’t a lot. But she’s only looking at the study’s in-person participants. It says it also relied on more than 2,500 online surveys. She *is* right though that that study found most teens either didn’t think Instagram had a negative impact on how they felt about themselves, or thought it even had a good impact. Which seems right. Because after I scroll through Instagram for three straight hours, I feel absolutely nothing. I even lose feeling in my legs from sitting on the floor for too long. And that’s because I live in New York City, and I can’t fit a sofa in my apartment. Now it’s not just Facebook’s own studies that have looked into this. Outside studies have found both similar and contradictory results, meaning that the relationship between teen mental health and Instagram is more nuanced than what the headlines suggest. I know, nuance isn’t very popular these days—especially when you’re sharing it on Facebook. Ugh, this would have gotten more likes if it were a meme. This brings me to Haugen’s next point about how Facebook encourages content that makes us all more angry. More on that after this short break. Welcome back. Facebook whistleblower Frances Haugen took a trove of internal Facebook documents before she left the company earlier this year. One of the things she says those documents show is how Facebook manipulates users’ newsfeeds for the worse. “One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions.” I don’t know. I have a pretty easy time inspiring people... to envy. “The result has been more division, more harm, more lies, more threats, and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people.
” So basically, what you see in your newsfeed isn’t based on who shared what when—it’s based on what Facebook thinks you’ll be most interested in. Which makes me nervous considering how much ‘My Little Pony’ content I see in my feed. What are you trying to say, Facebook?! Facebook prioritizes content that gets more comments and reactions, but the content that causes people to comment and react most is often sensational and divisive. “Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” researchers said in internal memos. This was the case even after a 2018 algorithm change that Facebook said was designed to “encourage people to interact more with friends and family” and less with media publishers. This shouldn’t surprise anyone whose family looks like mine at Thanksgiving. I’m just lucky I made it out alive after Aunt Arlene started talking politics with Uncle Roger while carving the turkey. I should have known better than to intervene. What’s ironic is that Facebook used this algorithm change to defend itself, saying that it shows it cares about stopping divisive and violent content. Facebook CEO Mark Zuckerberg took to—yes, you guessed it—his Facebook account to say that the company made that change knowing it would decrease users’ time on the platform, but that they believed it was the right thing for people’s well-being. “The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.” Yeah, they’re not deliberately trying to make people angry and depressed. That’s the job of CNN and FOX News. 
But even if making people angry wasn’t Facebook’s intent with the algorithm change, that was the result, according to the Wall Street Journal.
It said documents show Facebook management decided to do nothing about it, even after some politicians and at least one media company reached out to Facebook. Those who reached out reportedly found that only negative and divisive posts got seen, and they were worried about the effect this was having on public discourse. So the fact that your pregnancy announcement got so many likes indicates that your baby is actually negative and divisive. For Haugen though, this change in the algorithm was personal. She said when Facebook recruited her in 2019, she agreed to join only if she could work on misinformation because she “lost” a friend to conspiracy theories. Wait, does that mean her friend went off the deep end mentally? Or that she was kidnapped by lizard people and is being held at the ice wall at the edge of the flat Earth? During her time at Facebook, Haugen was the lead product manager on Civic Misinformation, and later worked on counterespionage. She said her team was understaffed and overworked, but that when she complained to management, they brushed it off. “I was told, flat out, at Facebook we accomplish impossible things with far less resources than anyone thought possible.” Accomplishing something while on Facebook? Now *that’s* a conspiracy theory. After the 2020 election, Haugen claimed that there was a sigh of relief from within Facebook and a sense that they had come out on the other side. “And right after the election, a couple of weeks after the election, they told us, ‘we’re dissolving Civic Integrity.’” “Dissolving” may not be exactly the right word for it. Haugen later clarified that Facebook told her that Civic Integrity was so important, it wanted to incorporate it into its other teams. That’s also what Facebook says. “We didn’t dissolve that team. 
What we did was build a bigger team that that team became a part of.” For Haugen, and she says others on her team, it felt like the team had been disbanded because Facebook no longer cared about misinformation.
You know, because Donald Trump was the only person pushing misinformation. Joe Biden would never lie… For Haugen, rearranging her team was a line in the sand. But now something weird is happening. I’ll explain after this final break. Welcome back. Facebook whistleblower Frances Haugen said she decided to copy thousands of internal documents from the company’s employee forum Facebook Workplace and leak them to the press. She did that because she felt Facebook was hiding information about the dangers of its platform, and doing it for profit. After the Civic Integrity team that she was working on was incorporated into other parts of the company, she felt like she had to get outside help to right the ship. And that outside help she chose was Congress. Because whenever I want something accomplished, the first place I go for help is Congress. Assuming they’re not in the middle of one of their shutdowns. I especially go to Congress if I need help with something that involves technology. All kidding aside, a lot of these guys are too old to understand Facebook, even though Facebook is the platform for older people. They can’t even figure out the Wi-Fi without their grandkids’ help, not to mention Facebook’s complexities. For example: Sen. Orrin Hatch: “Well if so, how do you sustain a business model in which users don’t pay for your service?” Mark Zuckerberg: “Senator, we run ads.” Sen. Orrin Hatch: “I see.” You know you’re in trouble when Mark Zuckerberg is the less awkward one in the conversation. Haugen wants Facebook to make its algorithms available to the government and outside researchers. She believes that will keep Facebook on the straight and narrow. “Facebook’s closed design means it has no real oversight. Only Facebook knows how it personalizes your feed for you. At other large tech companies like Google, any independent researcher can download from the Internet the company’s search results and write papers about what they find, and they do.
” Facebook isn’t exactly denying this. Facebook’s vice president for content policy says they partner with outside researchers… but only ones they’ve handpicked. Reporter: “Do you keep some of the visibility from regulators?” Monika Bickert: “You know, I want to first answer the point about researchers, we actually partner with hundreds of researchers. We put out thousands of peer-reviewed research articles that our researchers have participated in.” Okay, but you handpicked them. That’s like a bank robber saying they didn’t rob any banks, and it just so happens all the detectives they handpicked didn’t find any evidence. Case closed! Meanwhile, Facebook is putting on hold its version of Instagram for 10- to 12-year-olds that it’s been developing. It says this will give it time to talk to stakeholders and “demonstrate the value and importance of this project.” It’s now focusing on creating opt-in parental controls for teens on its regular Instagram app. You know, because every teenager wants their mom snooping through their private messages to see which parties they went to when they were supposedly “studying at a friend’s house.” But when it comes to government regulation... Facebook is all for it! Yeah, weird, I know. Well, until you understand the reason... Zuckerberg said in his response to Haugen’s testimony that “We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress.” And he’s been saying this since at least 2019, when he published this op-ed. But you see, Zuckerberg doesn’t actually want government regulation of Facebook. He wants government regulation of the entire internet. 
You know, because everyone wants their government snooping through their private messages to see which parties they went to when they were supposedly “studying at a friend’s house.” Zuckerberg says that “By updating the rules for the Internet, we can preserve what’s best about it... while also protecting society from broader harms.” Yes, a heavily regulated internet. You know who might agree? Yeah, that makes you think, doesn’t it? Zuckerberg has called for regulation of the internet in four areas: “harmful content, election integrity, privacy and data portability.” Some members of Congress have taken Zuckerberg’s suggestion seriously, and are looking at how to regulate social media platforms. Right after their grandkids show them how the Wi-Fi works. One of the more popular ways being discussed is stripping away their immunity from being sued under Section 230 of the Communications Decency Act. “I’m also a firm supporter of amending Section 230. I think we should consider narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.” Section 230 was included in the Communications Decency Act of 1996—just as the internet was starting to take its modern form. The idea was that big platforms like America Online couldn’t possibly have the resources to monitor or control what users posted on their sites. Therefore, they wouldn’t be held liable. They couldn’t be sued if users posted harmful stuff on their platforms. Although they should have been sued for the number of discs they spammed us with. But now, Facebook *wants* government regulation of platforms. They *want* internet companies to be held accountable for what users post. Zuckerberg writes that “Internet companies should be accountable for enforcing standards on harmful content.” And that “Regulation could...require companies to build systems for keeping harmful content to a bare minimum.” And Facebook is even publishing ads like this, offering to help write the new regulations. Wow! Imagine how much money it will cost internet companies to police all of the content users post! Of course, Facebook could probably afford to do it. After all, they raked in more than $80 billion last year. Thanks mostly to our aunts.
But for a new social media competitor with limited resources, self-regulation would be almost impossible. Meaning it would be illegal for them to operate. It’s almost like the regulation Facebook wants to help Congress write would be a wall that keeps out competitors, so no one can ever do to Facebook what Facebook did to MySpace. Rest in power, Tom. So maybe...this Frances Haugen brouhaha is actually a blessing for Facebook, since it’s a perfect excuse to enact more regulation. Haugen’s lawyers have filed complaints with the SEC, claiming that Facebook lied to the public and its investors... and that Facebook’s public statements conflicted with what Facebook knew internally. But weirdly, even though Facebook could sue her for leaking internal documents to the press, Facebook is not suing her. Hmmm... Maybe Facebook is just being nice. And now the question for all of you is: to delete or not to delete? And this episode is sponsored by Surfshark. If you’re as wary about big tech as I am, it’s a good idea to use a VPN. Surfshark helps protect you from tracking by advertisers, the government, and big tech. Surfshark secures your data with top-of-the-line, uncrackable encryption and the most secure VPN protocols. And to further protect your privacy, Surfshark doesn’t keep logs of what you do. It even has a server network in the British Virgin Islands, where local laws allow companies to not keep user data logs. So if you need a VPN—and believe me, you do—go with Surfshark. Try it out with our special deal that includes 83% off of a 2-year plan plus 3 extra months for free. Go to Surfshark dot com slash uncovered. And for just $2.21 a month, you can get Surfshark on all your devices. Once again, I’m Chris Chappell. Thanks for watching America Uncovered.
The Wall Street Journal concluded that Facebook “knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.” Haugen also shared the documents with the Securities and Exchange Commission, alleging that Facebook misled its investors, a claim I’ll also talk about later. And she shared them with CBS before revealing her identity on 60 Minutes last week. But if she *really* wanted to stick it to Facebook, she should have shared all of these documents on Twitter. Go ahead, Frances. Be petty. That’s what Twitter’s for! So here are Haugen’s three main critiques of her former employer: The first is that its photo-sharing app Instagram causes harm to teenagers, especially teenage girls. The second is that Facebook isn’t doing enough to combat misinformation and violence on its platform. And the third is that Facebook should be transparent about the research it’s doing, and it should open its data up to outside researchers. And the fourth is that pregnancy announcements get way too many likes, especially when half these people will wind up being terrible parents… okay, that’s actually *my* critique. But it’s worth addressing. So let’s start with Instagram allegedly being harmful to teenagers. There are a few leaked internal studies that have led to headlines like this. Senator Marsha Blackburn opened a hearing about protecting kids online with this shocking statistic. “66% of teen girls on Instagram and 40% of teen boys experience negative social comparisons. This is Facebook’s research.” That is shocking. Mainly because they’d feel way better about themselves if they learned to use the Clarendon filter. Look how gorgeous I look. Well… gorgeous-er. The statistics Senator Blackburn quoted come from this leaked internal slide from Facebook. If you look closely at this study though, you see that only 25 people participated.
15 people participated in focus groups, and 10 people participated in an online diary study. Then researchers also interviewed seven people out of the 10 who did the online diaries. And of those 25, not all of them were teens because the age range was 13 to 21. So in total, they based this study on less than the number of kids on most high school football teams. But way more than the number of kids that came to my high school birthday parties. Now those numbers might be fine for Facebook’s internal market research. But it’s hardly a rigorous scientific study. Also keep in mind, these were self-reported results with no control group, which was a problem with this internal Facebook study as well. This study found that “Approximately 30% of teen girls felt that Instagram made dissatisfaction with their body worse.” Facebook even says in the notes on this survey that “The methodology is not fit to provide statistical estimates for the correlation between Instagram and mental health.” One psychology professor says that based on what was leaked, no one actually knows how Instagram affects teen mental health. To be fair, teens are so hormonal, no one knows how *anything* will affect their mental health. I once cried my eyes out to Avril Lavigne’s “Skater Boy.” As a teen, that is. And now Facebook is saying that their leaked research is being mischaracterized. Reporter: “I have a question about that. Do you deny that teenage girls are not having a good experience on Instagram.” Monika Bickert: “The majority of young people on Instagram are having a good experience and that is born out by the documents that were stolen, including the Instagram youth survey of about 40 Instagram users. These were teens who were already struggling.” “What that survey says is that the majority of teens, who were surveyed, it’s a small number, who are struggling with these issues, it made things better or didn’t have a material change.
” Yeah, 40 users isn’t a lot. But she’s only looking at the study’s in-person participants. It says it also relied on more than 2,500 online surveys. She *is* right though that that study found most teens either didn’t think Instagram had a negative impact on how they felt about themselves or even had a good impact on how they felt about themselves. Which seems right. Because after I scroll through Instagram for three straight hours, I feel absolutely nothing. I even lose feeling in my legs from sitting on the floor for too long. And that’s because I live in New York City, and I can’t fit a sofa in my apartment. Now it’s not just Facebook’s own studies that have looked into this. Outside studies show similar as well as contradictory results, meaning that the relationship between teen mental health and Instagram is more nuanced than what the headlines suggest. I know, nuance isn’t very popular these days— especially when you’re sharing it on Facebook. Ugh, this would have gotten more likes if it were a meme. This brings me to Haugen’s next point about how Facebook encourages content that makes us all more angry. More on that after this short break. Welcome back. Facebook whistleblower Frances Haugen took a trove of internal Facebook documents before she left the company earlier this year. One of the things she says those documents show is how Facebook manipulates users’ newsfeeds for the worse. “One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions.” I don’t know. I have a pretty easy time inspiring people... to envy. “The result has been more division, more harm, more lies, more threats, and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people.
” So basically, what you see in your newsfeed isn’t based on who shared what when—it’s based on what Facebook thinks you’ll be most interested in. Which makes me nervous considering how much ‘My Little Pony’ content I see in my feed. What are you trying to say, Facebook?! Facebook prioritizes content that gets more comments and reactions, but the content that causes people to comment and react most is often sensational and divisive.“Misinformation, toxicity, and violent content are inordinately prevalent among reshares” researchers said in internal memos. This was the case even after a 2018 algorithm change that Facebook said was designed to “encourage people to interact more with friends and family” and less with media publishers. This shouldn’t surprise anyone whose family looks like mine at Thanksgiving. I’m just lucky I made it out alive after Aunt Arlene started talking politics with Uncle Roger while carving the turkey. I should have known better than to intervene. What’s ironic is that Facebook used this algorithm change to defend itself, saying that it shows it cares about stopping divisive and violent content. Facebook CEO Mark Zuckerberg took to—yes, you guessed it—his Facebook account to say that the company made that change knowing it would decrease users’ time on the platform, but that they believed it was the right thing for people’s well-being.“The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.” Yeah, they’re not deliberately trying to make people angry and depressed. That’s the job of CNN and FOX News. 
But even if making people angry wasn’t Facebook’s intent with the algorithm change, that was the result, according to the Wall Street Journal.
It said documents show Facebook management decided to do nothing about it, even after some politicians and at least one media company reached out to Facebook. They reportedly found that only posts that were negative and divisive got seen, and they were worried about the effect this was having on public discourse. So the fact that your pregnancy announcement got so many likes, indicates that your baby is actually negative and divisive. For Haugen though, this change in the algorithm was personal. She said when Facebook recruited her in 2019, she agreed to join only if she could work on misinformation because she “lost” a friend to conspiracy theories. Wait, does that mean her friend went off the deep end mentally? Or that she was kidnapped by lizard people and is being held at the ice wall at the edge of the flat Earth? During her time at Facebook, Haugen was the lead product manager on Civic Misinformation, and later worked on counter espionage. She said her team was understaffed and overworked, but that when she complained to management, they brushed it off. “I was told, flat out, at Facebook we accomplish impossible things with far less resources than anyone thought possible.” Accomplishing something while on Facebook? Now *that’s* a conspiracy theory. After the 2020 election, Haugen claimed that there was a sigh of relief from within Facebook and a sense that they had come out on the other side. “And right after the election, a couple of weeks after the election, they told us, ‘we’re dissolving Civic Integrity.” “Dissolving” may not be exactly the right word for it. Haugen later clarified that Facebook told her that Civic Integrity was so important, it wanted to incorporate it into its other teams. That’s also what Facebook says. “We didn’t dissolve that team. 
What we did was build a bigger team that that team became a part of.” For Haugen, and she says others on her team, it felt like the team had been disbanded because Facebook no longer cared about misinformation.
You know, because Donald Trump was the only person pushing misinformation. Joe Biden would never lie… For Haugen, rearranging her team was a line in the sand. But now something weird is happening. I’ll explain after this final break. Welcome back. Facebook whistleblower Frances Haugen said she decided to copy thousands of internal documents from the company’s employee forum Facebook Workplace and leak them to the press. She did that because she felt Facebook was hiding information about the dangers of its platform, and doing it for profit. After the Civic Integrity team that she was working on was incorporated into other parts of the company, she felt like she had to get outside help to right the ship. And that outside help she chose was Congress. Because whenever I want something accomplished, the first people I go to for help is Congress. Assuming they’re not in the middle of one of their shutdowns. I especially go to Congress if I need help with something that involves technology. All kidding aside, a lot of these guys are too old to understand Facebook, even though Facebook is the platform for older people. They can’t even figure out the Wifi without their grandkids’ help, not to mention Facebook’s complexities. For example: Sen. Orrin Hatch: “Well if so, how do you sustain a business model in which users don’t pay for your service?” Mark Zuckerberg: “Senator, we run ads.” Sen. Orrin Hatch: “I see.” You know you’re in trouble when Mark Zuckerberg is the less awkward one in the conversation. Haugen wants Facebook to make its algorithms available to the government and outside researchers. She believes that will keep Facebook on the straight and narrow. “Facebook’s closed design means it has no real oversight. Only Facebook knows how it personalizes your feed for you. At other large tech companies like Google, any independent researcher can download from the Internet the company’s search results and write papers about what they find, and they do.
” Facebook isn’t exactly denying this. Facebook’s vice president for content policy says they partner with outside researchers… but only ones they’ve handpicked. Reporter: “Do you keep some of the visibility from regulators?” Monika Bickert: “You know, I want to first answer the point about researchers, we actually partner with hundreds of researchers. We put out thousands of peer-reviewed research articles that our researchers have participated in.” Okay, but you handpicked them. That’s like a bank robber saying they didn’t rob any banks, and it just so happens all the detectives they handpicked didn’t find any evidence. Case closed! Meanwhile, Facebook is putting on hold its version of Instagram for 10- to 12-year-olds that it’s been developing.It says this will give it time to talk to stakeholders and “demonstrate the value and importance of this project”. It’s now focusing on creating opt-in parental controls for teens on its regular Instagram app. You know, because every teenager wants their mom snooping through their private messages to see which parties they went to when they were supposedly “studying at a friend’s house.” But when it comes to government regulation….Facebook is all for it! Yeah, weird, I know. Well, until you understand the reason... Zuckerberg said in his response to Haugen’s testimony that, “We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress.” And he’s been saying this since at least since 2019 when he published this op-ed. But you see, Zuckerberg doesn’t actually want government regulation of Facebook. He wants government regulation of the entire internet. 
You know, because everyone wants their government snooping through their private messages to see which parties they went to when they were supposedly “studying at a friend’s house.” Zuckerberg says that by “updating the rules for the Internet, we can preserve what’s best about it... while also protecting society from broader harms.” Yes, a heavily regulated internet. You know who might agree? Yeah, that makes you think, doesn’t it? Zuckerberg has called for regulation of the internet in four areas: “harmful content, election integrity, privacy and data portability.” Some members of Congress have taken Zuckerberg’s suggestion seriously, and are looking at how to regulate social media platforms. Right after their grandkids show them how the Wi-Fi works. One of the more popular approaches being discussed is stripping away platforms’ immunity from being sued under Section 230 of the Communications Decency Act. “I’m also a firm supporter of amending section 230. I think we should consider narrowing this sweeping immunity when platforms’ algorithms amplify illegal conduct.” Section 230 was included in the Communications Decency Act of 1996—just as the internet was starting to take its modern form. The idea was that big platforms like America Online couldn’t possibly have the resources to monitor or control what users posted on their sites. Therefore, they wouldn’t be held liable. They couldn’t be sued if users posted harmful stuff on their platforms. Although they should have been sued for the number of discs they spammed us with. But now, Facebook *wants* government regulation of platforms. They *want* internet companies to be held accountable for what users post. Zuckerberg writes that “Internet companies should be accountable for enforcing standards on harmful content.” And that “Regulation could...require companies to build systems for keeping harmful content to a bare minimum.” And Facebook is even publishing ads like this, offering to help write the new regulations. Wow! Imagine how much money it will cost internet companies to police all of the content users post! Of course, Facebook could probably afford to do it. After all, they raked in more than $80 billion last year. Thanks mostly to our aunts.
But for a new social media competitor with limited resources, self-regulation would be almost impossible. Meaning it would be illegal for them to operate. It’s almost like the regulation Facebook wants to help Congress write would be a wall that keeps out competitors, so no one can ever do to Facebook what Facebook did to MySpace. Rest in power, Tom. So maybe...this Frances Haugen brouhaha is actually a blessing for Facebook, since it’s a perfect excuse to enact more regulation. Haugen’s lawyers have filed complaints with the SEC, claiming that Facebook lied to the public and its investors... and that Facebook's public statements conflicted with what Facebook knew internally. But weirdly, even though Facebook could sue her for leaking internal documents to the press, Facebook is not suing her. Hmmm... Maybe Facebook is just being nice. And now the question for all of you is: to delete or not to delete? And this episode is sponsored by Surfshark. If you’re as wary about big tech as I am, it’s a good idea to use a VPN. Surfshark helps protect you from tracking by advertisers, the government, and big tech. Surfshark secures your data with top of the line, uncrackable encryption and the most secure VPN protocols. And to further protect your privacy, Surfshark doesn’t keep logs of what you do. It even has a server network in the British Virgin Islands, where local laws allow companies to not keep user data logs. So if you need a VPN—and believe me, you do—go with Surfshark. Try it out with our special deal that includes 83% off of a 2-year plan plus 3 extra months for free. Go to Surfshark dot com slash uncovered. And for just $2.21 a month, you can get Surfshark on all your devices. Once again, I’m Chris Chappell. Thanks for watching America Uncovered.