Instagram Study: Big T-ob-ECH-o

The Flag Staff Contributor
Read Time: approx. 4:35

Instagram Study: Facebook’s own studies have revealed Instagram’s harmful consequences for teenagers’ mental health, spurring debate over regulating the platform’s usage and even over the validity of the research itself. Here’s what both sides are saying. To have stories like this and more delivered directly to your inbox, be sure to sign up for our newsletter.

Top Story: Instagram Study


On September 14, a trio of Wall Street Journal reporters released the “Facebook Files.” The lengthy exposé outlined how Facebook’s own in-depth research shows a significant teen mental-health issue that the Big Tech company plays down in public. More specifically, “For the past three years, Facebook has been conducting studies into how its photo-sharing app, Instagram, affects its millions of young users. Repeatedly, the company’s researchers found that Instagram is harmful for a sizable percentage of them, most notably teenage girls.” In a March 2020 slide posted to Facebook’s internal message board, researchers quantified the damage, saying “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” The thing is, as many of us know, Facebook’s main app and its subsidiaries like Instagram are wildly addictive. For teens, “They often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.” The description has prompted many to compare Facebook and Big Tech to Big Tobacco. As a result of the “Facebook Files,” the Senate Commerce Committee’s Subcommittee on Consumer Protection, Product Safety, and Data Security held a hearing with Facebook Global Head of Safety Antigone Davis this past Thursday. Davis said the Wall Street Journal mischaracterized the company’s research, spinning it into something bigger than it really is. Both lawmakers and commentators disagreed. Here’s what both sides had to say.

On The Right


Right-leaning commentators and outlets describe Facebook’s approach to the problem as nonchalant: the company, they argue, disregards its own research. Unfortunately, critics say, unless there is legitimate regulation, nothing will change. In fact, as Max Cherney of Barron’s (right-center) notes, “For the first time in 2020, Facebook generated more revenue than Intel, one of the world’s largest chipmakers.” This year, the company is projected to generate $40.61 billion in net profit at a margin of 34%.

“Facebook doesn’t care about doing the right thing at all,” Editorial Board, New York Post: “Facebook is once again on Capitol Hill trying to defend itself against the indefensible. In this case, knowing that the social-media giant’s photo-sharing app, Instagram, is harmful to its young users, particularly teenage girls, and doing nothing about it. … And worse, … the company’s global head of safety, Antigone Davis, attempted to convince the senators that the company protects kids. … Sorry, 13 percent of suicidal British teen girls and 6 percent of Americans desiring to kill themselves thanks to Instagram is pretty massive. … When Sen. Ted Cruz (R-Texas) asked if the company has changed its policies in light of the studies, Davis refused a clear answer. Instead, she insisted that ‘our products actually add value and enrich teens’ lives.’ Clearly false. … That’s the bottom line, isn’t it? Facebook just doesn’t care about doing the right thing.”

“Facebook Was Back in Front of Congress Today. The Hearing Was Ugly,” Max Cherney, Barron’s: “It’s hard to imagine Davis’ performance will ingratiate the company with Washington’s lawmakers. Senators compared Facebook to Big Tobacco … Sen. Ted Cruz (R., Texas) repeatedly battered Davis with questions such as whether the company had attempted to calculate the number of children’s suicides the platform has caused over the years. Davis said causality is a difficult problem to assess. Facebook’s Washington problems are real. Along with rivals such as Google parent Alphabet, and Apple, Facebook is staring down what feels like inevitable regulation. Europe has taken some steps already, and there is an appetite from both US political parties to do something—albeit for very different reasons. Those regulations could reshape Big Tech, or business could go on as usual. It all depends on what Congress, and ultimately the White House, can agree upon.”

“Facebook on the Hot Seat Once Again,” Staff, The Dispatch: “Pick any two lawmakers at random, and odds are they both want to regulate Facebook—for entirely different reasons. Some want to break it up because they say it’s too profitable; others want to break it up because it’s too powerful. Some want to revoke its Section 230 protections because they believe it censors too much; others want to revoke its Section 230 protections because it doesn’t censor enough. A Republican-controlled Federal Trade Commission fined Facebook $5 billion in 2019 after finding it ‘deceiv[ed] users about their ability to control the privacy of their personal information,’ and a Democratic FTC is now trying to prove the company violated antitrust law in its rise to the top. … In a lengthy statement released 12 days after the Journal’s report, Facebook said its research actually demonstrated many teens believe ‘using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced’ and attempted to downplay the significance of the leaked slides.”

On The Left


Left-leaning commentators and outlets generally agree with the sentiment above, with one exception from Alex Beam of The Boston Globe, included below. Most don’t think change is in the air, and in the meantime, Bloomberg’s Cathy O’Neil argues that Facebook’s own research doesn’t offer much to rely on.

“We Know What Facebook Knew, and When It Knew It. Now What?,” Kate Klonick, New York Times: “While calling out Facebook for its mistakes and omissions may seem like a win, berating the company for its flawed internal and external research projects does not mean this type of work will become more ethical or transparent. The outcome is that it doesn’t get done at all — not by Facebook or anyone else — and if it does, the results stay hidden. Other popular platforms are also part of the problem. … But it will take more than breathless reporting to make sure that reform happens in effective ways. That will require laws demanding transparency from platforms, a new agency to specialize in online issues, and more science. Whistle-blowing gets us halfway there. We have to do the rest.”

“In defense of Facebook,” Alex Beam, The Boston Globe: “I probably spend about 30 minutes a day on Facebook. No one has urged me to overthrow the Republic, overdose on ivermectin, or join a hate group. … I found the Journal outing quite underwhelming. … There are some very bad people on Facebook, and let’s stipulate that the company could, and probably will, devote more resources to booting them off the site. But it’s worth remembering that in many parts of the world Facebook is akin to a common carrier, a means of communication like the telephone company. … In a 2019 post, Facebook executive Andrew Bosworth wrote, ‘While Facebook may not be nicotine I think it is probably like sugar. Sugar is delicious and for most of us, there is a special place for it in our lives. But like all things, it benefits from moderation.’ Facebook’s critics could benefit from some moderation too.”

“Facebook’s Instagram Research Isn’t Anything Like Science,” Cathy O’Neil, Bloomberg: “I’d like to focus on something slightly different that should be a scandal, too: the quality of that internal research. … Facebook is right on one point: Its internal research doesn’t demonstrate much of anything. … That said, one can reach a pretty strong conclusion by observing the way Facebook has done research over the years: It’s afraid to know the truth. After all, why not do more studies? If it’s possible that your product is leading young women to kill themselves, wouldn’t you want to explore further, at least to clear your name? Why not let outside researchers use your data, to get a better answer faster? Instead, Facebook allows only tiny internal studies and tries to keep them under lock and key. Even if they leak, the company maintains deniability: The results are far from conclusive.”

Flag This: Instagram Study


According to somewhat dated polling from Pew Research Center, “Americans favor more, not less, regulation of major technology companies,” Monica Anderson writes. “According to a Center survey conducted in June, some 47% of US adults think the government should be regulating major technology companies more than it is now, while just 11% think these companies should be regulated less.” Additionally, “About two-thirds of Americans (64%) say social media have a mostly negative effect on the way things are going in the country today.”

Flag This: One nugget tucked into Kate Klonick’s article above is fascinating and helps contextualize the uphill battle Facebook and other tech platforms face in regard to moderating content. Recently, “Facebook announced it h