Meta Allegedly Hid Evidence of Social Media Harm — 23 November 2025 Updates

Court filings allege Meta hid research showing Facebook harmed users’ mental health. A 2025 update on social media accountability.

Raja Awais Ali

11/23/2025 · 2 min read


Meta Allegedly Hid Evidence of Social Media Harm — Court Filings Reveal

On 23 November 2025, U.S. court filings in a lawsuit brought by school districts revealed serious allegations that Meta Platforms concealed internal research showing that its social media platform Facebook negatively affects users’ mental health. The case highlights the ongoing debate over corporate responsibility and user safety.

According to the filings, Meta conducted a 2020 study code-named Project Mercury, in collaboration with a survey firm. The study examined users who deactivated their Facebook accounts for one week. The documents claim that users who temporarily stopped using the platform reported lower levels of depression, anxiety, loneliness, and social comparison compared to regular users. These findings suggested a measurable mental health benefit from stepping away from social media.

Despite these results, the filings allege that Meta neither continued the research nor published the findings. Internal discussions reportedly dismissed the negative outcomes as tainted by the “existing media narrative,” and the company decided to halt further research. The court documents also claim that even though the study suggested a causal link between Facebook use and negative mental health outcomes, Meta told the U.S. Congress it was unable to quantify the harm to teenage girls.

Meta’s Response

Meta stated that the study was terminated because of methodological limitations. A company spokesperson emphasized that Meta has “listened to parents, researched issues that matter most, and made real changes to protect teens.” Critics argue that suppressing such research points to a systematic effort to minimize criticism and maintain user engagement, possibly at the expense of user well-being.

Implications for Ethics and Public Health

Legal experts and advocates for digital well-being describe these revelations as deeply concerning. The allegations raise questions about the responsibility of social media companies to transparently report risks, especially to vulnerable and young populations. If the allegations are substantiated in court, Meta could face further legal challenges, heightened regulatory scrutiny, and serious reputational damage.

This case also underscores a broader societal concern: how technology companies balance corporate interests with user safety. It emphasizes the need for accountability, transparency, and ethical research to protect mental health in an era dominated by social media platforms.