A prominent whistleblower from the big‑tobacco era has warned that Meta and YouTube are deliberately designing digital products to be as addictive for children as cigarettes once were, drawing a direct parallel to the tactics that earlier defined the tobacco industry’s push into young markets. The whistleblower, referencing both internal documents and public‑facing features, argues that infinite scrolling, auto‑played videos, and hyper‑personalised recommendation engines are engineered to keep kids hooked far beyond casual use, with the primary aim of boosting ad revenue and engagement metrics.
How the analogy to big tobacco works
The whistleblower highlights that, just as tobacco companies once hid the addictive nature of nicotine and its health risks while targeting adolescents through marketing, tech giants now build features that exploit attention‑fatigue and reward‑cycle psychology in young users. Internal emails and research cited in recent US litigation show that Meta and YouTube were aware their platforms could trigger compulsive behaviour and mental‑health strain in minors, yet continued to prioritise growth‑oriented design rather than robust safeguards.
Legal experts following the landmark Los Angeles trial against Meta and YouTube have described this pattern as a “social‑media‑addiction reckoning” that mirrors the 1990s tobacco lawsuits, complete with whistleblowers, internal memos, and juries being asked to decide whether corporate intent caused demonstrable harm. The case centred on a 20‑year‑old plaintiff whose teenage years were dominated by Instagram and YouTube, with the jury ultimately finding that the companies’ designs were deliberately addictive and had materially harmed her mental health.
Broader implications for tech and regulation
The whistleblower’s stark warning—to the effect that Meta and YouTube “expected” children to become addicted, and were fine with that outcome as long as usage numbers rose—has become a rallying point for lawmakers and regulators pushing for stricter rules on data‑driven algorithms, under‑18 restrictions, and age‑appropriate defaults. The tobacco‑style framing also makes it easier for the public to see social‑media addiction not as a personal‑willpower issue, but as a product‑design and public‑health problem demanding legal and policy action, just as tobacco‑related harm once did.