European Union regulators on Thursday opened an investigation into American tech giant Meta over the potentially addictive effects of Instagram and Facebook on children, a far-reaching move that touches on the fundamentals of the company's product design.
The European Commission, the 27-nation bloc's executive arm, said in a statement that Meta's products could “exploit the weaknesses and inexperience of minors” and create behavioral dependence that threatens their mental health. EU regulators could ultimately fine Meta up to 6% of its global revenue, which was $135 billion last year, and force other product changes.
The investigation is part of efforts by governments around the world to rein in services like Instagram and TikTok in order to protect minors. Meta has long faced criticism that its products and recommendation algorithms are tuned to appeal to children. In October, 30 US states sued Meta, accusing it of violating consumer protection laws and using “psychologically manipulative product features” to lure children.
EU regulators said they were in contact with U.S. regulators about the investigation announced Thursday. They said Meta may be in violation of the Digital Services Act, a law approved in 2022 that requires major online services to more aggressively police illegal content on their platforms and implement policies to reduce risks to children. Meta's policies bar anyone under the age of 13 from signing up for an account, and EU regulators said they would scrutinize the company's age verification tools as part of their investigation.
Thierry Breton, the EU's internal market commissioner, who is overseeing the investigation, said in a statement: “We will now investigate in depth the potential addictive and 'rabbit hole' effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems. We are sparing no effort to protect our children.”
Meta said Thursday that its social media services are safe for young people, citing a feature that allows parents and children to set time limits on Instagram and Facebook. Teen accounts also default to more restrictive content and recommendation settings, and advertisers are prohibited from showing targeted ads to underage users based on their activity on Meta's apps.
“We want young people to have safe, age-appropriate experiences online, and have spent a decade developing more than 50 tools and policies designed to protect them,” Meta said in a statement. “This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
EU officials did not give a timeline for how long the investigation would take. But with the formal investigation launched on Thursday, regulators have been given broad powers to gather evidence from Meta, including sending legal information requests, interviewing company executives and conducting inspections of its headquarters. Instagram and Facebook will be investigated separately.
Since the Digital Services Act came into force, EU regulators have targeted a number of companies. Last month, TikTok suspended a version of its app in the European Union after authorities questioned “addictive” features that allowed users to earn rewards such as gift cards by watching videos, liking content, or following certain creators.
Meta is facing a separate investigation related to political advertising, while X, the social media site owned by Elon Musk, is under investigation over content moderation, risk management, and advertising transparency.