Instagram CEO denies “clinical addiction” to social media in landmark youth trial

Last update: February 13
  • Adam Mosseri, the head of Instagram, rejects the term "clinical addiction" and instead uses the concept of "problematic use".
  • The civil trial in Los Angeles is examining whether Meta and Google designed Instagram and YouTube to deliberately ensnare children and teenagers.
  • The case of young Kaley GM serves as a reference for more than a thousand similar lawsuits in the U.S.
  • The plaintiffs focus on algorithms, infinite scrolling, and beauty filters, while the defense insists that the safety of minors is its priority.


The head of Instagram, Adam Mosseri, has become the center of the global debate on the impact of social media on the mental health of young people after testifying in a civil trial in Los Angeles. During his testimony, the executive denied that one could speak of “clinical addiction” to platforms like Instagram or YouTube, although he admitted that there is what he calls a “problematic use” of these digital services.

The legal process, followed closely from Europe and Spain because of its potential regulatory impact, aims to clarify whether the products of major technology companies like Meta and Google were designed with mechanisms specifically intended to keep children and adolescents engaged for as long as possible, with the consequent impact on their psychological well-being.

A landmark trial against Meta and Google over the design of their platforms


The case is being heard before a civil jury in Los Angeles, California, and targets Meta, the parent company of Instagram and Facebook, as well as YouTube, owned by Google. The key question is whether the companies deliberately built platforms that are addictive for minors, turning features like infinite scrolling, notifications, and automatic recommendations into genuine "hooks" to increase usage time.

The lawsuit is based on the experience of Kaley GM, a 20-year-old Californian woman who started using YouTube at age six and opened an Instagram account around age 9-11, below the official minimum age of 13. According to her, this early and intensive contact with various social networks (later including Snapchat and TikTok) contributed to severe psychological damage, with symptoms of anxiety, depression, and a strong dependence on digital interaction.

This case is considered a bellwether for more than a thousand lawsuits related to the effects of social media on young users in the United States. The outcome could guide future class action lawsuits and, indirectly, influence the regulatory debate already underway in the European Union around rules such as the Digital Services Act, as previous cases against Meta have shown.

Before the jury, the plaintiffs' lawyers described Instagram and YouTube as “traps” carefully designed to exploit the emotional vulnerability of teenagers. Meta and Google's defense, for its part, denies any intention to create addiction and argues that these are communication and entertainment platforms comparable, in part, to video-on-demand services like Netflix.

“Clinical addiction” versus “problematic use”: the nuance defended by Mosseri

During the questioning, Adam Mosseri insisted that it is essential to differentiate between a clinically recognized pathology and the behavior of a person who spends too much time on an app. In his opinion, the term “clinical addiction” is not appropriate to describe what happens to most social media users, although he admits that some usage causes discomfort and negative consequences.


The plaintiffs' attorney, Mark Lanier, asked him directly whether he thought Instagram could make its users “clinically addicted.” Mosseri replied no: he explained that he had once said he was “hooked” on a Netflix series he watched late into the night, but that this does not amount to addiction in the medical sense. He also acknowledged that, in the past, he had used the word "addict" in a way that was “too frivolous.”

Lanier reminded him that he is neither a doctor nor a psychologist and questioned his authority to pronounce on clinical diagnoses. Mosseri replied that he never intended to diagnose addiction and that his argument is limited to pointing out that, from a scientific point of view, it would be more rigorous to speak of “problematic use” rather than a recognized addictive disorder.

This semantic nuance has fundamental implications: if the jury accepts the idea that it is “only” excessive use, the obligations and legal responsibility of the platforms could be limited. If, on the other hand, it considers it proven that the product design generates something comparable to clinical dependency, the scenario for social media would be much more complicated, both in the United States and in other regions, including Europe.

The plaintiffs' case: algorithms, infinite scrolling, and beauty filters

Beyond the debate over medical terminology, the lawsuit focuses on the technical design of the platforms. Kaley's team maintains that features like infinite scrolling, automatic video playback, the "like" button, and recommendation systems personalize content in ways that encourage compulsive checking of mobile phones, especially among teenagers seeking social validation.

According to the plaintiffs, these mechanisms act as “dopamine dispensers” by offering immediate rewards (new videos, notifications, reactions) every time the user interacts with the app. In young people whose brains are still developing, they add, this type of stimulation can facilitate repetitive behaviors that are difficult to control and aggravate pre-existing self-esteem and mental health problems.

A significant chapter of the trial revolves around Instagram's beauty filters and face-editing effects. Internal Meta documents cited by the plaintiffs show discussions within the company about whether to ban effects that drastically alter the appearance of the face or simulate the results of cosmetic surgery. In some emails, experts consulted by the company warned almost unanimously of the potential harm of these filters to young girls.

Initially, Instagram opted to globally ban effects that distorted the face. The policy was later revised, however, allowing a group of filters that modify physical features while maintaining the specific block on those promoting cosmetic procedures. Mosseri explained to the jury that this change aimed to focus on content considered “more problematic,” although Kaley's lawyers suggest there were also commercial pressures, including the fear of losing competitiveness in certain international markets.


Meta and Instagram's defense: child safety and long-term effects

From Meta's side, the defense rests on two main ideas: first, that Instagram is not designed to cause harm; and second, that the company has been building out a wide catalog of protection tools for teenagers. Mosseri, who has led Instagram since 2018, said the company wants children to be safe not only out of social responsibility, but also because protecting them is good for business in the long term.

The executive stressed that decisions prioritizing profit over user well-being can be “very problematic” for the company over time. He also reiterated the idea, already expressed in appearances before the US Senate, that he favors greater regulation of online safety, while continuing to emphasize that the company is working to make the platform as safe as possible even when families do not use the available parental control tools.

Among the measures introduced in recent years, Mosseri cited the so-called “teen accounts,” which automatically activate content restrictions and stricter privacy settings for young users. Systems to limit exposure to certain sensitive topics have also been strengthened, and the company has begun rolling out age verification using artificial intelligence, aimed at detecting minors who register with false birth dates, although the company itself admits its accuracy is limited.

In this context, Meta insists that the plaintiff's mental health difficulties cannot be attributed solely to Instagram. The defense has presented statements from professionals who worked with Kaley, as well as information about prior family problems, in order to show that there were other determining factors in her psychological development and that the social network was not the "substantial factor" alleged by the plaintiffs.

Mental health, minors and responsibility: a debate that also points to Europe

Kaley's case is part of a wave of lawsuits against social media in the United States, filed by families who say they have watched their children develop eating disorders, severe depression, self-harm, and even suicide attempts following intensive use of these platforms. Some parents have gathered outside the courthouse since dawn, braving the rain, to follow each session of the trial closely.

Attention to the case extends beyond the United States: in Spain and the rest of the EU, the debate on the responsibility of large technology companies in protecting minors is wide open. The European Digital Services Act already obliges large platforms to assess and mitigate systemic risks, including those related to the mental health of children and adolescents, and demands greater transparency about the algorithms that determine what each user sees.


Parents' organizations, consumer associations, and European mental health experts view this trial as a potential inflection point. A verdict finding that a deliberate strategy to encourage addictive behavior has been proven could strengthen the case for stricter restrictions on interface design, notifications, and algorithmic personalization in social networks used massively by minors.

At the same time, Meta and Google's defense relies on existing legal frameworks, such as the well-known Section 230 in the United States, which limits platforms' direct liability for the content their users post. While this protection does not cover product design, it does restrict the scope of arguments focused exclusively on the material Kaley was exposed to, which is why the judge has asked that the specific content viewed by the plaintiff not be examined in detail.

What's at stake for the future of social media

The trial is expected to last several weeks, with key dates including the appearances of Meta founder Mark Zuckerberg and YouTube CEO Neal Mohan, who will have to answer questions about how product decisions are made and how much weight growth and advertising revenue carry against the possible consequences for minors.

Meanwhile, companies like TikTok and Snapchat, which had also been implicated in similar cases, have opted for confidential pre-trial settlements. This move reinforces the idea that the industry is well aware of the growing scrutiny surrounding screen addiction and the design of its services.

For the technology sector, Kaley's case functions as a barometer of the degree of social and legal tolerance toward business models based on capturing and retaining attention. In markets like Europe, where regulations on data use and the protection of minors are increasingly stringent, a strong ruling against Meta and Google could accelerate legislative initiatives aimed at limiting persuasive design techniques or requiring mental health impact assessments.

Ultimately, Mosseri's testimony, denying the existence of "clinical addiction" while accepting that there is problematic use that can harm some young people, sums up the tension running through the entire industry: social media has become a central part of everyday life, especially among teenagers, and the challenge now is to determine to what extent those responsible for its design should be held legally accountable when that digital experience becomes unhealthy or uncontrollable for the most vulnerable.
