Have we finally reached the 'Big Tobacco moment' for Big Tech?

Finally, a legal argument has pierced the protective shield Big Tech has been given by the three-decade-old Section 230 of the Communications Decency Act (1996), writes Alex Cooney
A court ruled yesterday that Meta's Instagram and Google's YouTube were deliberately designed to be addictive and that their design caused serious harm to a young girl. Photo: Vincent Feuray/Hans Lucas via AFP

There have been a few moments over the past 10 years of working in online safety for children that might well have been described as "tobacco moments".

Some point to the Cambridge Analytica scandal coming to light in 2018, which exposed massive misuse of personal data for political gain. 

Others to the 2022 coroner's inquest into the death of 14-year-old Molly Russell, who died by suicide in 2017, which represented the first time a coroner in the UK legally concluded that social media content contributed to a child's death — explicitly naming Pinterest and Instagram in the findings.

Or the whistleblower revelations — Frances Haugen in 2021 and Arturo Béjar in 2023 — which showed that Meta not only knew their platforms were harming children, but in some cases actively suppressed their own internal research rather than act on it.

At a minimum these were shocking moments in the digital age, yet they were not enough to shift the dial in terms of changing these companies' behaviour and practices.

But Thursday's ruling in Los Angeles — that Meta's Instagram and Google's YouTube were deliberately designed to be addictive and that their design caused serious harm to a young girl — is an historic outcome, and one that could dramatically shape the resolution of the thousands of similar lawsuits to come. 

Communications Decency Act (1996)

Perhaps this is the moment. Why is it so momentous?

In the US context it matters because it pierces the protective shield provided by the three-decade-old Section 230 of the Communications Decency Act (1996). 

In the years since that law passed Big Tech has consistently fallen back on the defence that it is merely a hosting provider — not a publisher — and therefore cannot be held responsible for content posted on its platforms. 

The analogy often made is with a phone company: it provides the lines and the means of connection, but bears no responsibility for the conversations carried over them.

Thirty years ago, when that law was passed, you might reasonably say that was the case. 

The internet looked very different then — this was pre-Google, pre-Facebook, pre-Snapchat and TikTok. 

It was certainly before the engagement-based model that has become the bedrock of how Big Tech designs and monetises its platforms.

Which brings me back to yesterday's ruling. 

By focusing on how these platforms were deliberately designed to maximise engagement and addiction — rather than on the content they host — the plaintiffs' lawyers found a way around the shield that Big Tech has so consistently used to evade responsibility. 

That changes everything. It also has implications in terms of the precedent set for the many upcoming cases that are still to play out in law courts across the US.

Digital Services Act

Europe, thankfully, is not sitting idle — although the pace of change has been frustratingly slow. 

Europe had its own equivalent of Section 230: the E-Commerce Directive (2000), which broadly accepted the idea that platforms are hosts rather than publishers.

But Europe has moved more quickly than the US to update that premise, putting in place the Digital Services Act (DSA), which came into force for the largest platforms in 2023. 

It requires major platforms to assess the risks posed to minors, restrict targeted advertising to children, and remove illegal content once notified — a meaningful step forward from the passive hosting model that preceded it.

Proving breaches and actually getting companies to pay fines imposed is, however, another matter entirely. Big Tech has effectively endless resources with which to fight regulatory outcomes, as it has proven time and again. 

TikTok and Snap Inc. chose to settle out of court in the US rather than face the publicity and scrutiny of full investigations into their internal practices. File photo: AP/Damian Dovarganes

Ireland's Data Protection Commission — which, by virtue of where these companies have chosen to locate their European headquarters, has carried out numerous investigations into data breaches and harms to children — has issued substantial fines, including the largest GDPR fine in history against Meta. 

But almost all are being challenged through the courts, meaning the actual payment of any fines, and more importantly any meaningful reform of practices, remains frustratingly distant.

Back in the US, both TikTok and Snap Inc. — whose products are enormously popular with children — chose to settle out of court rather than face the publicity and scrutiny of full investigations into their internal practices. 

That tells its own story.

But neither will escape scrutiny for long. The European Commission recently published preliminary findings concluding that TikTok is designed to be addictive — findings TikTok has of course immediately announced it will challenge. 

Encouraging signs

Also on Thursday, the EC announced a formal investigation into Snapchat to determine whether it has breached the DSA by exposing minors to grooming attempts, criminal recruitment, and access to illegal products including drugs. 

Sanctions under the DSA include substantial fines and, in the most serious cases, an outright ban of the platform.

The momentum we are seeing grow is genuinely encouraging — not just for those of us who have spent years working in child safety, but for society as a whole. 

It is completely unacceptable that these platforms have profited so substantially, for so long, from products that harm children — with so little accountability.

We know they have the resources to fight every ruling, across every jurisdiction, for as long as it takes. 

But the harm has now been proven — in courtrooms, in coroners' courts, in regulatory investigations. 

The question is no longer whether these platforms have caused harm; it is simply how long they can delay being held responsible for it. 

Children are living their lives online as much as offline. They deserve safety and protection wherever they are. That is not too much to ask.


© Examiner Echo Group Limited