There are thousands of legal cases against social media platforms, but legal experts are watching three landmark cases. The first bellwether case ended on March 24 in a loss for Meta, the parent company of Facebook and Instagram, when a jury in New Mexico ordered the Silicon Valley giant to pay $375 million for violations of the state’s consumer protection laws. The next watershed case, on March 25, resulted in a Los Angeles jury finding Meta and YouTube “guilty on all charges” and awarding $6 million to a woman who claimed that she became addicted to social media as a child, which led to suicidal thoughts, depression, and other mental health problems.
A third closely watched case will go to trial in April. A school district in Kentucky is seeking compensation for disruptions in classrooms, and is also demanding an injunction against targeting features such as endless app notifications and the auto-playing of videos. The school district is further insisting on safeguards including age verification, parental controls, and similar features. More than 250 school districts across the country have brought legal cases against social media platforms. After years of frustration, during which increased scrutiny produced little action or accountability from the social media giants, safety advocates are now more optimistic about concrete changes. Seeing the writing on the wall, insurance companies are now suing Meta in order to get out of contracts that would require them to cover Meta's legal liabilities.
After the $375 million verdict against Meta on March 24, New Mexico Attorney General Raúl Torrez stated: “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.” In that case, investigators from the New Mexico Department of Justice went undercover on Facebook to gather evidence that the platform’s algorithms lack any protections for young users, exposing them to would-be predators. The state alleged in court filings that “Meta’s platforms Facebook and Instagram are a breeding ground for predators.” Before going to trial, Attorney General Torrez wrote to Meta chief executive Mark Zuckerberg requesting simple changes, including age verification, steps to address risks created by encrypted chat features, changes to algorithms, and the removal of “bad actors,” but Meta failed to act.
The Los Angeles verdict on March 25 may carry a smaller payout, but it could represent a bigger liability for the social media giants, because it addresses both the design of the platforms and manipulative behavior modification, not just misbehavior by users. The woman, now 20 years old, testified that she had become addicted to social media and found it easier to endure the cyberbullying than to leave the platforms. She did not tell her parents, out of fear that they would take away her smartphone.
Social media giants long assumed that a provision of federal law known as Section 230 gave them blanket immunity from liability for the actions of bad actors on their platforms. This has been the core of their legal argument, but for the first time it has been successfully challenged. The Silicon Valley giants have announced that they intend to appeal all of these verdicts.