Lee Posted March 25

A US jury has found Meta Platforms violated state law in a lawsuit brought by New Mexico's attorney-general, who accused the company of misleading users about the safety of Facebook, Instagram and WhatsApp and of enabling child sexual exploitation on those platforms.

After deliberating for less than a day, the jury found that Meta violated New Mexico's consumer protection law and ordered the company to pay $US375 million ($538 million) in civil penalties. The verdict marks the first time a jury has ruled on such claims against Meta, as the company faces a wave of lawsuits over how its platforms affect young people's mental health.

"We respectfully disagree with the verdict and will appeal," a Meta spokesperson said in a statement. "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content."

In a statement, New Mexico Attorney-General Raúl Torrez, a Democrat, called the verdict "a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety".

"The substantial damages the jury ordered Meta to pay should send a clear message to big tech executives that no company is beyond the reach of the law," he said.

In a second phase of the trial in May, Mr Torrez said his office would ask the court to order Meta to make changes to its platforms to protect children and to impose additional financial penalties.

Meta shares were up 0.8 per cent in after-hours trade following the verdict. The state had asked the jury to award more than $US2 billion in damages.

Meta faces broad challenge related to youth mental health

The jury's decision capped a six-week trial in Santa Fe. Mr Torrez had accused the company of allowing predators unfettered access to underage users and connecting them with victims, often leading to real-world abuse and human trafficking.
"Over the course of a decade, Meta has failed over and over again to act honestly and transparently," Linda Singer, an attorney for the state, told the jury during closing arguments on Monday, local time. "It's failed to act to protect young people in this state."

Meta denied the allegations, saying it had extensive safeguards in place to protect younger users.

"What the evidence shows is Meta's robust disclosures and tireless efforts to prevent harmful content. And these disclosures mean that Meta did not knowingly and intentionally lie to the public," Kevin Huff, a lawyer for Meta, told the jury on Monday.

Meta has come under increasing scrutiny in recent years over its handling of child and teen safety, spurred in part by whistleblower testimony before Congress in 2021 that alleged the company knew its products could be harmful but refused to act.

Separately, Meta is facing thousands of lawsuits accusing it and other social media companies of intentionally designing their products to be addictive to young people, leading to a nationwide mental health crisis.

New Mexico's investigation

The New Mexico lawsuit grew out of an undercover operation that Mr Torrez, a former prosecutor, and his office ran in 2023. As part of the case, investigators created accounts on Facebook and Instagram posing as users younger than 14. The accounts received sexually explicit material and were contacted by adults seeking similar content, leading to criminal charges against multiple individuals, according to Mr Torrez's office.

The state says Meta told the public Instagram, Facebook and WhatsApp were safe for New Mexico teens and children, while hiding the truth about how much dangerous and harmful content the company hosts. According to the state, internal company documents acknowledged problems with sexual exploitation and mental health harm. Yet the company, the state says, did not institute basic safety tools such as age verification and insisted it was safe.
The state also accused Meta of designing its platforms to maximise engagement despite evidence they were harming children's mental health. Features such as infinite scroll and auto-play videos keep children on the site, fostering addictive behaviour that can lead to depression, anxiety and self-harm, the lawsuit claims.

The jury found that Meta had violated the state's consumer protection law by knowingly engaging in an unfair or deceptive trade practice.

Reuters

https://www.abc.net.au/news/2026-03-25/meta-ordered-to-pay-375m-in-us-trial-child-exploitation-claims/106495616
BrettGC Posted March 25 (edited)

It's well known that Meta employs psychologists and sociologists to assist in the fine tuning of their "addictive" content algorithms. But you'll find both in many marketing teams in any corporation.

Edited March 25 by BrettGC
Possum Posted March 25

$375 million is a rounding error for Meta, who likely figure it is well covered by the increased income addiction provides them.
Forum Support Mike J Posted March 25

11 hours ago, Possum said:
$375 million is a rounding error for Meta who likely figure it is well covered by the increased income addiction provides them

But there are 49 more states.
Forum Support scott h Posted March 25

15 hours ago, Lee said:
will appeal," a Meta spokesperson said in a statement.

Just a mental exercise until the appeal process SLOWLY grinds through. Nothing will change in the foreseeable future.
Forum Support Old55 Posted March 25

1 hour ago, Mike J said:
But there are 49 more states.

This is big news here in the States. A national class action with each individual state taking a cut. I'm with Scott, though: not in our lifetime.
Lee Posted March 26 (Author)

The Social-Media Shakedown Begins

The verdict against Meta and YouTube is a victory for the plaintiffs bar, not for children or society.

A Los Angeles jury on Wednesday held Meta Platforms and Google's YouTube liable for a 20-year-old woman's personal troubles. The schadenfreude will be overwhelming—nail the billionaires! But using a novel product liability theory to shake down companies won't help young people and isn't a good way to make law.

The $6 million verdict against the two companies is the first of more than 3,000 lawsuits pending in California courts that seek to hold social-media companies liable for the travails of young people. School districts and more than 40 state Attorneys General have also sued for damages to compensate for social problems allegedly caused by the platforms.

Section 230 of the 1996 Communications Decency Act protects internet platforms from being held liable for harm caused by user-generated content. But plaintiffs are trying to dodge that law by arguing the platforms were negligent in how they designed their sites. They claim that features like so-called infinite scrolling and "like" buttons—not user posts per se—harm children. Whether this theory trumps Section 230 will be the main issue on appeal, and the platforms have a strong case.

Trial lawyers are also trying to copy the Big Tobacco playbook by arguing that company executives concealed knowledge that their sites are addictive and deleterious to kids. "This case is as easy as ABC," lead trial attorney Mark Lanier told the jury. "Addicting, brains, children," adding companies "didn't just build apps, they built traps."

The reality is that the link between youth mental health and social media is complicated. Take the 20-year-old plaintiff identified by the initials K.G.M. in the Wednesday case. She said she started using YouTube at age six and Instagram when she was nine.
Both require users to be at least 13 years old, so she broke platform rules and bypassed controls. She says her compulsive use of social media made her "feel very depressed" and that unrealistic images she saw on the platforms made her feel insecure about her appearance. But are platforms supposed to prohibit users from posting photos that might make someone feel depressed or insecure? Sorry, Californians, no posting beach photos in December.

She was also exposed to domestic abuse as a young child, which studies show can increase vulnerability to mental illness. Studies show that parenting plays a critical role in mediating and mitigating the impact of social media. Most children who use social media don't experience severe problems.

There's no doubt that increasing teen use of social media and smartphones over the last 15 years has coincided with rising levels of depression, anxiety and other mental illnesses. But it's hard if not impossible to prove that social media caused any given individual's troubles, let alone apportion liability among the platforms. It's also only recently that social scientists and policy makers have homed in on social media's impact on youth mental health.

The evidence presented at trial that executives purposefully designed the platforms to be addictive was weak. The trial lawyers' best argument is that platforms should have done more to limit compulsive teen use. But companies aren't required to design products to prevent abuse or excessive consumption.

A jury in New Mexico on Tuesday nonetheless found Meta liable in a separate case brought by the state AG for $375 million for failing to protect young people from online dangers.

The temptation to find corporate scapegoats for social ills is great. Congress for years has debated legislation to protect teens online, including stronger parental controls and privacy settings. But lawmakers have punted because, well, it's easier to beat up Big Tech.
Some Members also demand that any legislation include a private right of action that would let trial attorneys loot the companies.

***

Which is what these lawsuits are really about. Trial lawyers will now use the L.A. verdict in advertisements to recruit more plaintiffs. They may even use the social-media platforms to advertise. Unemployed? Depressed? Spend your Friday nights scrolling? You could make big money by holding billionaires responsible for your problems.

Trial lawyers and juries may figure that Big Tech companies can afford to pay, but extorting companies is certain to have downstream consequences. Meta and Google are spending hundreds of billions of dollars on artificial intelligence this year, which could have positive social impacts such as accelerating treatments for cancer.

https://www.wsj.com/opinion/social-media-verdict-meta-youtube-california-6b7c05dd?mod=hp_opin_pos_1
Forum Support scott h Posted March 26

1 hour ago, Lee said:
Trial lawyers

It is always about a big payoff for the lawyers. If a company did not have deep pockets, the "victims" would be out in the cold. But I always get back to "Where are the parents?" I guess it is easier to blame someone else than accept personal responsibility.
hk blues Posted March 27

17 hours ago, scott h said:
It is always about a big payoff for the lawyers. If a company did not have deep pockets the "victims" would be out in the cold. But I always get back to "Where are the parents". I guess it is easier to blame someone else than accept personal responsibility.

It's not like the old days, when kids' behavior was visible to their parents. Most kids have their own rooms, devices and so on, so it's become harder and harder for parents to oversee them. Would you have sat on your child's shoulder in days gone by? Because that's what it would take nowadays - I suspect not. I'm anticipating some pushback on this, but it's the reality of life as a parent nowadays.
GeoffH Posted March 27 (edited)

7 hours ago, hk blues said:
It's not like the old days where kids' behavior was visible to their parents - most kids have their own rooms, devices and so on and thus it's become harder and harder for parents to oversee their kids.

That's one of the justifications behind the child social media bans that Australia has enacted (and that other countries are seriously considering).

As for us, we didn't have to ban phone use. The kids had hand-me-down phones and got too rough with them (broke the screens), so we basically said, "Well, you broke it, you're not getting another phone until we have another hand-me-down to give you." The last two hand-me-downs went to the sister and the nephew, so the problem has sort of dealt with itself.

There was a lot of angst at first, but they're playing outside more (good) and watching more TV and YouTube on TV (not so good), and their media consumption is down a lot.

Edited March 27 by GeoffH