How's it going America,
Check out our latest episode of The Pod Millennial! I talk to Pop Culture Crisis' Mary Morgan on Timothee Chalamet, looksmaxxing, why fashion sucks now, and Gen Z dating. Our next episode drops Tuesday. Listen, rate (5 stars, of course!), and subscribe!
Let's get into it:
A jury in Los Angeles decided that Meta and YouTube were liable to the tune of $3 million for the distress caused to a young woman who used social media too much. This case opens the door for hundreds more cases like it, where people who just couldn't put down their phones are taking their issue to the courts. The jury heard that the young woman began using social media when she was just 6 years old and was unable to look away. She ended up depressed, anxious, and thinking about self-harm.
She told jurors she wanted to be on social media all the time because she was worried she would miss something. The jury agreed with her and awarded her millions for her pain and suffering. I find it odd that the girl's parents didn't take the phone away from her, didn't leave it in a drawer for months at a time, didn't help her find some other way to occupy herself. As a mom whose son had an iPad far sooner than, in retrospect, I wish he did, I know that it can be hard to take the devices away. But let's be honest, it's not that hard. There's this crazy thing parents can do called saying "no," and as all parents know, it works wonders.
A Meta spokesperson said in a statement following the verdict, "We respectfully disagree with the verdict and are evaluating our legal options." In their defense, the companies brought up the plaintiff's turbulent childhood and other family issues, arguing that she used the apps to escape from and cope with the trauma in her life. The plaintiff's attorney told the jury that the algorithms, endless scrolling, and pop-up recommendations from both platforms were to blame for their client's inability to put down the devices. Attorneys for both Snapchat and TikTok were also in the courtroom, likely trying to figure out how they'd defend their platforms from the same result.
The verdict in LA follows one in New Mexico, where a jury ordered Meta to pay $375 million over failures to protect children from sexual predators on its platforms. In that case, Meta said, "We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."
Do social media platforms know that their algorithms draw users in and make it hard for people to log off? Yes. Do they work to make their algorithms even stickier? Undoubtedly. But it is up to each of us to make determinations for ourselves, to have the wherewithal to look away, to put the phone down. It's not always easy for me to look away from Instagram, but it's not up to Instagram to tell me to put the phone down. It's up to me, and of course, I am capable of doing it. In the LA case, the issue is that this girl did not have the tools to do that. But it's not up to Meta or YouTube to give her those tools. Those platforms didn't give her a phone or access to their apps. Her parents did that.
Libby
