A number of school districts nationwide are reportedly suing social media giants Instagram, Snapchat, TikTok, and YouTube in an effort to counter the nation’s mental health crisis among youths.
Seattle Public Schools was among the first wave of districts to file lawsuits in early January.
In its lawsuit, SPS claims that social media companies “exploit the same neural circuitry as gambling and recreational drugs to keep consumers using their products as much as possible,” and that social media is used by 90% of youths, ages 13-17.
Seattle Public Schools highlighted one study showing that users check Snapchat 30 times a day, and that nearly 20% of teens use YouTube “almost constantly,” according to The Washington Post.
SPS also points to the harm suffered by youths “who say they cannot stop or control their anxiety, who feel so sad and hopeless, they stop doing the activities they used to love, who are considering suicide.”
San Mateo County, which oversees 23 school districts in northern California, recently filed a 107-page complaint in federal court, alleging that social media companies were using advanced artificial intelligence and machine-learning technology to create addictive platforms for young people.
“The results have been disastrous,” the San Mateo filing asserts. “There is simply no historic analog to the crisis the nation’s youth are now facing.”
As a supplement, San Mateo County also referenced data from the Centers for Disease Control and Prevention (CDC) which chronicled the rising rates of suicidal thoughts and depressive symptoms among high school students.
School administrators have noticed a spike in mental health emergencies during the school day, according to Nancy Magee, San Mateo County superintendent of schools. There has also been a rise in “very serious” cyberbullying incidents related to social media.
“The social media companies create the platforms and the tools but the impacts are felt by schools, and I would really like to see an understanding of that,” said Magee. “And then that the education community receives the resources in both people and tools to help support students adequately.”
Similar lawsuits against social media companies have been filed in Pennsylvania, New Jersey, and Florida, and according to the Post, more legal action is expected in the coming weeks.
According to the Post, TikTok, YouTube, Instagram, and Snapchat provided written statements about how each company prioritizes teen safety among users.
TikTok “cited age-restricted features, with limits on direct messaging and livestreams, as well as private accounts by default for younger teens. It also pointed to limits on nighttime notifications; parental controls, called Family Pairing, that allow parents to control content, privacy and screen time; and expert resources, including suicide prevention and eating disorder helplines, directly reachable from the app.”
The Google-owned YouTube has a “Family Link” setting for parents to limit their children’s screen time and potentially block certain types of content on supervised devices.
Also, according to spokesperson José Castañeda, YouTube has built-in protections for users 17 and younger.
Meta, which owns Instagram, has age-verification technology and notifications prompting users to take regular breaks.
“We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99 percent of it before it’s reported to us,” said Antigone Davis, Global Head of Safety of Meta.
And Snapchat said its platform “curates content from known creators and publishers and uses human moderation to review user generated content before it can reach a large audience,” according to a company spokesperson.
© 2023 Newsmax. All rights reserved.