
Snapchat, Google and Apple defeat claims they help sexual predators find victims

LEGAL NEWSLINE

Saturday, November 23, 2024


Federal Court

SAN DIEGO (Legal Newsline) - Federal law protecting online platforms from liability over what users post has blocked young girls' claims that apps like Snapchat are dangerous because they help sexual predators locate victims.

San Diego federal judge Larry Burns on June 5 dismissed lawsuits against Snap, Apple and Google brought by a trio of girls known only by their initials. Plaintiff L.W. says from age 12 to 16, she was groomed and abused by a man on Snapchat after they met on Instagram.

Snapchat fosters a sense of impunity for predators, while Apple and Google steer users to an app called Chitter that also featured in L.W.'s abuse, the suit says.

The defendants invoked Section 230 of the Communications Decency Act, which says "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

Plaintiff lawyers attempted to work around that immunity, arguing the defendants materially contributed to the wrongful conduct that harmed their clients. They argued in part that Snapchat's Quick Add feature allows predators to identify and connect with children based on geographic location and mutual interests.

The Ninth Circuit had said Roommates.com was not entitled to Section 230 immunity because it was specifically designed to match potential roommates based on sex, family status and sexual orientation and, therefore, encouraged users to post content that violated fair housing laws.

Citing other rulings in Snapchat's favor, though, Burns said the flaws alleged in its design "in essence, seek to impose liability on [Snap] based on how well [Snap] has designed its platform to prevent the posting of third-party content containing child pornography and to remove that content after it is posted."

Claims against Apple and Google failed for similar reasons, Burns wrote. Nor could the companies be held liable under federal law governing child sexual abuse material, because they did not post the material themselves.

L.W.'s lawsuit says she tried to block a man known as B.P. multiple times, but he would contact her through Instagram or with a fake Snapchat account. B.P. used Snapchat to sexually abuse others as well, the suit says, from his location at Marine Corps Base Camp Pendleton.

"B.P. would ridicule and berate her if L.W. refused and would compliment her when she would comply," the lawsuit says. "B.P. first tasked L.W. for photos and videos in her underwear, then photos in the shower, and eventually photos and videos of L.W. depicting L.W.'s face and body, as well as exposed breasts and vaginal area.

"The videos include L.W. masturbating and penetrating her vagina with foreign objects at B.P.'s instructions and requests." 
