'Defective Product Features'

S.C. Mom Alleges Snap Was Negligent in Sextortion That Prompted Her Teen's Suicide

Snap’s platform is “designed in unsafe ways,” alleged a survival action complaint (docket 4:24-cv-03521) filed in U.S. District Court for South Carolina in Columbia by the mother of a 13-year-old Sumter, South Carolina, boy who died by suicide.

Elizabeth Ann Barnett, as personal representative for her late son, Timothy, alleges he was a victim of “sexual exploitation” on the platform that Snap knowingly facilitated. He was a sextortion victim on Snapchat when an adult predator posing as a young girl manipulated him into sending sexually explicit images and then “threatened to share those images unless Timothy met the predator’s demands,” Barnett alleged.

“Overwhelmed by shame, trauma, and fear that the images would be publicly exposed, Timothy committed suicide on April 6, 2023,” the complaint said.

Snap’s “defective product features have benefited sexual predators,” who can send a “disappearing” audiovisual message, called a Snap, that “disappears forever” after a designated time period, the complaint alleged. The limited display time is intended to reduce “teenagers’ communication apprehension,” but also encourages “users to send photos depicting deviant behavior,” it alleged. Sexting, cyberbullying, underage alcohol consumption and illicit use of narcotics are also commonly the subject of Snaps, the complaint said.

Snaps “do not operate as advertised,” because recipients can “save or record them at will,” alleged the complaint. That is “particularly harmful to adolescents, who rely on Snap’s representations when taking and sending photos, and who learn only after the fact that recipients have the means to save photos or videos,” it said: “In many cases, this leads to sexual exploitation.”

The defendant could, but doesn’t, warn users that Snaps may not necessarily disappear, the complaint alleged. For young users, Snaps are especially dangerous because the company’s parental controls “are ill-equipped to mitigate the risks” they pose, it said. But even with parental controls, parents are unable to view a Snap’s content, so they can’t adequately protect their children or deter them from engaging in “dangerous behavior in conjunction with sending Snaps,” it alleged.

Predators leverage Snap’s “disappearing” technology “to assure young users that there is no risk to them sending a sexual photo or video,” the complaint alleged. “Trusting young users are then horrified to discover that these videos have been captured by predators and then circulated to their own friends and contacts or other sexual predators,” it said.

Another “defective” Snap feature, “My Eyes Only,” allows users to hide harmful content from their parents under a special tab that requires a passcode for access, the complaint alleged.

Content in My Eyes Only “self-destructs” if a user tries to access the folder with the wrong code, the complaint said. The feature “has no practical purpose or use other than to hide potentially dangerous content from parents and/or legal owners of the devices used to access Snapchat,” the complaint alleged. The information and evidence found in My Eyes Only “should be in Snap’s possession and control,” it said, but the company has designed the feature “in a way that causes the permanent loss of relevant, material, and incriminating evidence.” My Eyes Only content is unrecoverable, even by Snap, it said. The company designed the feature “knowing it would likely be used to store potentially illegal and injurious photos and images like sexts,” it alleged.

In “severe” cases, young users such as Timothy “find themselves in the nightmarish scheme known as 'sextortion,’ where a predator threatens to circulate the sexual images of the minor unless the predator is paid to keep the images under wraps,” the complaint said. As a direct and foreseeable consequence of Snap’s “connecting children to sexual predators,” its product “facilitates and increases the risk of sexual exploitation, sexual abuse, and sextortion of children,” it alleged.

Barnett asserts claims of strict liability for design defect and failure to warn, plus negligence and negligence per se. She seeks compensatory damages of more than $10 million, in an amount to be determined at trial, for Timothy’s “pre-death pain and suffering, emotional distress, and loss of enjoyment of life”; punitive and exemplary damages for Snap’s “willful, wanton and reckless conduct”; attorneys’ fees and costs; and pre- and post-judgment interest. Snap didn't comment Wednesday.