'Defectively Designed'

Snap Is Sued in Sextortion Case After Teen Dies by Suicide Over Shared Photos

Snap’s “reckless disregard for the safety of minor users” allows sexual predators to use its social media platform to target and exploit minors, alleged a negligence complaint Friday (docket 3:24-cv-03068) in U.S. District Court for South Carolina in Columbia.

The plaintiff estate of Timothy Barnett, through his mother, Elizabeth, as personal representative, filed the action for the wrongful death of the 13-year-old, who died by suicide on April 6, 2023, after being the victim of sextortion on Snap, said the complaint.

Snap is “defectively designed” with features that make the platform “unreasonably dangerous” for minors like Timothy, alleged the complaint. Snap failed to implement adequate age verification and other safeguards to protect vulnerable minors from connecting with and being exploited by predators, it said.

As a direct result of Snap’s “unsafe design,” lack of warnings and inadequate parental controls, a sexual predator extorted the teen, threatening to share “sexually explicit images Timothy had been manipulated into sending via Snapchat,” alleged the complaint. “Unable to cope with the trauma, shame and fear of exposure, Timothy took his own life,” it said.

Sexual predators are drawn to social media because it offers easy access to a “large pool of potential victims, many of whom are addicted” to social media, the complaint said. It cited a February 2023 FBI warning about a global “financial sextortion crisis,” which said: “Financial sextortion can happen anywhere, although it mainly occurs on the digital platforms where children are already spending their screen time, like social media and gaming websites, or video chat applications.”

Using fake accounts, predators often pose as girls of a similar age and target young boys to trick them into sending explicit photos or videos, the complaint said. The predator then threatens to release the materials unless the victim sends payment, though in many cases, the predator “will release the images anyway,” it said. Rather than mitigate the risk of sexual exploitation and harm to minors on its platform, Snap “has facilitated and exacerbated it by implementing defective product features that help sexual predators connect with children,” alleged the complaint.

Snap’s unsafe product design includes “flawed age verification, lack of meaningful mechanisms to prevent sham accounts, default-public profiles, matching and recommending connections between adults and minors, promoting unsolicited messages and interactions from adults, and wholly inadequate and ineffective parental controls, among others,” defects “that allow children to be easily identified, targeted, accessed, and exploited,” alleged the complaint.

Worse, Snap routinely fails to report abuse, said the complaint. It cited a DOJ report on child exploitation that said platforms often do nothing in response to reports of “problematic” behavior. Such reports could be valuable, signaling when products are being exploited by offenders or when underage children are accessing inappropriate content, it said.

By failing to implement adequate age and identity verification, Snap “knowingly and foreseeably places children in the pathways of sexual predators, who utilize its product and exploit the defective design features,” the complaint said. The platform’s defective design allows minors to easily enter fake dates of birth to bypass age restrictions; has no measures to prevent users from creating multiple sham accounts under different names and ages; and has an “incomplete implementation” that requires users to provide a date of birth without verifying it, it said. Snap has the technology to identify minors posing as adults and vice versa, “but it does not use this information to identify violative accounts and remove them from their products,” it said.

The defendant’s “Snap” feature allows users to send and receive “disappearing” audiovisual messages that are viewable for as little as a few seconds, the complaint noted. Once the allotted time expires, “the Snap disappears forever,” which reduces teenagers’ “communication apprehension and encourages users to send photos depicting deviant behavior,” such as sexting, cyberbullying, underage alcohol consumption, and illegal drug use, it said.

But disappearing Snaps don’t “operate as advertised,” and recipients can save “or record them at will,” the complaint said. “This is particularly harmful to adolescents, who rely on Snap’s representations when taking and sending photos, and who only learn after the fact that recipients have the means to save photos or videos,” it said: “In many cases, this leads to sexual exploitation.” Snap could, but doesn’t, warn users that Snaps “may not necessarily disappear,” it said.

Timothy was a victim of sextortion in early 2023 when an adult predator posing as a young girl manipulated the Sumter, South Carolina, resident into sending “sexually explicit images and then threatened to share those images unless Timothy met the predator's demands,” the complaint alleged.

Causes of action are strict liability for design defect and failure to warn, plus negligence and negligence per se. The plaintiff seeks more than $10 million in damages for losses suffered by Timothy’s statutory beneficiaries, including emotional distress and grief over the loss of his society, companionship, and support, plus funeral and burial expenses. It also seeks punitive and exemplary damages, attorneys’ fees and costs, and pre- and post-judgment interest.