Since Congress passed the Communications Decency Act (CDA) in 1996, courts have interpreted Section 230 of the CDA as providing internet companies with incredibly broad immunity from liability for hosting user-generated content. Section 230 states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Time after time, internet companies have successfully used Section 230 as grounds to get claims dismissed in the early stages of litigation, typically before discovery. While many plaintiffs have been unsuccessful in circumventing Section 230’s immunity provisions, two recent cases have succeeded by employing a new theory: product liability. By drawing a distinction between a company’s harmful product design and its user-generated content, courts have started to chip away at the longstanding liability shield.
This theory first gained traction in Lemmon v. Snap, Inc., where the Ninth Circuit Court of Appeals held that Section 230 did not immunize Snap from liability. In Lemmon, the surviving parents of two boys filed a wrongful death action against Snap for its negligent design. Mere minutes before the boys’ car careened off the road into a tree and became engulfed in flames, the boys used Snapchat’s speed filter to document their speed—over 100 miles per hour. At the time of the accident, Snapchat rewarded users with trophies for various achievements but never disclosed how to earn them, incentivizing users to experiment with Snapchat’s features to see what would yield a trophy. Consequently, many users suspected that there was a trophy associated with recording a snap with the speed filter at a speed of 100 miles per hour or faster.
The district court granted Snap’s motion to dismiss solely on the basis of Section 230 immunity. In determining whether the claims were barred, the Ninth Circuit employed a three-prong test: liability is precluded for “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat … as a publisher or speaker (3) of information provided by another information content provider.” The court concluded that the second prong was not satisfied because the claims treated Snap as a products manufacturer with a “duty to exercise due care in supplying products that do not present an unreasonable risk of injury or harm to the public,” and not as a publisher or speaker. It was immaterial that one of the boys had sent a snap using the speed filter. Instead, the claim merely “seek[s] to hold Snapchat liable for its own conduct, principally for the creation of the Speed Filter.”
In A.M. v. Omegle.com, LLC, which cites Lemmon, the United States District Court for the District of Oregon held that Section 230 did not immunize Omegle from liability. In A.M., Omegle, a virtual one-on-one chat service that randomly pairs users, matched A.M., an eleven-year-old, with Ryan Fordyce, a man in his late thirties. A.M. was struggling with teenage insecurities at the time, and Fordyce promised to help her feel better. By the end of this conversation, Fordyce had persuaded A.M. to share her contact information. Later that night, Fordyce “strategically gained A.M.’s trust and induced A.M. to send him photos of herself.” He told her it was integral to her “healing” to trust him even if she felt uncomfortable. Over the following three years, Fordyce forced A.M. to send pornographic content, perform for him and his friends, and recruit other minors for him to sexually abuse. He threatened to leak A.M.’s photos and videos if she reported him to law enforcement or stopped complying with his demands.
In 2021, A.M. brought a civil action against Omegle for pairing her with Fordyce. Omegle moved to dismiss the case, claiming it was immune from suit under Section 230’s immunity provision. The court held that the product liability claims fell outside the scope of Section 230 immunity. As in Lemmon, where the claim rested on Snap’s own acts rather than third-party content, A.M.’s claim turns on Omegle’s design and warnings, not on Fordyce’s communications to the plaintiff. The court concluded that “Omegle could have satisfied its alleged obligation to Plaintiff by designing its product differently—for example, by designing a product so that it did not match minors and adults.”
If you have been the victim of harm facilitated by an internet company, contact our team of licensed, caring professionals today to learn about your legal rights. Call today for a free, confidential consultation at: 1-888-407-0224 or use our confidential submission form. We will treat you with dignity and respect.
You are not alone. We are here to help.