Congress passed Section 230 at the dawn of the internet era to protect innovators from traditional publisher tort liability. At the time, the internet consisted primarily of basic message boards and informational pages. Courts have interpreted Section 230 to provide internet platforms with sweeping immunity from liability for third-party content. The statute has aged poorly and is now ill-suited for today’s internet tools. Modern social media platforms are more than message board intermediaries because they actively shape and select the information pushed to users via engineered, engagement-enhancing algorithms. Engagement algorithms are not merely neutral tools; web developers intentionally design them to dynamically learn and feed content to users. Social media companies amplify inflammatory and negative content because it yields the highest profits, resulting in documented harm to users. This harm includes eating disorder content that severely impacts teen girls’ mental health and misinformation that destabilizes democracies.

Lemmon v. Snap reveals a new approach to internet liability that could overcome Section 230’s broad immunity. There, three teenagers tragically perished in a car accident while distracted by Snapchat’s “speed filter” feature. Section 230 did not immunize Snap from liability because the negligent design claim treated Snap as a products manufacturer and not as a publisher or speaker.

This Comment connects previously explored theories of algorithm liability to real precedent by finding a new foothold in Lemmon and using a syllogism to liken algorithms to other liability-prone products. Courts should extend the Lemmon approach and hold social media companies responsible as product manufacturers for the harm their algorithm products cause.
Lemmon Leads The Way To Algorithm Liability: Navigating The Internet Immunity Labyrinth, 50 Pepp. L. Rev., available at https://digitalcommons.pepperdine.edu/plr/vol50/iss4/3