Nearly 35 years ago, CBS Records prevailed against a family alleging that Ozzy Osbourne’s music caused a teenager to kill himself. This week, Netflix invoked that now famous case (McCollum v. CBS) in an attempt to similarly beat a lawsuit over 13 Reasons Why, which one grieving family blames for the suicide of their teen daughter “Bella.” Is it time to re-examine that precedent? Content may now be pushed algorithmically to entertainment fans, but according to Netflix, that’s no reason to depart from First Amendment doctrine.
In court papers, Netflix says that from Romeo and Juliet to Dead Poets Society, teen suicide has long been explored in literature and movies, and the streamer argues that creators must retain the right to explore such themes.
“Creators obligated to shield certain viewers from expressive works depicting suicide would inevitably censor themselves to avoid the threat of liability,” argue Netflix’s lawyers at Munger Tolles & Olson. “This would dampen the vigor and limit the variety of public debate. In such a landscape, a long line of creative works—from classics like Anna Karenina, Antigone, The Awakening, Madame Bovary, and The Bell Jar, to countless modern works like Dear Evan Hansen, The Perks of Being a Wallflower, Wristcutters: A Love Story, and The Virgin Suicides—would be at risk. The First Amendment does not permit such a result.”
When 13 Reasons Why, based on a young-adult novel by Jay Asher, came out in 2017, it was a mini-sensation. The teen drama, which slowly unpacks the suicide of a main character, also took heat for the graphic death scene in one of the final episodes of the first season. The show was blamed for triggering several real-life suicides, and while hardly admitting culpability, Netflix eventually cut the controversial scene.
Netflix is still in court over what happened, though, and if there’s one thing that is arguably different about this case from the one over Ozzy Osbourne’s music decades ago, it’s the prominence of a recommendation algorithm.
“Netflix is not being sued because it created a Show of questionable morality that arguably glorifies teenage suicide,” states a complaint in California federal court. “It is not being sued for any dissemination of, i.e., public broadcast of, the Show or for the offering of the Show for public consumption…Rather, the bases of the claims against Netflix stem from something else: (1) Netflix’s failure to adequately warn of its Show’s, i.e., its product’s, dangerous features and (2) Netflix’s use of its trove of individualized data about its users to specifically target vulnerable children and manipulate them into watching content that was deeply harmful to them—despite dire warnings about the likely and foreseeable consequences to such children.”
In a motion to strike under California’s anti-SLAPP statute, Netflix addresses both theories.
With respect to the allegation that Netflix should have ensured that 13 Reasons Why wasn’t being targeted at “the most vulnerable members in society,” the streamer suggests there’s not much difference between an algorithm and a news editor.
“The recommendations system, and the display of suggested titles, is speech,” states a dismissal motion. “The recommendations fall within the well-recognized right to exercise ‘editorial control and judgment.’ Plaintiffs allege that the recommendations here are different because they are dictated by an algorithm. But the fact that the recommendations ‘may be produced algorithmically’ makes no difference to the analysis. After all, the algorithms themselves were written by human beings, and they ‘inherently incorporate… engineers’ judgments…’ The recommendations generated are ‘much like many other familiar editorial judgments,’ such as the guidebook writer’s judgments about which attractions to mention and how to display them, and Matt Drudge’s judgments about which stories to link and how prominently to feature them.”
Read the full brief, which also gets into side topics like why Netflix’s recommendation to watch 13 Reasons Why shouldn’t be considered unprotected incitement.
UPDATE: A Netflix spokesperson gave us this comment: “Our hearts go out to this family, who have suffered a terrible loss. But, their counsel’s description of Netflix’s recommendations system is not accurate. Netflix suggests shows to our members solely based on what they watch. Netflix doesn’t collect data such as age or gender from our members when they sign up, and there is no audience targeting based on personal information or those characteristics.”