In response to a lawsuit brought by the grieving family of a teenage girl whose suicide they say was inspired by the hit show 13 Reasons Why, Netflix is invoking the First Amendment to argue that, if the complaint succeeds, it would endanger artists' freedom of expression as well as Netflix itself.
In new documents filed in a California district court on Wednesday, Netflix invoked California's anti-SLAPP law, which gives defendants the right to move to strike a complaint targeting content that could be considered protected speech. In its motion, the streaming giant argues that if the plaintiffs' challenge to its ability to produce provocative content were to succeed, "a long line of creative works, from classics like Anna Karenina, Antigone, The Awakening, Madame Bovary, and The Bell Jar, to countless recent works such as Dear Evan Hansen, The Perks of Being a Wallflower, The Wristband: A Love Story, and The Virgin Suicides," would also be in danger.
"Creators who are obliged to protect certain viewers from expressive works depicting suicide will inevitably censor themselves to avoid the threat of liability," Netflix's lawyers wrote in the new filing. "That would chill speech and limit the diversity of public discourse... The First Amendment does not permit such an outcome."
Based on the young-adult novel of the same name by author Jay Asher, 13 Reasons Why depicts the events that precipitated the suicide of its high-school-age narrator. Although the lawsuit against Netflix is brought by a single grieving family, a study published in the Journal of the American Academy of Child and Adolescent Psychiatry reported a 28.9% increase in suicides among Americans ages 10 to 17 in the month after 13 Reasons Why first aired, a larger increase than in any other single month during the five-year period the researchers studied.
In the motion to strike filed on Wednesday, Netflix's lawyers were careful to note that the platform is being sued not over the content of 13 Reasons Why itself, but rather over its "...failure to adequately warn about its offerings, i.e. its dangerous product features" and its use of "individualized data about its users to specifically target and manipulate at-risk children into viewing content that was deeply harmful to them, despite horrific warnings about the potential and foreseeable consequences for those children."
That recommendation system, driven by an algorithm, counts as protected speech, Netflix argues, likening it to a news editor's "exercise of editorial control and judgment":
"The recommendation system and the display of suggested titles are speech," the motion to strike states. "Plaintiffs contend the recommendations here are different because they are dictated by an algorithm. But the fact that recommendations may be algorithmically generated makes no difference to the analysis. After all, the algorithms themselves were written by humans..."
Netflix and the plaintiffs are scheduled to appear in court on November 16.
If you or someone you know is considering suicide, please contact the National Suicide Prevention Lifeline at 800-273-TALK (8255).