RICHMOND, Va. (Legal Newsline) - A federal appeals court has refused to pin the blame for a mass shooting in South Carolina on Facebook, rejecting a lawsuit that claimed the platform's algorithm radicalized murderer Dylann Roof.
Roof killed nine people and injured another at Emanuel African Methodist Episcopal Church in Charleston in hopes of starting a race war. The lawsuit alleged that Facebook parent Meta's desire for user engagement directed Roof to white supremacist propaganda.
The lawsuit was brought by the daughter of pastor Clementa Pinckney, who also served in the South Carolina Senate for 15 years before his murder at the age of 41.
Companies like Meta and YouTube routinely point to Section 230 of the Communications Decency Act when they are sued. It protects online platforms from liability for publishing third-party information.
Plaintiff M.P. challenged that protection in her lawsuit, but the U.S. Court of Appeals for the Fourth Circuit rejected her argument. It found that Facebook's use of an algorithm to present tailored third-party content to its users does not fall outside the scope of Section 230.
The algorithm's design is meant to facilitate radicalization and compulsive use by driving extreme emotional reactions, M.P. argued.
"M.P. thus characterizes her state tort claims as solely dealing with Facebook's algorithm, which she treats as a 'product,'" Judge Barbara Milano Keenan wrote for a three-judge panel.
"We are not persuaded by M.P.'s argument."
The Fourth Circuit found M.P.'s claims are "inextricably intertwined" with Facebook's role as a publisher of third-party content. It said M.P. couldn't show the algorithm was designed in an unreasonably dangerous manner.
"Indeed, without directing third-party content to users, Facebook would have little, if any, substantive content," Keenan wrote.
"Simply stated, M.P. takes issue with the fact that Facebook allows racist, harmful content to appear on its platform and directs that content to likely receptive users to maximize Facebook's profits."
Decisions by Facebook on what third-party content to show its users do not defeat its status as a publisher, Keenan wrote, just like a newspaper that prioritizes engagement in sorting its content.
"And the fact that Facebook uses an algorithm to achieve the same result of engagement does not change the underlying nature of the act that it is performing," she wrote.
"Decisions about whether and how to display certain information provided by third parties are traditional editorial functions of publishers, notwithstanding the various methods they use in performing that task."
Judge Allison Jones Rushing concurred but wrote Section 230 protections shouldn't extend to Facebook when it recommends that a user join a group, connect with another user or attend an event. That is Facebook's "own speech," she wrote.
"Unlike the majority, I would reverse the district court's dismissal of M.P.'s negligence claims on Section 230 grounds and remand her claims regarding Facebook's own conduct, including its group recommendations, for further proceedings," she wrote.