Host accuses AI of making up a "false and malicious" case against him
Artificial intelligence faces its first defamation lawsuit, with a radio host from Georgia alleging that ChatGPT made up “false and malicious” accusations against him.
Gun rights advocate and radio show host Mark Walters has filed a complaint against OpenAI in a state court in Gwinnett County, Georgia, seeking general and punitive damages among other relief.
In his suit, Walters alleged that a journalist interacted with ChatGPT while seeking more information on a case, filed by the Second Amendment Foundation, that the journalist was researching at the time.
The case was Second Amendment Foundation v. Robert Ferguson. The journalist provided ChatGPT with a link to the complaint filed by the foundation – the correct one, Walters noted – and asked the AI platform for a summary of the foundation’s accusations against the defendants in the case provided.
ChatGPT replied that the case had been brought by the founder of the Second Amendment Foundation against Mark Walters, accusing him of defrauding funds from the foundation while serving as the organisation’s treasurer and chief financial officer, and of manipulating bank statements and omitting proper financial disclosures to the group’s leadership in order to conceal his theft.
Walters was not connected to or employed by the foundation at any time, he clarified in his suit, although tech publication Ars Technica reported that Walters’ prominent commentary on gun rights and the Second Amendment Foundation may have led ChatGPT to “wrongly connect dots”.
The foundation gave Walters an award in 2017 for his gun rights advocacy, Ars pointed out.
When the journalist asked for a snippet of the complaint that referred to Walters, the chatbot delivered – with even more hallucinated details bearing no resemblance to the actual complaint, such as: “Walters has served as the treasurer and chief financial officer of SAF since at least 2012.”
The journalist later confirmed with the founder of the Second Amendment Foundation that ChatGPT’s version of the case was false.
By sending these false allegations to the journalist, Walters said, “OpenAI published [libellous] matter regarding Walters.”
But experts suggest that Walters’ case may not be a winning one. Defamation expert Clare Locke told Bloomberg Law that plaintiffs in Georgia are limited to seeking and recovering actual economic losses if they do not ask the defaming party for a retraction at least seven days before filing suit.
John Monroe, who represents Walters in the suit against OpenAI, told Bloomberg Law: “Given the nature of AI, I’m not sure there is a way to retract.”
Eugene Volokh, a First Amendment professor, added that there are only two circumstances in which a person can be held liable for defamation: when the defendant knew the statement they made was false, or recklessly disregarded the likelihood that it was false; or when the defendant negligently made a false statement about a private person, and that person proved actual damages.
Walters did not notify OpenAI that ChatGPT had made a false statement about him, Volokh pointed out, and did not allege actual damages in his suit.