A regional Australian mayor has threatened to sue OpenAI if it does not correct ChatGPT’s false claims that he had served time in prison for bribery.
Brian Hood, who was elected mayor of Hepburn Shire, northwest of Melbourne, last November, became concerned about his reputation when members of the public told him ChatGPT had falsely named him as a guilty party in a foreign bribery scandal.
Lawyers representing Hood said he had worked for a subsidiary of the Reserve Bank of Australia and was the whistleblower who notified authorities about bribes paid to foreign officials to win currency-printing contracts; he was never charged with a crime.
The lawyers said they sent a letter of concern to ChatGPT’s owner, OpenAI, on March 21, giving the company 28 days to fix the errors about their client or face a possible defamation lawsuit.
OpenAI, which is based in San Francisco, had not yet responded to Hood’s legal letter, the lawyers said.
If Hood sues, it would likely be the first time a person has sued the owner of ChatGPT over claims made by the chatbot.
A spokesperson for Microsoft, a major investor in OpenAI, was not immediately available for comment.
James Naughton, a partner at Hood’s law firm Gordon Legal, told Reuters it would be a landmark moment, applying defamation law to the new area of artificial intelligence and publication in the IT space.
“He’s an elected official; his reputation is central to his role,” Naughton said, noting that Hood relied on a public record of shining a light on corporate misconduct. “So it makes a difference to him if people in his community are accessing this material.”
Australian defamation damages payouts are generally capped at around AUD400,000 ($269,360). Hood did not know the exact number of people who had accessed the false information about him, but the nature of the defamatory statements was serious enough that he may claim more than AUD200,000, Naughton said.
If Hood files a lawsuit, it would accuse OpenAI of giving ChatGPT users a false sense of accuracy by failing to include footnotes.
“It’s very difficult for somebody to look behind that to say, ‘How does the algorithm come up with that answer?’” Naughton said. “It’s very opaque.”