Brian Hood, the mayor-elect of Hepburn Shire, 120 kilometres northwest of Melbourne, became concerned for his reputation when members of the public told him that ChatGPT had wrongly named him as a guilty party in an international bribery scandal involving a Reserve Bank of Australia subsidiary in the early 2000s.
Although Hood did work for the subsidiary, Note Printing Australia, he was the person who alerted authorities to bribes paid to foreign officials to win currency printing contracts, and, according to his attorneys, he was never charged with a crime.
On March 21, the lawyers sent a letter of complaint to OpenAI, the company behind ChatGPT, giving it 28 days to correct the errors about their client or face a potential defamation lawsuit.
The attorneys said San Francisco-based OpenAI had not yet responded to Hood’s legal letter. OpenAI did not respond to an after-hours request for comment.
If Hood sues, it would likely be the first time a person has sued the owner of ChatGPT over claims made by the automated language product, which has become wildly popular since its launch last year. In February, Microsoft Corp. integrated ChatGPT into its Bing search engine.
A Microsoft spokesperson could not immediately be reached for comment. According to James Naughton, a lawyer at Hood’s law firm Gordon Legal, “it would potentially be a landmark moment in applying this defamation law to a new area of artificial intelligence and publication in the IT world.”
According to Naughton, Hood’s reputation is essential to his role as an elected official: he relies on a public record of exposing corporate wrongdoing, so it matters to him if members of his community are accessing false material about him.
Defamation damages awards in Australia are generally capped at $400,000 (US$269,360). Hood does not know precisely how many people have accessed the false material about him, which is a factor in determining the size of any payout, but Naughton said the seriousness of the defamatory statements meant Hood might be able to claim more than $200,000.
According to Naughton, if Hood brought a lawsuit, he would accuse ChatGPT of giving users a false impression of accuracy by omitting footnotes citing its sources.
“It’s very difficult for somebody to look behind that and ask, ‘How does the algorithm come up with that answer?’” Naughton said. “It’s highly opaque.”