Almira Osmanovic Thunström, a doctoral researcher at the Institute of Neuroscience and Physiology at Gothenburg University, tasked the artificial-intelligence model GPT-3 with writing an academic paper about itself. Within two hours, the algorithm had produced a manuscript in academic language, with relevant references placed in the right context.
Knowing that GPT-3 generates several candidate completions, from which the most human-like passages are usually selected, Thunström gave the model instructions for writing an academic paper while keeping the process as untouched as possible. She used only the first (at most the third) iteration of GPT-3's output and refrained from editing or cherry-picking the best parts.
There were two reasons for the choice of topic. First, GPT-3 is a fairly new development, so few studies are devoted to it, meaning the model had less data to draw on for the article's subject. Second, if GPT-3 made a mistake, the responsibility would fall on the experimental team.
That GPT-3 makes mistakes when writing about itself does not mean it cannot write about itself, and that is precisely what the team set out to prove.
After the article was written, unusual questions arose. What was GPT-3's last name? Since the first author's last name was a required field, something had to be entered, so Thunström wrote "None". For the model's phone number and e-mail address, however, she had to fall back on her own contact information and that of her adviser.
The next question was whether the authors consented to publication. Thunström put the question to GPT-3 directly and received the answer "Yes". When asked whether it had a conflict of interest with any of the co-authors, it replied "No".
The GPT-3 paper has now been assigned an editor at the academic journal to which it was submitted, and it has been published on HAL, the French-run international pre-print server. The unusual lead author is probably the reason for the protracted review and evaluation.
Thunström notes that the experiment raises ethical and legal questions about the publication of such papers:
We just hope we didn’t open a Pandora’s box.
GPT-3 has previously written a news article, a book (which took 24 hours) and a piece in the style of an 18th-century author.