ChatGPT is a huge misinformation generator

The chatbot is powerful and efficient, yet it will also confidently invent detailed information about non-existent subjects, especially scientific ones.

After the surprise and the giggles, ChatGPT is starting to become worrying. The chatbot, which impresses with its sophistication and efficiency, is also turning into a huge generator of misinformation, especially in the field of science.

On Twitter, accounts of this are mounting. Teresa Kubaka, a data scientist, explains that she asked ChatGPT about multiferroism, the subject of her physics PhD.

“It produced reasonably convincing explanations and good citations. So far so good — until I checked the citations,” she explains.

At first, the chatbot refused to provide citations from researchers on the subject. Like many users, she got around that refusal by “pretending” to be a researcher herself: in that case, which references should she consult?

ChatGPT duly provided a bibliography for the question, with authors and supporting citations. But Teresa Kubaka quickly realized that many of the citations were completely fabricated. She also points out that “some of the authors appeared to be real researchers, but from different disciplines.”

“Some of the citations appeared to be a mix of real ones, different but similar,” she continues, noting that everything ChatGPT provides sounds “credible” and “without factual errors,” whether or not it has anything to do with the truth. And that is only “scratching the surface.”

“My first reaction was to wonder if this could be true!”

“Then I decided to ask ChatGPT about something that doesn’t exist: a cycloidal inverted electromagnon,” continues Teresa Kubaka. The chatbot came up with an answer anyway, asserting that the question “has been the subject of much research in recent years.”

“That’s where it got really scary: ChatGPT had somehow invented an explanation for a non-existent phenomenon, using language so sophisticated and plausible that I wondered whether it could be true!”

Here again, the references provided by ChatGPT mix the real and the invented, citing in particular a researcher who works in an entirely different field.

“Moral of the story: don’t ask ChatGPT to give you factual, scientific information,” concludes Teresa Kubaka.

This example is not unique. Another Twitter user pointed out that ChatGPT simply invents names in order to answer the questions it is asked.

Thomas Le Roy, journalist, BFM Business
