ChatGPT has proven to be extremely popular since its launch late last year.
People have used artificial intelligence for everything from writing work reports to making diet plans to applying for jobs.
MailOnline looks at five things ChatGPT can’t do, from playing the popular online game Wordle to remembering its own name.
Five things ChatGPT artificial intelligence can’t do
1. It can’t remember its own name
It’s amazing what ChatGPT can do, but one thing it doesn’t know is who it really is. Mention ChatGPT to it and it will tell you it “doesn’t know anything about a ChatGPT”.
This is probably because its creators had not yet settled on a name for it when it was trained.
2. It can’t provide information after 2021
ChatGPT’s training data cuts off in 2021. This means it is not aware of current events, trends, or anything that has happened since.
So despite being trained on a huge amount of text data to generate human-like responses to text-based prompts, it doesn’t know what the world has been like for the past two years because it is not connected to the internet. Does it remind you of Terminator 3?
3. It can’t play Wordle
It doesn’t know how to play the popular online word game and doesn’t understand its premise or rules. This is because Wordle gained popularity after 2021, and ChatGPT cannot provide information beyond that point.
4. It can’t write accurate news articles
It has the ability to pass legal exams, write entire articles, and even code entire websites.
It has even been suggested that ChatGPT could one day render journalists obsolete by writing stories for them.
Tests have shown that ChatGPT can write quite well, but many of its stories contain errors. This is because it tends to fabricate information without flagging that it has done so.
The bot can handle some tasks well, including writing quick intros, but it lacks the nuance and depth to write articles on its own.
Indeed, the bot itself warns that it may “occasionally generate incorrect information” and “produce harmful instructions or biased content”.
5. It can’t give advice on prescription drugs
ChatGPT recently caused a stir in the medical community after it was found to be able to pass the gold standard exam required to practice medicine in the United States, raising the possibility that it could one day replace human doctors.
However, one area where it should not be used is for prescription drug advice.
Why anyone would want to trust an AI bot instead of a real doctor is another question, but if you ask ChatGPT about prescription drugs, it will tell you that it cannot provide advice and will suggest you speak to a professional instead.
The AI will, however, provide basic medical advice, including recommendations for over-the-counter medications.
Source: Eluniverso

Mabel is a talented author and journalist with a passion for all things technology. As an experienced writer for the 247 News Agency, she has established a reputation for her in-depth reporting and expert analysis on the latest developments in the tech industry.