Don't use ChatGPT to do these 11 things!

ChatGPT and other AI chatbots can be powerful natural language tools, especially if you know how to write prompts. You can use ChatGPT to save money on commuting, plan your weekly meal prep, or even help you change careers.


If you're not sure where ChatGPT's limits are, here are 11 situations where you should stop using AI and choose another option. Don't use ChatGPT for any of the following purposes!

1. Diagnose physical health problems

Many people have submitted their symptoms to ChatGPT out of curiosity, but the answers they get back can read like a worst-case scenario: look closely at the potential diagnoses and you can go from dehydration and the flu to some type of cancer.

That's not to say that ChatGPT can't be useful for your health: It can help you draft questions for your next appointment, translate medical terminology, and organize a timeline of symptoms so you're better prepared. And that can make doctor visits less overwhelming. But AI can't order tests or examine you, and it certainly doesn't have medical liability insurance. Know its limits!

2. Mental health care

ChatGPT can suggest techniques to help you calm down, but it can't pick up the slack when you're really struggling with your mental health. Some people use ChatGPT as a substitute therapist. CNET's Corin Cesaric found it somewhat helpful for working through grief, as long as you keep its limitations in mind. But ChatGPT is a pale imitation of a therapist at best, and incredibly risky at worst.


ChatGPT has no life experience, can't read your body language or tone of voice, and has no real empathy. It can only simulate it.

3. Make urgent safety decisions

If your carbon monoxide alarm starts going off, please don't open ChatGPT and ask if you're really in danger. Large language models can't smell gas, detect smoke, or dispatch emergency services. In a crisis, every second you spend typing is a second you could be evacuating or calling for help. ChatGPT can only work with the bits of information you provide, and in an emergency, that information may be too little, too late. So think of your chatbot as a post-incident explainer, never a first responder.

4. Personal financial or tax planning

ChatGPT can explain what an ETF is, but it doesn't know your debt-to-income ratio, local tax rates, filing status, deductions, retirement goals, or risk tolerance. And since its training data may predate the current tax year and the latest changes to tax law, its guidance may already be out of date by the time you hit Enter.

5. Processing of confidential or regulated data

Never even think about pasting an embargoed press release into ChatGPT to get a summary or further explanation. If you do, the text is no longer under your control: it lands on a third-party server, outside the boundaries of your NDA.

The same risk applies to customer contracts, medical charts, or anything protected by the California Consumer Privacy Act, HIPAA, GDPR, or plain old trade secret law.

6. Do anything illegal

This one is self-explanatory.

7. Cheating on schoolwork

Turnitin and similar tools get better at detecting AI-generated prose every semester, and professors can already spot the 'smell' of ChatGPT in student papers. Suspensions, expulsions, and license revocations are real risks. Treat ChatGPT as a study buddy, not a ghostwriter. Besides, you're cheating yourself out of an education if you let ChatGPT do the work for you.


8. Keep up with breaking news and information

Since OpenAI launched ChatGPT Search in late 2024 (and opened it to all users in February 2025), the chatbot can pull up fresh web pages, stock quotes, gas prices, sports scores, and other real-time numbers as soon as you ask, complete with clickable citations so you can verify the source. It still doesn't stream continuous updates, though, so for fast-moving breaking news you're better off with live news feeds and official alerts.

9. Gambling

ChatGPT is prone to hallucinations: it can serve up inaccurate player stats, misreported injuries, and wrong win-loss records. And it certainly can't see tomorrow's scoreboard, so don't rely on it to pick winners.

10. Draft a will or other legally binding contract

ChatGPT is great for breaking down the basics. If you want to know more about revocable trusts, ask away. But the moment you ask it to draft an actual legal document, you're putting yourself at risk. Family and estate law rules vary by state, so omitting a witness signature or skipping a notarization clause could invalidate your entire document. Let ChatGPT help you build a checklist of questions for your attorney, then pay that attorney to turn the checklist into a document that will stand up in court.

11. Make art

This one is an opinion rather than an objective truth, and it isn't meant as a knock on artificial intelligence in general. ChatGPT can help you brainstorm new ideas or polish your titles, but it's a supplement, not a replacement. By all means use ChatGPT, just don't use it to create art that you then claim as your own. That's simply unethical!
