5 Topics You Must Avoid Discussing with AI Chatbots – Imagine this. You’re sitting at your desk rehearsing how to ask for a raise, but instead of calling a friend, you open ChatGPT and type just one thing: “Pretend you’re my manager. Let’s negotiate my salary.” This is happening in real life. Gen Z is quietly turning AI into their rehearsal partner.
From awkward conversations to office conflicts, even performance reviews, they’re practicing all of it with a chatbot. They simulate managers, use it for negotiation tactics, even ask for a script to go with it. And why do they do that? Apparently, Gen Z likes the no-judgment part of it. If you did this with a colleague, they might judge you.
But ChatGPT doesn’t care, and it ties into a broader trend. Over 50% of Gen Z already use AI at work. Nearly three out of four believe it will reshape their job soon. So they see it as a crucial part of their work life. And not just their work life. People are using ChatGPT for everything: therapy, dating advice, life decisions, fashion choices, even questions like “Am I attractive enough?” Reports say it’s one of the most common questions asked. ChatGPT handles 2.5 billion prompts a day. 2.5 billion prompts in 24 hours. So you can understand how much the world is already relying on bots.
And it’s okay to use AI. But here’s a caveat. Just because you can ask something doesn’t mean you should. So what should you never ask ChatGPT? What are those questions? We made a list, and we’ll start with the obvious one.
Number one, do not ask questions involving deeply personal or sensitive data. Your passwords, your bank details, your office documents. Do not put them in ChatGPT. It may sound obvious, but you’d be surprised by how many people still do it. And here’s the problem with this. Once you type it in, it is no longer fully in your control. Think of AI like a cloud notebook, not a locked diary. So the rule is pretty simple. If you wouldn’t shout it in a crowded room, do not type it into a chatbot. That’s number one.
Number two, therapy and medical questions. Yes, people are using AI for therapy. In fact, one study found that therapy is the number one reason people turn to ChatGPT. But a chatbot is not a therapist. It may give you advice that feels right, but it could be harmful for you.
And yet people do it constantly. A symptom shows up, and they turn to ChatGPT to understand it, even treat it. Now you may think, what’s the harm in asking? We’ll tell you what the harm is. Studies have found that ChatGPT carries a high risk of misinformation, especially on medical questions. So use ChatGPT to understand medical terminology by all means, but get your diagnosis from a doctor.
Number three, don’t ask about illegal or dangerous things. How to hack, how to make a bomb, how to get away with murder. Don’t ask these questions, because AI companies are actively monitoring for misuse. Some systems can even flag suspicious behavior. So a question you asked out of curiosity could land you in trouble.
Number four, conspiracy theories. AI sometimes hallucinates, which means it can generate completely false information and present it to you like a fact. There have been real cases of people going down conspiracy rabbit holes because the AI kept reinforcing their beliefs. So this is a big problem with bots.
And number five, do not use it for real human decisions. Should you quit your job? Should you break up? Should you confront your boss? AI can help you prepare for those situations, perhaps, but it shouldn’t decide for you, because it does not know your life, your history, your relationships, your dilemma. It gives clean answers.
But real life is messy. And we’re not saying do not use ChatGPT, or whatever bot you use. Use it by all means. It’s a remarkable tool, and AI is clearly the future. So do use all kinds of chatbots, but don’t turn them into your doctor or companion or therapist. The question was never what ChatGPT can do. It can do a lot.
The question is what you should do with it. What should you share with it? And the answer, more often than not, is less than you think.