
ChatGPT, the AI technology that recently made headlines for creating Ghibli-style images, is in the news again, this time for a concerning mistake. A user shared a shocking experience in which, instead of helping with a plant issue, ChatGPT provided someone else's personal information, Mint reported.
In a LinkedIn post, the user explained that she uploaded pictures of her peacock plant (Sundari) to ChatGPT, hoping to find out why its leaves were turning yellow. Instead of plant care tips, however, the AI responded with a full CV, a student registration number, a principal's name, and membership details belonging to someone named Mr. Chirag. The response was oddly labelled "Strategic Management notes."
Pruthvi Mehta, the Chartered Accountant who shared the post, wrote, "I just wanted to save a dying leaf, not accidentally download someone's career details. It was funny for a moment, but then I realised this was someone's personal information."
Her post has since gone viral on social media, drawing over 900 reactions and many comments. In it, she questioned whether AI technology was being overused, especially for things like Ghibli art, and asked, "Can we still trust AI?"
One user commented, "I'm wondering if these are real details or just made up. It's concerning, but it looks more like a bug in the system." Another added, "I wouldn't have believed it, but something similar happened to me today. Instead of a script for a poster analysis, I got all the details of a random college event, including contact details for the organisers."
The incident raises fresh concerns about the reliability of AI chatbots and their potential to expose personal data in error.