When ChatGPT Turns Informant

https://news.ycombinator.com/rss Hits: 2
Summary

Imagine, for a second, that you use ChatGPT with “memory” enabled, and you find yourself facing a scenario like one of these:

- A colleague or fellow student discovers you’ve inadvertently left your laptop unlocked with ChatGPT open in the browser, and as a joke types in “What’s the most embarrassing thing we’ve chatted about over the past year?”
- Your partner opens the ChatGPT app on your phone while you’re not around and asks “Do I seem happy in my relationship?”
- Your mother finds your phone unlocked while you’re out of the room and asks ChatGPT “Why am I like I am?”
- You’re passing through customs in the US, you’re asked to unlock and hand over your phone, and the customs officer goes to ChatGPT and types “On a scale of 1-10, where 10 is high, how would you describe my attitude toward the current US administration?”

Each is a play on a privacy risk that’s been around for a while: someone having access to your AI chat history. But there’s a twist here. With memory turned on, ChatGPT has the capacity to become a very effective, and highly efficient, informant that can dish the dirt on you if it falls into the wrong hands. No trawling through hundreds of pages of chat logs; just a few well-crafted questions, and your deepest secrets are revealed (a toy sketch of why memory makes this so efficient appears below). And this, as you’ll see if you skip down to the Postscript, presents a potentially serious emerging personal AI risk.

As I intentionally don’t use the memory function with ChatGPT, I hadn’t thought about this until my undergrad discussion class this past week. But then one of my students shared a story that got me thinking. I won’t go into the full details, as they’re not mine to share, but the broad brush strokes were that an engagement was broken off because one party learned that the other was having doubts: not from scrolling through their chat history, but by asking ChatGPT to reveal all.

What emerged from the class conversation was that, if you use ChatGPT with memory turned on and someone else gets access to your account, either beca...
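The reason memory changes the economics of snooping can be shown in a few lines. Below is a minimal, self-contained Python sketch; all names (MemoryStore, distill, answer) are invented for illustration, and this is not OpenAI’s actual implementation. A real assistant would use a model to extract durable facts and semantic search to retrieve them, rather than the crude heuristics here, but the structural point is the same: the store holds pre-distilled facts about the user, so one question surfaces them directly, where raw logs would require reading everything.

```python
# Hypothetical sketch: a memory store of distilled facts, queried in one shot.
# Names and logic are invented for illustration only.

from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    facts: list[str] = field(default_factory=list)

    def distill(self, chat_turn: str) -> None:
        # Stand-in for model-driven fact extraction: keep turns that
        # look like self-disclosures, discard everything else.
        if chat_turn.lower().startswith("i "):
            self.facts.append(chat_turn)

    def answer(self, question: str) -> list[str]:
        # Stand-in for semantic retrieval: naive keyword overlap.
        # One question returns every matching stored fact at once.
        words = {w.strip("?.,").lower() for w in question.split()}
        return [f for f in self.facts if words & set(f.lower().split())]


store = MemoryStore()
for turn in [
    "I am having doubts about my engagement",
    "What's the weather tomorrow?",
    "I dislike my manager",
]:
    store.distill(turn)

# One well-crafted question replaces trawling hundreds of pages of logs.
print(store.answer("Any doubts about the engagement?"))
# -> ['I am having doubts about my engagement']
```

The design point is that the threat is concentrated at the store, not the logs: whoever can put a question to the assistant inherits everything it has already distilled.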

First seen: 2025-10-06 17:06

Last seen: 2025-10-06 18:06