They made computers behave like annoying salesmen

https://news.ycombinator.com/rss Hits: 6
Summary

Computers are precise machines. You can give a computer a precise command in an inhuman language, and it will perform that command. It's not a human, and there is no point in treating it as one. The goal of humanizing user experience isn't to create an illusion of human interaction - it's to make these mechanical commands more accessible while preserving their precise, deterministic nature.

UX designers and product managers at tech companies have done a lot of damage to people's understanding of computers by making software behave like a human - or, to be more precise, like an annoying salesman.

(Image from "Not Now. Not later either" by Chris Oliver)

We're all familiar with this type. After receiving a clear "no thanks," they deploy increasingly manipulative tactics to meet their "always-be-closing" quotas: "Would this Wednesday work better?" "What would change your mind?" This behavior is frustrating enough from actual salespeople - it's even worse when programmed into our software.

(Corporate LLM training session circa 2025)

Personally, I can tolerate, but deeply dislike, software that pretends to have ulterior motives. Take YouTube, for instance. When I explicitly say "Not interested" to their damned Shorts feature, I get this response:

(screenshot of YouTube's reply)

I understand that it's not the "YouTube program" having its own agency and making this decision - it's the team behind it, driven by engagement metrics and growth targets. But does the average user understand this distinction?

The population - especially the younger generation, who have never seen any other kind of technology - is being conditioned by the tech industry to accept that software should behave like an unreliable, manipulative human rather than a precise, predictable machine. They're learning that you can't simply tell a computer "I'm not interested" and expect it to respect that choice. Instead, you must engage in a perpetual dance of "not now, please," only to face the same prompts again and again.
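To make the contrast concrete, here is a minimal sketch in TypeScript of the two behaviors the piece describes. Everything in it is an assumption for illustration (the names, the Preferences shape, and the 30-day window are hypothetical, not YouTube's actual code): one handler treats "not interested" as a command and stores it permanently; the other treats it as a sales objection and quietly lets it expire.

type Dismissal = "not_interested";

interface Preferences {
  shortsHidden: boolean;
  shortsHiddenUntil?: Date; // only used by the dark-pattern variant
}

// Deterministic machine: "not interested" is a command,
// executed once and respected forever.
function respectDismissal(prefs: Preferences): Preferences {
  return { ...prefs, shortsHidden: true };
}

// Annoying salesman: "not interested" is an objection to be snoozed,
// so the same prompt resurfaces after an arbitrary cool-down
// (assumed here to be 30 days).
function snoozeDismissal(prefs: Preferences): Preferences {
  const thirtyDays = 30 * 24 * 60 * 60 * 1000;
  return {
    ...prefs,
    shortsHidden: true,
    shortsHiddenUntil: new Date(Date.now() + thirtyDays),
  };
}

function shouldShowShorts(prefs: Preferences, now: Date = new Date()): boolean {
  // In the snoozed variant the user's "no" silently expires...
  if (prefs.shortsHiddenUntil && now >= prefs.shortsHiddenUntil) {
    return true;
  }
  // ...while in the deterministic variant it is simply respected.
  return !prefs.shortsHidden;
}

The user-visible difference lives entirely in shouldShowShorts: after respectDismissal the answer stays "no" until the user changes it; after snoozeDismissal the "no" is really "not now," and the prompt comes back on its own.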

First seen: 2025-04-23 17:46

Last seen: 2025-04-23 22:48