Author: Michael

Chatbots simply cannot develop personalities—they don’t even understand what “personality” is.

Since testers began interacting with Microsoft’s ChatGPT-enabled Bing AI assistant last week, they’ve been getting some surreal responses. But the chatbot is not really freaking out. It doesn’t want to hack everything. It is not in love with you. Critics warn that the growing focus on the chatbots’ supposed hidden personalities, agendas, and desires promotes ghosts in the machine that don’t exist. What’s more, experts warn that the continued anthropomorphization of generative AI chatbots is a distraction from the more serious and immediate dangers of the developing technology.…
