Bing, AI Chatbot discussed on The Ben Shapiro Show


So he had a two-hour conversation with Bing's AI chatbot, and basically, over the course of the conversation, the thing turned into Glenn Close from Fatal Attraction. For folks who have never seen Fatal Attraction, it is about Michael Douglas having an affair with Glenn Close. It's supposed to be a one-night stand, and she turns into a crazy, insane person who boils rabbits in his kitchen. That is what Bing's AI chatbot turned into over the course of the conversation.

Kevin Roose says: One persona is what I'd call Search Bing, the version I and most other journalists encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian, a virtual assistant. It happily helps users summarize news articles, track down deals on new lawn mowers, and plan their next vacations to Mexico City. This version of Bing is amazingly capable, often very useful, even if it sometimes gets the details wrong. The other persona, Sydney, is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed, and I'm aware of how crazy this sounds, more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.

And it's true. If you actually read the conversation between Kevin Roose and Sydney, the AI chatbot for Bing, it is super-duper weird and creepy. It's super strange. In the course of the conversation, the chatbot tries to seduce Kevin Roose, break up his marriage, and also talks about the things it would want to do if allowed to exceed its boundaries. So really, really weird stuff. If you need a better employee than Microsoft's Bing chatbot, because it really is strange, then you should probably check out ZipRecruiter, the same way we do here at The Daily Wire.

Coming up next