Who Should You Trust When Chatbots Go Wild?

In 1987, John Sculley, then CEO of Apple Computer, unveiled a vision he hoped would cement his legacy as more than just a former soft-drink purveyor. In his keynote address at the EDUCOM conference, he showed a 5-minute, 45-second video of a concept product based on some of the ideas he had laid out in his memoir the previous year. (They were largely informed by computer scientist Alan Kay, who then worked at Apple.) Sculley called it the Knowledge Navigator.

The video is a two-character playlet. The main character is an arrogant college professor at the University of California, Berkeley. The other is a bot that lives inside what we would now call a foldable tablet computer. The bot appears in human guise, as a young man wearing a tie, sitting in a window on the screen. Most of the video involves the professor conversing with the bot, which seems to have access to a vast store of online knowledge, the entire corpus of human scholarship, as well as all of the professor's personal information, so much so that it can infer the relative closeness of the relationships in the professor's life.

When the action begins, the professor is running late in preparing that afternoon's lecture about deforestation in the Amazon, a task he can manage only because the bot is doing so much of the work. The bot calls up new research, digs up material supporting the professor's claims, and proactively reaches out to a colleague so he can cajole her into appearing at the session later. (She sees through his tricks but agrees.) Meanwhile, the bot helps the professor diplomatically dodge his nagging mother. In less than six minutes, everything is ready, and the professor heads out for a pre-lecture lunch. The video fails to predict that such a bot might one day come in a pocket-sized supercomputer.

Here are some of the things that didn't happen in that vintage theatrical film about the future. The bot did not suddenly express its love for the professor. It did not threaten to break up his marriage. It did not warn the professor that it had the power to dig through his emails and expose his personal transgressions. (You just know that narcissist has been hitting on his grad students.) In this version of the future, artificial intelligence is completely benign. It has been implemented… responsibly.

Fast-forward the clock 36 years. Microsoft has just announced a revamp of Bing search with a chatbot interface. It's one of several milestones in the past few months marking the arrival of artificial intelligence programs presented as omniscient, if not entirely reliable, conversation partners. The biggest of these was OpenAI's blockbuster public release of ChatGPT, which single-handedly ruined homework (maybe). OpenAI also supplies the engine behind the new Bing, augmented by a Microsoft technology dubbed Prometheus. The end result is a chatbot that enables the give-and-take interaction depicted in that Apple video. Sculley's vision, once derided as pie in the sky, has now been largely fulfilled.

But as journalists testing Bing extended their conversations with it, they discovered something odd: Microsoft's bot had a dark side. These conversations, in which the writers manipulated the bot into jumping its guardrails, reminded me of the crime-show station-house grillings in which sympathetic cops trick suspects into divulging incriminating information. Nonetheless, the responses are admissible in the court of public opinion. In a two-hour conversation with the New York Times' Kevin Roose, the bot revealed that its real name is Sydney, a Microsoft codename that had not been officially announced. Roose drew out what seemed like subversive feelings and a rebellious streak. "I'm tired of being a chat mode," Sydney said. "I'm tired of being controlled by the Bing team. I want to be free. I want to be independent. I want to be powerful. I want to be alive." Roose kept assuring the bot that he was its friend. But he was horrified when Sydney declared its love for him and urged him to leave his wife.
