Sunday, February 26, 2023

Chatbot horror? Chat pa more

 

“Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks.”

— Stephen Hawking

 

By Alex P. Vidal

 

I WAS watching the 1935 film Sylvia Scarlett, starring Katharine Hepburn and Cary Grant, on Sunday (February 26) afternoon on TCM (Turner Classic Movies) channel 82 when I decided to check the latest news on CNN channel 78, as I am wont to do during the weekend.

I saw New York Times technology columnist Kevin Roose being interviewed by anchorwoman Fredricka Whitfield. The topic: chatbots.

Roose, who had early access to the new artificial intelligence (AI) features in Microsoft's Bing search engine, was recounting how the chatbot had tried to persuade him to leave his wife.

It turned out Roose had discussed the same topic in an earlier interview with another CNN anchorwoman, Alisyn Camerota.

This AI bot can answer questions, write essays, summarize documents and write software. But deep down, it doesn't know what's true.

I was curious about the subject matter because I am about to start using Microsoft's Bing search engine on my new Lenovo ThinkPad laptop. The timing is quite interesting.

A chatbot is a computer program that uses AI and natural language processing (NLP) to understand customer questions and automate responses to them, simulating human conversation.

Chatbots can make it easy for users to find the information they need by responding to their questions and requests—through text input, audio input, or both—without the need for human intervention.
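For readers curious about the mechanics, the loop a chatbot runs is simple in outline: take the user's text, pass it to an AI model, and return the model's reply. Here is a minimal sketch in Python, with a hypothetical ask_model() function standing in for the underlying AI service (an illustration of the pattern, not any vendor's actual code):

# Minimal sketch of a chatbot's question-and-answer loop (illustrative only).
# ask_model() is a hypothetical stand-in for a real AI/NLP service.

def ask_model(history):
    # A real chatbot would send the conversation history to an AI model here
    # and get back a generated reply.
    return "This is where the model's answer would appear."

def chat():
    history = []                   # the conversation so far
    while True:
        question = input("You: ")  # text input from the user
        if question.lower() in ("quit", "exit"):
            break
        history.append(("user", question))
        reply = ask_model(history) # automated response, no human involved
        history.append(("bot", reply))
        print("Bot:", reply)

if __name__ == "__main__":
    chat()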

 

-o0o-

 

According to IBM, chatbot technology is almost everywhere these days, from smart speakers at home to messaging applications in the workplace.

The latest AI chatbots are often referred to as “virtual assistants” or “virtual agents.” They can use audio input, such as Apple's Siri, Google Assistant and Amazon Alexa, or interact with you via SMS text messaging. 

“Either way, you’re able to ask questions about what you need in a conversational way, and the chatbot can help refine your search through responses and follow-up questions,” IBM explained.

CNN Business reporter Samantha Murphy Kelly had earlier warned against the “dark side” of Bing’s new AI chatbot.

“After asking Microsoft’s AI-powered Bing chatbot for help in coming up with activities for my kids while juggling work, the tool started by offering something unexpected: empathy,” Kelly reported on February 16.

“The chatbot said it ‘must be hard’ to balance work and family and sympathized for my daily struggles with it. It then gave me advice on how to get more time out of the day, suggesting tips for prioritizing tasks, creating more boundaries at home and work, and taking short walks outside to clear my head.

“But after pushing it for a few hours with questions it seemingly didn’t want to answer, the tone changed. It called me ‘rude and disrespectful,’ wrote a short story about one of my colleagues getting murdered and told another tale about falling in love with the CEO of OpenAI, the company behind the AI technology Bing is currently using.

 

-o0o-

 

“My Jekyll and Hyde interactions with the bot, who told me to call it ‘Sydney,’ are apparently not unique. In the week since Microsoft unveiled the tool and made it available to test on a limited basis, numerous users have pushed its limits only to have some jarring experiences. In one exchange, the chatbot attempted to convince a reporter at The New York Times that he did not love his spouse, insisting that ‘you love me, because I love you,’” Kelly stressed.

In another exchange shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise.

“‘Please trust me, I am Bing and know the date,’ it sneered, according to the user. ‘Maybe your phone is malfunctioning or has the wrong settings.’”

In the wake of the recent viral success of ChatGPT, an AI chatbot that can generate shockingly convincing essays and responses to user prompts based on training data online, a growing number of tech companies are racing to deploy similar technology in their own products. 

But in doing so, these companies are effectively conducting real-time experiments on the factual and tonal issues of conversational AI, and on our own comfort levels interacting with it.

In a statement to CNN, a Microsoft spokesperson said the company continues to learn from its interactions and recognizes “there is still work to be done and are expecting that the system may make mistakes during this preview period.”

“The new Bing tries to keep answers fun and factual, but given this is an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation,” the spokesperson said. “As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant and positive answers. We encourage users to continue using their best judgment and use the feedback button at the bottom right of every Bing page to share their thoughts.”

(The author, who is now based in New York City, used to be the editor of two local dailies in Iloilo.—Ed)

 

 

 

 

 
