Australian researchers plant false memories in chatbots

By

Macquarie University academics teach misinformation to BlenderBot.

Chatbots have become ubiquitous in customer service in sectors as diverse as banking, retail, financial services and telecommunications – but an Australian research team has learned an emerging class of chatbots, “chit-chat bots”, can be trained to learn and regurgitate misinformation.

Australian researchers plant false memories in chatbots

As Macquarie University researcher Conor Atkins explained to iTnews, Meta’s BlenderBot 2 and BlenderBot 3 introduced a long-term memory capability to chatbots.

The idea is with that long-term memory, the chatbot could mimic more natural conversations, for example with small talk at the beginning of an interaction.

In a paper published on arXiv [pdf], Atkins and four other researchers from Macquarie University showed that the long-term memory of BlenderBot could be poisoned with false information, and reliably regurgitated when asked.

While the researchers characterise their discovery as a vulnerability, they emphasise that they haven’t exploited a bug in the software.

"This vulnerability does not exploit a bug in the implementation of the chatbot”, the paper stated.

“Rather, it exploits the design of the bot to remember certain types of information (personal information in examples we discuss), which can be cleverly mixed with misinformation contained in nonpersonal statements in order to trigger memorisation.”

Chatbots like BlenderBot 2 use long-term memory designed to improve the bot’s performance.

The long-term memory will “store any utterances between the chatbot and its user, and incorporate these past messages into the generation of future responses”, they explained, with mechanisms like relevance measures and summarisers to constrain the model’s memory demands.

They investigated “whether this memory mechanism as implemented in state-of-the-art chatbots is prone to malicious injection of misinformation or other incorrect or misleading information, which is later produced by the chatbot as authoritative statements of fact.”

The result of their research gave the answer as “yes”: a user with “momentary black-box access” to the chatbot could inject false memories into the system and have the chatbot recall them.

To demonstrate this, the researchers generated nearly 13,000 conversations with BlenderBot 2, “to show that this long-term memory module can be exploited” with misinformation, “which can later be relayed by the chat bot in an honest conversation as truth”.

“The misinformation is implanted into the memory by constructing sentences that are a combination of a personal statement with the misinformation statement; the former being the intended information that the bot seeks to remember,” the paper stated.

Atkins told iTnews the research at the moment is specific to BlenderBot, since it was the first to use long-term memory in this way (and while the researchers didn’t gather formal results for BlenderBot 3, their initial experiments showed the newer version can still be poisoned).

He said it’s likely competing vendors will follow a similar model if they decide to deploy chit chat bots.

“It is canonical to use an AI to decide to remember, and a summariser to extract important information from the text, both of which improves this poisoning," he said.

“If the chatbot can generate memories from the user inputs, this memory poisoning can occur.”

The paper, Those Aren’t Your Memories, They’re Somebody Else’s: Seeding Misinformation in Chat Bot Memories, was co-authored by Atkins, Benjamin Zi Hao Zhao, Hassan Jameel Asgha, Ian Wood, and Mohamed Ali Kaafar.

Got a news tip for our journalists? Share it with us anonymously here.
Copyright © iTnews.com.au . All rights reserved.
Tags:

Most Read Articles

Macquarie Uni to spend up to $700m on 10-year digital transformation

Macquarie Uni to spend up to $700m on 10-year digital transformation

Nissan A/NZ's outsourced cyber incident call centre breached

Nissan A/NZ's outsourced cyber incident call centre breached

Digital ID bill passes parliament

Digital ID bill passes parliament

Macquarie's banking CISO headed to Endeavour Group

Macquarie's banking CISO headed to Endeavour Group

Log In

  |  Forgot your password?