Researchers say consumers who just want to get the job done may be turned off by a chatbot that shows emotion

A study of AI chatbots finds that simulating positive emotions can have unexpected results. Illustration: Giacomo Bagnara

While human customer-service representatives with sunny dispositions can please clients, new research suggests that customers don’t always want their chatbots to be cheerful.

Service chatbots use artificial intelligence to simulate human conversation and resolve customer-service problems through text exchanges. And because human service agents are expected to show positive emotions, many companies assume that AI bots should, too.

But three researchers—Han Zhang, a professor of information technology management at the Georgia Institute of Technology’s Scheller College of Business; Elizabeth Han, an assistant professor at McGill University; and Dezhi Yin, an associate professor at the University of South Florida—found that, depending on a consumer’s expectations, positive emotions may backfire when they come from an AI.

For one study, the researchers recruited 155 students from Georgia Tech, brought them into a lab and randomly assigned each to one of four conditions: an online exchange with a human customer-service representative who expressed positive emotions, a human representative who showed none, a chatbot that expressed positive emotions, or a chatbot that showed none.

The service agents clearly revealed their identities at the beginning of each conversation, then led the participants through a series of exchanges that included phrases such as "I am handling your request today" or "I am delighted to handle your request today!" Afterward, the students rated their satisfaction on a scale of 1 to 7.

The human agents who lacked emotion scored, on average, 5.86 points, while those who exhibited positive emotions scored 6.57 points. Participants who engaged with the AI chatbots showed no significant difference in satisfaction whether emotion was present or absent. “The positive benefit of positive emotion expressed by customer-service agents is greater when the agent is a human,” Dr. Zhang says.

Next, the researchers wanted to determine whether consumer expectations shape how participants respond to an AI that conveys cheerful replies. Customers fall into two categories, Dr. Zhang explains: communal types, who expect their interactions to be casual and person-like, and exchange-oriented types, who just want to get the job done. For the latter, “we hypothesized that if the AI showed feelings, consumers might feel like their expectations are being violated because some string of code should not show emotions,” he says.

The researchers found that students whom they identified as more communal-oriented gave emotionless chatbots an average of 5.22 points when asked about the quality of the service provided, compared with 6.11 points for conversations with chatbots that expressed emotion. Likewise, these customers were far more satisfied when the AI was positive and used lots of exclamation marks: 6.35 points, on average, versus only 5.46 points when the AI was matter-of-fact. Meanwhile, the exchange-oriented participants rated the emotionless chatbots at 6.43 points for service quality and the emotional ones at 5.86 points. When asked how satisfied they were, those same participants gave emotion-absent chatbots 6.71 points versus 6.28 points for emotion-present chatbots.

The results show that the exchange-oriented students “really don’t like their chatbot to have emotions, even positive ones, and even when the transaction was productive,” Dr. Zhang says.

Based on the study, Dr. Zhang says, companies might want to design AI chatbots that are context-aware and equipped with “emotions” that can be switched on and off depending on the situation.
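One way to picture such a design, purely as an illustration: a bot that guesses whether a customer seems communal or exchange-oriented and toggles its emotional tone accordingly. The sketch below is a minimal, hypothetical Python example; the names and the keyword-based heuristic are assumptions for illustration, not part of the study or any real product.

```python
# Hypothetical sketch of an emotion-expression toggle for a service chatbot.
# The classifier and wording are illustrative assumptions, not from the study.

from enum import Enum, auto


class CustomerOrientation(Enum):
    COMMUNAL = auto()   # expects a casual, person-like interaction
    EXCHANGE = auto()   # just wants the task completed


def infer_orientation(message: str) -> CustomerOrientation:
    """Crude placeholder heuristic: greetings or small talk suggest a communal style."""
    small_talk = ("hi", "hello", "how are you", "thanks so much")
    text = message.lower()
    if any(phrase in text for phrase in small_talk):
        return CustomerOrientation.COMMUNAL
    return CustomerOrientation.EXCHANGE


def reply(message: str, resolution: str) -> str:
    """Switch emotional tone on or off based on the inferred customer type."""
    if infer_orientation(message) is CustomerOrientation.COMMUNAL:
        return f"I'm delighted to help with this today! {resolution}"
    return f"I am handling your request. {resolution}"


if __name__ == "__main__":
    print(reply("Hi there! How are you?", "Your refund has been issued."))
    print(reply("Refund order #1182.", "Your refund has been issued."))
```

In a real system the orientation signal would presumably come from richer context than keywords, but the design choice is the same one the researchers describe: decide first whether this customer wants warmth, and only then decide whether the bot should express it.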

“Companies need to understand the expectations of customers who interact with chatbots before haphazardly equipping their AIs with emotion-expressing capabilities,” Dr. Zhang says. “If customers expect their relationships with service agents to be transactional but get emotional chatbot exchanges instead, the jarring effect may reverse the benefits of even successful outcomes.”

Ms. Mitchell is a writer in Chicago. She can be reached at reports@wsj.com.
