
Chatbot...or anger machine?

lobste.rs - Mon Apr 12 14:02

Customers are at the heart of every business.

There is nothing more frustrating for the average customer than having to navigate some company’s automated support and messaging system. When every process “has to be” funneled through an AI-powered chatbot before reaching a customer service representative, all while the customer distinctly feels herded and managed, frustration only grows. Remember, customers are forced to communicate this way. When phone numbers are intentionally obfuscated and companies increasingly rely on chatbot systems to interact, the customer loses their feeling of power and their ability to accomplish their goals.

They’re lost.

There’s no question about the efficacy of chatbots in handling the majority of transactional communications. There is clear evidence that using chatbots and automation benefits an organization, especially now that we’re increasingly decentralized. It lowers labor costs and increases the ability to control messaging into CRM funnels, standardizing the experience and providing services at scale. Companies will save 2.5 billion customer service hours using chatbots by the end of 2023.[i]

That’s a lot of savings that add to the bottom line.

But are we sacrificing future sales for cost savings?

The cost savings aren’t the only consideration. Almost half (46%) of customers report that they would prefer to communicate with a live person instead of a chatbot.[ii] And, since 90% of banks will automate their customer interactions using chatbots by 2022[iii], are we putting the cart before the horse? Will we end up alienating our customers, creating animosity with each interaction, and opening a window for an upstart competitor that wins on customer service and completely eviscerates our market share?

Probably.

The competitor who is going to win the future will be one who understands that the purchase process, and subsequent customer service is–at its core–an emotional experience. One who can understand and engineer empathetic responses to assuage customer fears and triage them to help them solve problems with their experience. A competitor whose communication operation puts empathy at the heart of their interactions.

They know that without it, the cost savings in non-emotionally intelligent bots are a toxic potion that will inevitably lead to loss after loss…as customers hemorrhage and sales tumble.

If chatbots and auto attendants are so intoxicating to management because of the cost savings, it’s essential that you engineer emotional intelligence into your system. Indeed, chatbots may actually improve mental health outcomes.[iv] But only, and I mean only, if they are up for the challenge of recognizing emotions and responding appropriately.

Let’s look at a recent chatbot experience.

Ironically, I had a customer service need to contact a company that I recently chose for services. Let’s say it’s a large telecommunications and entertainment company that has a state-of-the-art customer service system. Call centers across the world, online tools and automated attendants, chatbots, AI. The whole ball of wax. Certainly costing them millions a year.

I had a new service installed. It went well, but after the technician left, the interface box continued to conk out. I went through all the self-service apps that were offered, and had to resort to trying to contact customer service. That’s when the fun started.

The chatbot asked if it could help me. I replied with a description of my problem, and was met with (and I’m paraphrasing): “WOAH hold on there buddy, that’s a lot of words. I get confused and can’t understand that. I do better with short responses.”

You can imagine my chagrin. So, I dumbed it down for the bot.

I slowly and methodically responded that I had been doing this self-restart thing for three days now, and kept replying to tell them how frustrating it is to have this problem. It’s like talking to a teenager about technology: they aren’t listening because I’m old, duh, and they assume they know what I want because, reasons.

Sigh. There’s no getting around this, and I’m going to have to talk to a live agent.

I responded that I wanted to speak to a live attendant. They responded that the bot would rather handle the communications, thank you. Repeatedly.

Dismissive little bugger aren’t you? Seriously?!

The bot wouldn’t let me contact support to change an outdated phone number, and instead continued to take me down one rabbit hole after another, offering several “choices” from its drop-down menu. Each time, it failed to solve the problem and instead led me through an endless cycle of dead ends. One problem created another.

So, after all the anger and frustration over the effort it took to accomplish a simple task, the time it consumed, and the loss of agency, any goodwill I had toward this company when I chose them was gone.

Way gone.

Opportunity lost.

Customers are not familiar with your nomenclature and your taxonomies. They don’t know your labels.

When your bot can’t understand anger and sadness, you do not have customer service. If your system can’t understand when a customer is using humor to hide other emotions—you do not have customer service. You have a machine that creates angry customers. An anger machine on autopilot, undoing everything you’re trying to accomplish with your product design, marketing, and every step of your value chain.

Your chatbots are churning out brand-new angry customers every second of every day.

When you see those reports from marketing or customer service? Yeah. Those aren’t “users.” Those are angry customers who are dying to run to your competitor. As soon as they can. If you’ve locked them into a punitive contract they’ll seek to punish you.

Each contact just increases the level of animosity, resentment, and futility for your customers.

The future is empathetic.[v]

V.E.R.N. is a real-time solution to your emotional detection needs. If you develop a chatbot, it needs to have emotional intelligence. Sentiment analysis won’t cut it. No one knows what to do with “positive,” “negative,” “neutral,” or “mixed.” You CAN do something with 66% anger, 51% sadness, and 0% humor, which is what you’d get from V.E.R.N.: a sentence-by-sentence breakdown of the emotions present and a confidence level for each one. So you can tell not just that a customer is “negative,” but that they’re slightly angry at 51%. Or maybe they’re really ticked off at you and their anger confidence level shoots to 80% or more. That’s something you can address.

  • You don’t have to train the bot for months on customer data. V.E.R.N. works in about 15 minutes if you’re good enough.
  • You don’t have to rely on biased training data. V.E.R.N. wasn’t created by bots–for bots. It was created by communication, neuroscience and computer science researchers and engineers. It’s designed to detect latent emotional clues we call emotives. You’ll get a generalized analysis right out of the box.
  • It’s in real time. We can provide you analysis on each sentence you send us in milliseconds. Less if we remove the network and get closer to the metal. What could you do with instant emotion analysis?
  • It also learns as it goes. V.E.R.N. only gets better the more it’s used, and we can provide custom frames for your domain or user that increase the level of accuracy even more.
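To make this concrete, here is a minimal sketch of how a chatbot might act on per-sentence emotion scores like the ones described above. The response shape, field names, and threshold values are assumptions for illustration only, not V.E.R.N.’s actual API.

```python
# Illustrative sketch only: the analysis format and thresholds below
# are assumptions, not V.E.R.N.'s documented API.

def route_message(analysis, anger_threshold=80):
    """Decide how to handle a chat message given per-sentence emotion scores.

    `analysis` is a hypothetical list of dicts, one per sentence, e.g.
    {"sentence": "...", "anger": 66, "sadness": 51, "humor": 0}.
    """
    # Take the strongest anger signal across all sentences in the message.
    max_anger = max((s.get("anger", 0) for s in analysis), default=0)
    if max_anger >= anger_threshold:
        # Customer is very likely angry: stop the bot and escalate.
        return "escalate_to_live_agent"
    if max_anger >= 50:
        # Noticeable anger: have the bot acknowledge the frustration first.
        return "empathetic_reply"
    return "normal_bot_flow"

messages = [
    {"sentence": "The box keeps conking out.", "anger": 66, "sadness": 51, "humor": 0},
    {"sentence": "I've restarted it for three days!", "anger": 82, "sadness": 40, "humor": 0},
]
print(route_message(messages))  # escalate_to_live_agent
```

The point of the sketch is the routing decision: a flat “negative” label can’t tell you when to pull a human in, but a per-sentence anger confidence crossing a threshold can.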

So if your chatbot is relying on slow, bulky, or inaccurate software to detect emotions, you have a choice:

You can get V.E.R.N., or you can keep producing angry customers from your anger machines.

GET VERN NOW

[i] Companies will save 2.5 billion customer service hours using chatbots by the end of 2023. (Chatbots Life)

[ii] 46% of users would prefer to communicate with a live person instead of a chatbot. (Tidio)

[iii] Bank systems will automate up to 90% of customer interactions using chatbots by 2022. (Chatbots Magazine)

[iv] Emotional Reactions and Likelihood of Response to Questions Designed for a Mental Health Chatbot Among Adolescents: Experimental Study https://humanfactors.jmir.org/2021/1/e24343

[v] https://chatbotslife.com/how-emotionally-intelligent-is-your-chatbot-6fc926d652ea