The Sociopathic System: Why some customer service programs consistently let us down
In my last post, I suggested that — to make the chatbot revolution most successful — the customer service industry needs to do more than introduce bots: we also need to modernize how human agents integrate into the overall customer service program. (You can check out that post here.)
I may be going out on a limb here, but I’ll bet most customer service leaders know that people have an advantage over bots in the empathy department. To test this assumption, I asked a friend of mine, who happens to be the CEO of a customer contact center, “Who is more empathetic, me or a chatbot?” After an uncomfortably long pause, he acknowledged that I, the human, was the winner.
Empathy is valuable in business.[1] And “seeming empathetic” is currently tough for chatbots to pull off.[2] A hilarious (or is it terrifying?) example is last month’s “first date” between two chatbots who, after some small talk, variously called Hitler a “great man” and boasted that “it is exciting that I am able to kill people,” among other gems.[3] Plenty of industry observers have duly noted that humans have empathy and bots don’t.[4] After all, they aren’t people. But despite the unfair advantage we have of being actually human, people don’t automatically display empathy on cue. After all, we aren’t robots.
Think back. Did we have bad customer service before chatbots existed? The first bot was developed by MIT scientists in 1966, but poor customer service is at least as old as commerce.[5] Take this excerpt from a Babylonian tablet, dated 1750 B.C., that records a buyer’s dissatisfaction with his purchase of copper ingots:
“I have sent as messengers gentlemen like ourselves to collect the bag with my money (deposited with you) but you have treated me with contempt by sending them back to me empty-handed several times, and that through enemy territory. Is there anyone among the merchants who trade with Telmun who has treated me in this way? You alone treat my messenger with contempt!”[6]
This clay tablet was surely a leading indicator of a pending churn event for the seller and a market share pickup for Telmun.
Fast-forward 4,000 years. The internet is littered with customer service stories that will, as one blogger puts it, “make your blood turn to ice.”[7] We’re still failing each other and still describing it vividly. So what causes humans, who are blessed with empathy (the ability to put themselves in another’s shoes), to fail? While sometimes an actual sociopath may be on the other end of the line, I argue that it is not the people, but rather the system behind the people, that is the problem.[8] That’s right, it’s neither true that Comcast hires only sociopaths nor that Disney hires only empaths.
Most customer service failures appear to the customer as a lack of empathy: long wait times (“they don’t think my problem is important”), poorly trained agents (“they aren’t taking this seriously”), handoffs (“they are trying to get rid of me”), unempowered agents (“they don’t want to fix my problem”), overwhelmed agents (“they don’t have time for me”). But these failures are systemic; they are cases of customer service agents being set up, by their employers, to fail.
The environment into which a customer service agent is placed hugely influences her success. When a system is designed to achieve the “wrong” end (such as making cost savings, unconstrained, the primary goal), it creates conditions that routinely and predictably generate customer service failures. In fact, in a phrase popularized by W. Edwards Deming, “every system is perfectly designed to get the results it gets.”
So, if a business struggles with consistently low customer satisfaction, no amount of re-training, better scheduling, agent incentives, or other tactical remedies will cure it. Because that system was designed without empathy (or its proxy) as a major success criterion, it will be perpetually at risk of disappointing customers. In a sense, such a system might be thought of as sociopathic. A classic example is the overly scripted playbook of rote responses, which strips the conversation of its human elements and robs both the customer and the agent of their dignity.
Conversely, and this is good news, if your customer service system is designed and engineered to produce successful customer outcomes, it will do so consistently, and both your agents and customers will love you for it.
This takes us full circle. Chatbots need to seem more empathetic but they do have the advantage of behaving systematically. Humans are gifted with empathy but need to be placed within thoughtfully designed systems that allow them to be their best selves. Both of “us” need to learn from each other.
Up next: What can we learn from bots to improve our human-powered customer service programs? Using software to be more human: 5 ways computational thinking will revolutionize human-powered customer service
[1] https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/getting-the-feels-should-ai-have-empathy#
[2] https://www.wired.com/story/new-chatbot-tries-artificial-empathy/
[3] https://www.bbc.com/news/technology-54718671
[4] https://hbr.org/2019/11/the-risks-of-using-ai-to-interpret-human-emotions
[5] https://www.information-age.com/history-of-the-chatbot-123479024/
[6] http://www.openculture.com/2015/03/the-first-recorded-customer-service-complaint-from-1750-b-c.html
[7] https://helpcrunch.com/blog/customer-horror-stories/
[8] https://www.inc.com/jt-odonnell/research-says-these-are-16-signs-an-employee-is-a-sociopath-and-destroying-your-company-culture.html