Anyone who’s seen David Fincher’s classic thriller Se7en will recall Kevin Spacey’s chilling portrayal of serial killer John Doe toying with Morgan Freeman’s world-weary detective and his doomed partner, played by Brad Pitt. But the real stars of the film are the seven deadly sins, which drive the macabre plot. While the film presents a gruesome twist on the capital vices, the lesson is that there are common passions that inevitably lead to catastrophe. These sins manifest in countless human pursuits — stories, art, life … and software deployments.
Consider the virtues of chatbots, the current software star across the enterprise. They’re easy to create, simple to use, and can work with just about any messaging system. And people are already comfortable with the conventions of chat, which makes chatbots supremely attractive for business uses.
But for every virtue, there’s a vice. Don’t leave your customers wailing “What’s in the box?” — avoid these seven deadly chatbot sins.
1. Lust: The sin of intense longing
There’s nothing wrong with novelty or creative exploration, and it’s only natural that IT leaders might yearn for the new hotness of chatbots. Plus, all the dazzling data customer conversations produce represents a goldmine for marketers. But want does not equal need or value.
Simply tacking on every chatbot API that arrives on the scene isn’t the basis for a healthy implementation strategy. And even picking only “the best” doesn’t help much if your platform isn’t designed with enough attention to reporting and routing.
Resist the temptation to chase every chatbot that comes along by focusing on the realities of integration. Enterprises will use multiple virtual agents, so variety is fine. But seamless transition between silicon (virtual) and carbon (human) along with skills-based routing should remain the focus to ensure effective chatbot behavior.
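The routing priorities above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual API: the `Agent` class, skill names, and the bot-first preference are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    skills: set       # what this agent is trained or authorized to handle
    is_human: bool

def route(required_skill: str, agents: list):
    """Skills-based routing: try virtual agents first, then hand off to carbon."""
    candidates = [a for a in agents if required_skill in a.skills]
    bots = [a for a in candidates if not a.is_human]
    humans = [a for a in candidates if a.is_human]
    # Seamless silicon-to-carbon transition: a bot takes the conversation
    # only when it has the skill; otherwise a human picks it up.
    if bots:
        return bots[0]
    return humans[0] if humans else None

agents = [
    Agent("faq-bot", {"order-status", "billing-faq"}, is_human=False),
    Agent("Dana", {"billing-faq", "refunds"}, is_human=True),
]
```

With this setup, `route("order-status", agents)` resolves to the bot, while `route("refunds", agents)` escalates straight to Dana — the variety of agents is fine so long as the routing layer knows who can do what.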
2. Gluttony: The sin of indulgent waste
Once an enterprise tastes success with a chatbot initiative, it’s easy to start seeing more chatbots as the solution to other challenges. But as appealing as they may seem, an endless series of chatbot deployments will choke an IT department.
Chatbots are a young technology, and you should weigh their benefits against the cost of connecting everything together under the broader umbrella of your platform while safeguarding human priorities. Unrestrained gorging on any new technology, no matter how tantalizing, is never wise.
3. Greed: The sin of rapacious possessiveness
“Greedy” chatbots often result from enthusiasm — a team gets excited and starts deploying a piece of technology in isolation from the rest of the enterprise’s systems. Isolated technology creates a silo. And silos break organizations.
By being tight-fisted — walling themselves away from the other applications or individuals that will be impacted by their interactions or could benefit from their collected information — greedy chatbot deployments cause more harm than good. They don’t share properly, and everyone suffers for it.
Well-tested and integrated access to information — regardless of whether it comes from websites, phone systems, human contact, or AI — is essential to healthy enterprise IT ecosystems. Chatbots are not exempt.
4. Sloth: The sin of habitual apathy
Chatbots may be handy, but typing can be tedious. Trying to describe something in words that could be better explained with an image, video, or screenshot can frustrate customers. Rich media, now supported by most messaging systems, often communicates more efficiently and in a more compelling way.
Embedding app-like functionality inside a chat experience can dramatically enhance customer experience. Imagine lengthy question-and-answer texting sessions replaced with a troubleshooting microapp that a user merely talks to or taps to summon the desired microforms or payment portals.
The key is knowing what the chatbot can support and having hundreds of microapps and microforms ready for each occasion, selectively sharing them only when relevant to the specific conversation. This is the type of dynamism required of an industrious chatbot deployment.
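That selective dynamism can be sketched as a simple relevance check: offer a microapp only when the conversation actually calls for it. The registry, keyword lists, and microapp names here are hypothetical placeholders for a real catalog.

```python
# Hypothetical registry mapping microapps to the keywords that make them relevant.
MICROAPPS = {
    "troubleshooter": {"router", "wifi", "connection", "offline"},
    "payment-portal": {"invoice", "pay", "card", "billing"},
    "return-form": {"return", "refund", "exchange"},
}

def pick_microapp(message: str):
    """Surface a microapp only when it matches the current conversation."""
    words = set(message.lower().split())
    for app, keywords in MICROAPPS.items():
        if words & keywords:
            return app
    return None  # stay in plain chat; don't push rich media for its own sake

print(pick_microapp("my router keeps going offline"))  # troubleshooter
print(pick_microapp("hello there"))                    # None
```

A production deployment would use intent classification rather than raw keywords, but the principle is the same: hundreds of microapps on standby, shared only when relevant.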
5. Wrath: The sin of uncontrolled anger
Most customer service veterans can recount horror stories about callers seething with rage at issues associated with technology failure.
Simply understanding what people are writing or saying is half the challenge. Slang, sarcasm, and jargon can flummox both systems and personnel. However, speech and text analytics technology can provide relief.
For instance, some chatbots can now provide translation services on the fly. A customer can ask a question in Spanish and an agent can answer in English with the chatbot translating in real time.
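The translation relay is easy to picture as a bot sitting between the two parties. In this sketch, `translate` is a stub standing in for a real machine-translation service; the phrasebook exists only so the example runs on its own.

```python
# Stub for a real machine-translation API call; the phrasebook is illustrative.
PHRASEBOOK = {
    ("es", "en"): {"¿Dónde está mi pedido?": "Where is my order?"},
    ("en", "es"): {"It ships tomorrow.": "Se envía mañana."},
}

def translate(text: str, source: str, target: str) -> str:
    return PHRASEBOOK.get((source, target), {}).get(text, text)

def relay(message: str, sender_lang: str, receiver_lang: str) -> str:
    """The chatbot translates each message in flight between customer and agent."""
    return translate(message, sender_lang, receiver_lang)
```

The customer writes in Spanish, the agent reads and replies in English, and neither party has to leave the conversation.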
The partnership between people and bots can work wonders. But managed poorly, it can easily lead to fury.
6. Envy: The sin of resentful covetousness
It’s natural to covet technology available in one place and not another. But it’s dangerous to pursue quick-and-dirty replication simply to soothe the green-eyed monster. Enterprises need to be selective in deployment lest things go off the rails.
You simply cannot train a bot for everything, and they should never operate with unsupervised authority. Instead of allowing chatbots to perform functions beyond their capabilities, it’s better to aim them solely at what they do best. Chatbots are great, for example, in assistive circumstances. Instead of serving as the primary customer service interface, an enterprise chatbot could silently monitor a customer service conversation and offer suggestions in real time to human agents on their computer screens. It can proactively and preemptively identify and surface helpful information related to the topics under discussion, aiding both agent and customer simultaneously.
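An assistive bot of this kind is essentially a silent listener that maps conversation topics to agent-facing tips. The knowledge base and policy text below are invented for illustration; the point is that suggestions go to the agent's screen, never directly to the customer.

```python
# Hypothetical knowledge base mapping topics to tips shown only to the agent.
KNOWLEDGE_BASE = {
    "password": "Offer the self-service reset link before doing a manual reset.",
    "outage": "Check the live status page; an incident may already be open.",
    "refund": "Refunds over $100 need supervisor approval.",
}

def suggest(transcript_line: str) -> list:
    """Silently monitor the conversation and surface relevant tips to the agent."""
    line = transcript_line.lower()
    # Returned tips are rendered on the human agent's screen, not sent to chat.
    return [tip for topic, tip in KNOWLEDGE_BASE.items() if topic in line]
```

Here `suggest("Customer: I forgot my password")` would surface the reset-link tip, while small talk surfaces nothing — the bot assists without ever taking the wheel.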
7. Pride: The progenitor of all other sins
Hubris is the most egregious of the seven deadly sins. It has maimed or killed more major tech rollouts in the enterprise than just about anything else combined. With chatbots, there are real dangers for CTOs and CIOs who lack the humility to grasp the risks.
Consider the implications if hackers breach a bot platform shared extensively across industries. The potential for exposure gets markedly amplified. And even as we embrace automation, it’s presumptuous to let bots handle task fulfillment on their own. Imagine errant chatbots providing incorrect customer service instructions or broadcasting sensitive information across unsecured public channels.
This is where analytics and institutional safeguards prove their worth. As do standards and regulations such as HIPAA and PCI-DSS. Human supervisors are responsible for the actions of the technology they employ. We are now in an era where it’s conceivable that a CEO will be sacked not for personal transgressions or acts by wayward human employees, but due to unsupervised enterprise bot crimes and misdemeanors. There is an element of reliance and control that people are handing over to bots, which must be managed humbly and cautiously.
Chris Connolly is the global director for digital engagement at Genesys, a customer experience platform.