We spoke previously about a notion in AI: that failing to train is training to fail. The quality and nature of the data used to train AI algorithms profoundly impact the performance and end-customer experience that AI-based applications deliver.
Bots have long been the poster child of AI, invariably fascinating one and all with their ability to interact using natural language and provide answers in a human-like manner, be it in the embodiment of home digital assistants such as Alexa or Google Home, chatbots hooked to proprietary chat platforms, or the more democratic Facebook or Microsoft bot frameworks.
Now that Virtual Agents are on every enterprise’s agenda, a few vexing questions arise:
- What is bot training?
- What skillsets do I need in order to train a bot?
- Can my human agents and customer service representatives (CSRs), as part of their day-to-day customer conversations, train their virtual counterparts?
Machines use systemized components to perform what humans do instinctively: ask questions and resolve issues through conversation. Every AI bot has the following mandatory components:
- Natural Language Processing and Understanding (NLP/NLU), which transcribes spoken or written phrases into their machine representation equivalents
- Intents, as an outcome of NLU, tell the bot what the customer is asking about
- Context, the often unspoken facet that actually enables triaging the issue and personalizing answers
- Content, which comprises the phrases, articles, or media returned by the bot
Natural Language Processing and Understanding leverage machine learning to transcribe natural language into a “semantic vector” that machines can process. NLP/NLU engines are typically trained by a specialized team of linguists and data scientists who, with their depth of expertise in language, develop models that lead to precise detection of intents.
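To make the idea concrete, here is a minimal sketch of intent detection via vector similarity. It uses a toy bag-of-words vectorizer and a handful of made-up training utterances in place of a production NLU model; the intents, phrases, and threshold are all illustrative assumptions.

```python
from collections import Counter
import math

# Toy training utterances per intent; a real NLU engine would be trained
# on far larger, linguist-curated datasets.
TRAINING = {
    "billing": ["why is my bill so high", "explain charges on my bill"],
    "outage": ["my internet is down", "no connection at home"],
}

def vectorize(text):
    """A crude 'semantic vector': word counts (production systems use
    learned embeddings and remove stop words)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def detect_intent(utterance, threshold=0.3):
    vec = vectorize(utterance)
    best_intent, best_score = None, 0.0
    for intent, examples in TRAINING.items():
        for example in examples:
            score = cosine(vec, vectorize(example))
            if score > best_score:
                best_intent, best_score = intent, score
    # Below the threshold, report a "gap" (None) rather than guess.
    return best_intent if best_score >= threshold else None

print(detect_intent("my internet is down again"))  # → outage
```

The threshold is the key design choice: set too low, the bot answers confidently on mismatched intents; set too high, it reports gaps for questions it could have handled.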
Language constantly evolves, and so do the products and services offered by an enterprise. Both can result in mismatched intents or “gaps”, i.e., scenarios where the intent cannot be inferred. What’s more, even a simple bot can receive tens of thousands of questions each week.
Combing through such volumes and unlocking hidden patterns in the data requires specialized personnel with expertise in data science. NLP/NLU engines need ongoing upkeep that includes vocabulary expansion for newly launched offerings, semantic expansion for new diction and slang, and, in some cases, changes to the underlying machine learning algorithms so they better represent and detect the intents behind the questions customers ask.
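One small piece of that upkeep can be sketched as gap triage: ranking the most frequent unmatched utterances from a week’s log so the team knows where to expand vocabulary or retrain first. The log contents below are hypothetical; a real deployment would pull them from the bot’s analytics store.

```python
from collections import Counter

# Hypothetical weekly log of utterances where no intent could be inferred.
gap_log = [
    "cancel my fibermax plan",
    "cancel my fibermax plan",
    "what is esim activation",
    "cancel my fibermax plan",
    "what is esim activation",
    "bot is dumb",
]

# Rank gaps by frequency so high-impact phrases get trained first.
for phrase, count in Counter(gap_log).most_common(2):
    print(f"{count}x  {phrase}")
```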
Not all NLP/NLU engines are alike. Some perform better than others, owing to the choice of machine learning algorithms in play, the data used to train the engine, and the training upkeep (or lack thereof).
Context, on the other hand, is derived by gathering customer data such as profile information, preferences, and device telemetry. Context engines infer the data relevant to the issue at hand, triage the issue to a resolution path, and then personalize the answer to the customer. They may take the form of API calls or mobile and web SDKs, and are typically maintained by technical teams well versed in web and mobile platforms and integration technologies.
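A minimal sketch of that triage-then-personalize step, assuming a hypothetical customer profile and telemetry dictionary (none of these field names come from a real SDK):

```python
def triage_and_personalize(intent, context):
    """Pick a resolution path from context, then personalize the reply."""
    name = context.get("name", "there")
    if intent == "outage":
        # Telemetry narrows the resolution path: a known area-wide outage
        # versus a device-specific fix.
        if context.get("area_outage"):
            return (f"Hi {name}, we're aware of an outage in your area and "
                    "expect service back shortly.")
        if context.get("modem_uptime_days", 0) > 30:
            return (f"Hi {name}, your modem has been up for a while; "
                    "a restart often resolves this.")
    return f"Hi {name}, let me connect you with an agent."

reply = triage_and_personalize("outage", {"name": "Ana", "area_outage": True})
```

The same intent yields different answers for different customers, which is exactly what the context engine contributes beyond NLU.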
As for Content: once the Virtual Agent identifies the intent and the corresponding resolution path, it needs to return relevant content to the end customer using natural language. This spans media such as images and videos, web URLs that may point to support pages or deep-link into self-service apps, and the associated dialogues that mimic how humans would answer questions.
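Those content types can be modeled as a structured response payload. The shape below is an assumption for illustration, not a standard bot-framework schema; the URLs and deep links are placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class BotResponse:
    dialogue: str                                     # natural-language answer
    media_urls: list = field(default_factory=list)    # images, videos
    links: list = field(default_factory=list)         # support pages, deep links

# Hypothetical response for a modem-restart resolution path.
response = BotResponse(
    dialogue="Here's how to restart your modem.",
    media_urls=["https://example.com/restart-video.mp4"],
    links=["app://selfservice/modem-restart"],
)
```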
Content provides a great avenue for collaboration: enterprises may leverage their own employees, such as CSRs and business analysts, to augment their knowledge repositories with articles and solutions and to enhance their business practices and customer journeys. Good content can make or break the bot’s adoption by end customers.
Behind every successful Virtual Agent is a diligent team that identifies emerging patterns in customer interactions and follows a “train ahead” practice, i.e., proactively training the bot on an ongoing basis in collaboration with the enterprise’s personnel for appropriate content. It takes expertise, experience, and collaboration to succeed, after all.