Week of May 15th
New Updates include: Conversation Builder and Conversational Cloud Infrastructure Enhancements
Features
Generative AI solutions - Goodbye concierge menus! Routing AI agents are here
Innovative advancements in Generative AI, exemplified by platforms like ChatGPT, have revolutionized consumer interactions by delivering remarkably human-like experiences, thereby raising the standard for AI engagement. In today's digital landscape, consumers demand more from their interactions: they crave intuitive, personalized, and naturally flowing conversations. Consequently, they’re often frustrated when they encounter the cumbersome and inflexible AI solutions deployed by many businesses.
That’s why in this release we're thrilled to introduce the Routing AI agent, a new type of Conversation Builder bot. This is a revolutionary, LLM-powered AI agent that can transform consumer engagement in your contact center.
The Routing AI agent represents a paradigm shift in routing. Traditional bots offer a rule-based experience that’s highly deterministic and often brittle. They fall short of the mark in important ways:
Traditional bot: Predictable. But struggles with complex language.
Traditional bot: Doesn’t handle unsupported intents gracefully
In contrast, our Routing AI agent harnesses the power of LLMs and Generative AI to dynamically discern consumer intents and adapt responses and routing in real time.
A logical fallback flow. Intent disambiguation. Multi-intent handling. It’s all there. And there’s no need for an intent model that takes months to set up.
Now you can offer more conversational routing to consumers:
Routing AI agent: Handles complex, natural language
Routing AI agent: Asks questions to clarify the consumer’s intent
Routing AI agent: Seamlessly remembers and routes to “the next” intent
Routing AI agent: Gracefully handles unsupported consumer intents
Key features
- Advanced intent disambiguation: The Routing AI agent can ask clarifying questions to determine the most relevant routing destination, enhancing accuracy in fulfilling specific intents.
- Customizable routing descriptions: In a Routing AI agent, you focus on tailoring the name and description of each routing option that you want to make available (and you can use Generative AI to assist you!). The Routing AI agent takes care of the rest: discerning the consumer’s intent and routing them accordingly. There’s no need to anticipate and accommodate every conceivable scenario, as you must when building an intent-based, rule-based system.
- Conversational context for warm start: The Routing AI agent retains and uses previous turns from the current conversation to interpret the most recent consumer message. Thus, it can engage the consumer with relevant and personalized dialogue.
- Multi-intent handling: Routing AI agents can handle multiple intents, moving across dialogs or bots while maintaining continuity with respect to the routing experience.
Guided Routing interaction: It’s simple to configure
Guided Routing interaction: Define routes to power the routing experience
Defining a route: Invest in good, clear route names and descriptions
Key benefits
- Accelerate deployment to just a few weeks by bypassing the lengthy process of building and training a vast intent model, which typically takes months.
- Increase operational efficiency and service delivery, and reduce costs: Routing AI agents effectively decipher and address multiple, even ambiguous, consumer intents, and they transfer important context. This reduces wasted time, handling times, and consumer frustration.
- Watch your intent match rate go up: Check out the performance of the Routing AI agent in Bot Analytics, where you can drill down into the data on specific types of Guided Routing events: “Route within same LivePerson bot,” “Route to different LivePerson bot,” “Route to default route,” and more.
- Enjoy higher FCR and CSAT scores.
Language support
American English and British English are fully supported, with additional languages on the horizon.
Experimental support for other languages is available. Please test thoroughly before rolling out to Production.
Get started today!
Upgrade to the Routing AI agent to revolutionize your contact center operations, enhance the experience for your consumers, and elevate performance like never before.
Feel free to experiment, build, and test. However, before you roll out a Routing AI agent for use with live traffic in Production, it’s vital that you contact your LivePerson representative! Please fill out this form to schedule a meeting with us.
We want to ensure that your deployment won’t be adversely impacted by the capacity constraints of the third-party LLM that’s used. Please share your plans and expected traffic with us, so we can work together to ensure a successful release.
Voice bots - More options for handling consumer silence
Want to close the automated conversation when the consumer doesn’t respond to the bot’s question within 2 minutes? Now you can. This third option is now available:
Voice bots - Optional message in No Match rule now supports SSML
In a question, a No Match rule triggers the Next Action when a match to an earlier rule in the interaction isn’t found. In general, the rule is used to catch all consumer utterances other than those caught and handled by other rules.
Within the No Match rule, you have the option to specify a message to send before triggering the Next Action. As of this release, you can apply SSML to the message:
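For instance, a No Match message might use SSML to add a brief pause and emphasis when reprompting the consumer. This is an illustrative snippet using standard SSML elements; the exact tags supported depend on your voice configuration:

```xml
<speak>
  Sorry, I didn't catch that.
  <break time="500ms"/>
  Could you please <emphasis level="moderate">repeat</emphasis> your answer?
</speak>
```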
Since the No Match rule now covers this scenario, we’ve removed the Message When Response Unrecognized setting from questions:
Features
Generative AI solutions - More transparency and control when creating prompts
A well-crafted prompt can unlock the full potential of your LLM-powered solution, so we are excited to announce several enhancements to Conversational Cloud’s Prompt Library, our prompt creation and management tool.
We’ve redesigned the UI and surfaced more customization and configuration options, giving you greater levels of flexibility and personalization. Here’s the rundown:
- Bring your own LLM: If you have an in-house LLM that you’d like to use in some Conversational Cloud use cases but not others, now you can. Contact your LivePerson representative with this request. We’ll set it up so that, within the prompt, you can select the model to use: LivePerson’s or your own in-house LLM that you’ve onboarded.
- Use variables to contextualize, customize, personalize, and streamline: Being able to use variables within prompts is a game changer. So in this release we’ve made this aspect much more flexible and powerful. Client type-specific and custom variables are easier to work with as you build a prompt. And, importantly, you can now define default values for custom variables; the default values are used at runtime whenever the variable’s value can’t be resolved. Thus, your prompts remain effective even when issues are encountered.
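To illustrate, here’s a hypothetical prompt fragment. The curly-brace variable syntax follows the {knowledge_articles_matched} variable shown elsewhere in these release notes, but the variable names here ({brand_name}, {customer_tier}) and their default values are illustrative assumptions, not actual product settings:

```text
You are a support assistant for {brand_name}.
The consumer is on the {customer_tier} plan; tailor your answer accordingly.
If any detail is unavailable, answer in general terms rather than guessing.
```

At runtime, if a custom variable such as {customer_tier} can’t be resolved for the conversation, the default value you defined for it in the Prompt Library is substituted, so the prompt still reads naturally.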
- Control the conversation context: Do you find yourself questioning just how much conversation context is sent to the LLM by the KnowledgeAI agent? We get it. Now, not only can you readily see this info, you can also configure it on a prompt-by-prompt basis. Hooray! This option is only available for KnowledgeAI agents though; the behavior is automated in other use cases.
- Engineer the user content: There’s a new tab for specifying instructions to append to the consumer’s most recent message, for those times when you need to reinforce certain guidance.
- View earlier versions: Want to refer back—or even revert back—to an older version of a prompt? This too is possible. Use the prompt’s editing history.
Basic tab in a prompt: Specify name, client type, and more
Variables tab in a prompt: Insert variables to dynamically include personalized data
Advanced tab in a prompt: For KnowledgeAI prompts, choose how much conversation context to send to the LLM
Important note for brands automating enriched answers in a bot, or offering enriched answers to agents via Conversation Assist:
You asked, we listened. In this release, we’ve exposed the {knowledge_articles_matched} variable, so you can include the matched articles wherever you require in the prompt. Research indicates that the positioning of the articles can strongly influence the generated response.
We’ve also migrated your existing prompts accordingly: The {knowledge_articles_matched} variable is now located at the end of all of your prompts. You can keep the variable there, or move it as desired.
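For example, instead of leaving the variable at the end of the prompt, you could position the matched articles just before your answering instructions. This is a hypothetical sketch; only {knowledge_articles_matched} is an actual product variable, and the surrounding instructions are illustrative:

```text
You are a helpful agent for our brand. Use only the articles below to answer.

{knowledge_articles_matched}

Based strictly on the articles above, answer the consumer's question concisely.
If the articles don't contain the answer, say you don't know.
```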