My thoughts from the Call & Contact Center Expo in Las Vegas
Author: Simon Black | Date: 04/05/2023
I’ve just returned from Las Vegas after a few mind-bogglingly busy days with the team at the Call & Contact Center Expo.
Productive days, though, and it was great to see everyone. There was certainly a lot to take in.
The buzz around AI was ever-present, and I had some excellent debates with people about the future of AI in the contact center.
I also gave a talk on the Future of Agent Guidance that was well attended. Clearly our North American friends are facing a lot of the same challenges that we are here in the UK and Europe.
I’ve summarized a few of my thoughts from the event below. Let me know what you think by emailing firstname.lastname@example.org or connecting with me on LinkedIn.
AI dominating conversation
As you’d expect, the hype around Artificial Intelligence (AI) and the contact center has only grown with the emergence of Large Language Models (LLMs), such as ChatGPT and Bard.
And rightly so. We’re in a transitional period that will change how we work forever.
It was great to see so many exciting use cases already identified for AI. It really will help contact centers do things faster and with more accuracy, while ensuring the experience of both the customer and agent is protected.
Large Language Models will help contact centers – but tread with caution
Here at Awaken, we’ve been busy working with the API endpoints of a few different LLMs, and I’m pleased to share that we can plug in our technology and use these models to accelerate learning in the contact center.
Here are a few of the scenarios we’ve identified that are immediately supported using LLMs:
- Call summarization – we already create a transcript of calls, but LLMs will help us to further distill all that information into something digestible for the operations and quality teams.
- Quality Assurance – Accelerating completions of scorecards and quality checks
- Call classification – Ensuring 99%+ accuracy when identifying the nature of calls
- FAQ on knowledge base articles – The ability to upload articles, allow agents to ask questions, with the LLM providing a summarized answer
And we’ve only just scratched the surface.
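To make the first of those scenarios concrete, here’s a minimal sketch of how a call transcript could be packaged into a summarization request for a chat-style LLM endpoint. The payload shape follows OpenAI’s public chat completions format, but the model name and prompt wording are illustrative assumptions, not a description of our production integration.

```python
import json

def build_summary_request(transcript: str, model: str = "gpt-3.5-turbo") -> dict:
    """Build a chat-completions style payload asking an LLM to summarize a call.

    The system prompt and default model are illustrative; swap in whatever
    your provider, budget, and compliance rules allow.
    """
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": (
                    "You summarize contact-center calls for operations and "
                    "quality teams. Return three bullet points: reason for "
                    "call, resolution, and any follow-up actions."
                ),
            },
            {"role": "user", "content": transcript},
        ],
        "temperature": 0.2,  # low temperature keeps summaries consistent
    }

transcript = "Agent: Hello, how can I help? Customer: I'd like to update my address..."
payload = build_summary_request(transcript)
print(json.dumps(payload, indent=2))
```

Keeping the payload construction separate from the HTTP call like this also makes it easy to log and review exactly what data leaves your environment, which matters for the caution I raise below.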
I say ‘tread with caution’ though, as we’ve all seen in the news how business IP and people’s PII have accidentally been fed into these LLMs.
Businesses need to be clear about what data they are happy feeding in under what circumstances.
Lots of interest in Agent Assist
Later this year we’ll be releasing our AI-driven Agent Assist tool, which will be an exciting development for our customers and the market.
It’s an upgrade on our current Real-Time Agent Guidance, adding real-time analytics and intent-based assistance.
Crucially, it will be plugged into back-end systems, such as the CRM, and so, based on the intent of either the customer or agent, will be able to retrieve information to assist with what is being said. Live in the moment. It’s really exciting.
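A simple way to picture intent-based assistance is as a router from detected intents to back-end lookups. The intent names and CRM functions below are invented purely for illustration; they are not Awaken’s actual Agent Assist API.

```python
# Hypothetical sketch: routing a detected intent to a back-end lookup.

def lookup_order_status(customer_id: str) -> str:
    """Stub standing in for a real CRM/order-system call."""
    return f"Order status for {customer_id}: shipped"

def lookup_balance(customer_id: str) -> str:
    """Stub standing in for a real billing-system call."""
    return f"Balance for {customer_id}: 42.50"

# Map each intent your speech analytics can detect to a data source.
INTENT_HANDLERS = {
    "order_status": lookup_order_status,
    "account_balance": lookup_balance,
}

def assist(intent: str, customer_id: str) -> str:
    """Given an intent detected from live speech, fetch supporting data."""
    handler = INTENT_HANDLERS.get(intent)
    if handler is None:
        return "No assistance available for this intent."
    return handler(customer_id)

print(assist("order_status", "CUST-001"))
```

The value is that the lookup happens while the conversation is still going, so the agent sees the answer at the moment the customer asks.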
At the show Agent Assist really captured the imagination of the people we spoke to as they could see the potential.
Build an agent guidance model based on your use cases
My talk at the show looked at the Future of Agent Guidance, and how you’ll need to combine process-driven and AI-driven guidance in order to build a comprehensive real-time agent guidance solution.
This is because not every customer use case requires the same level of support and assistance.
For example, when completing insurance policy or mortgage documents with clients, an agent must jump through a number of procedural and compliance hoops. This is the perfect scenario for process-driven guidance: do this, now do this, then this, and so on.
If the conversation is less structured, then AI-driven guidance might be more important, as it can retrieve helpful information, unlock discounting or prompt cross-selling, right there and then.
Crucially, you must have the flexibility to build a bespoke guidance solution for each of your customer use cases.
It can’t be one-size-fits-all.
Looking forward to the expo next year
The event goes by in a flash. So much to take in and now to digest.
We’ll be back, and given what I’ve talked about above, we’ll have a lot to share with you all.
All the best,