
Webex Contact Center Incorporates Generative AI Conversation Summaries

Low-Latency Summarization Capabilities to Improve Efficiency and Collaboration

The News:

Cisco announced at its Cisco Live customer event on June 7 the incorporation of generative AI summarization technology into Webex Contact Center, which is designed to provide workers with better efficiency and collaboration enhancements. The announcement was made in parallel with the preview of generative AI capabilities to simplify policy management and threat response within Cisco Security Cloud.

Cisco announced several summarization use cases in which generative AI can be deployed to help drive efficiency and collaboration between customers and employees within Webex.

  • Catch Me Up will allow users to quickly catch up on missed interactions, including meetings, calling, chats and more.
  • Intelligent meeting summaries will allow users to opt in to automatically generate the most important elements of a Webex meeting, extract the key points, and capture action items with owners.
  • Summaries in Vidcast, the company’s video messaging tool, will produce highlights and chapters so viewers can quickly navigate to the most important parts of a video.

Meanwhile, generative AI-driven conversation summaries within Webex Contact Center will provide agents with a fast, automated way to consume long-form text from digital chats with customers, as well as facilitate post-call wrap-up and resolution tasks. These low-latency chat summaries are designed to give the agent a clear summary of the issues and resolutions already explored via self-service and previous interactions, and to provide both the agent and the customer with a summary of the call once it ends.

The Catch Me Up summaries, intelligent meeting summaries, Vidcast summaries, and conversation summaries in Webex Contact Center are not yet generally available, but are expected to be released by the end of 2023.

You can read the full press release here.


Analyst Take:

Cisco announced on June 7 at its Cisco Live customer event several use cases for its generative AI-powered summarization function, which are designed to improve agent efficiency and collaboration with others. The key features include Catch Me Up, which uses summaries to let users quickly catch up on missed interactions, including meetings, calls, and chats; intelligent meeting summaries, which automatically generate the most important elements of a Webex meeting, extract the key points, and capture action items with owners; and summaries in Vidcast, the company’s video messaging tool, which will produce highlights and chapters so viewers can quickly navigate to the most important parts of a video.

Large Language Model Selection

According to Cisco, Webex is using a range of state-of-the-art open-source and proprietary large language models (LLMs) in currently released and upcoming Webex products. The company explained in an email interview that it conducts prompt engineering for each use case and each model it integrates and deploys.
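To make the per-use-case prompt engineering concrete, here is a minimal, purely illustrative sketch of how a vendor might maintain separate prompt templates per summarization use case. The template wording, use-case names, and function are hypothetical, not Cisco's actual prompts or code:

```python
# Hypothetical per-use-case prompt templates; in practice each template
# would be tuned separately for the model it is paired with.
PROMPT_TEMPLATES = {
    "catch_me_up": (
        "Summarize the interactions the user missed, grouped by "
        "meeting, call, and chat:\n{transcript}"
    ),
    "meeting_summary": (
        "Extract the key points and action items (with owners) from "
        "this meeting transcript:\n{transcript}"
    ),
    "contact_center": (
        "Summarize the issues raised and the resolutions already "
        "attempted in this customer chat:\n{transcript}"
    ),
}


def build_prompt(use_case: str, transcript: str) -> str:
    """Select and fill the prompt template for a summarization use case."""
    if use_case not in PROMPT_TEMPLATES:
        raise ValueError(f"unknown use case: {use_case}")
    return PROMPT_TEMPLATES[use_case].format(transcript=transcript)
```

The point of the sketch is that each use case gets a distinct, separately tuned prompt, which is what makes it practical to pair cheaper models with simpler tasks and reserve more capable models for harder ones.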

A key question remains on the user expectations for generative AI functionality within CX platforms. Using smaller generative AI models is not terribly expensive, but more complex models that are more accurate and powerful can quickly escalate the cost of deploying generative AI within a platform that is used to manage hundreds or thousands of interactions per day, every day. As such, vendors are going to be faced with the challenge of assessing which generative AI models are used for which use cases, and how to appropriately cover the costs.

For its part, Cisco is still in the process of working out pricing for its generative AI summarization; the company indicated that it is still developing the specific features and will release further details later in the year. This approach mirrors that of many other vendors that have recently announced generative AI features, including Salesforce, Twilio, and others.

Training and Grounding Generative AI Models with Customer Data

Cisco says that it does not rely on customer data or agent interactions to train general-purpose AI models. Essentially, this means that customer data is not used to train large shared generative AI models, such as those from OpenAI and others, as data used this way would then inform models available to any other entity that uses them.

However, Cisco says it is experimenting with fine-tuning models for each individual customer, using that customer’s own data, essentially creating mini models that are not fed back into the underlying LLM. This also ensures that no customer data is shared across models. The company also says it is working with a select group of customers who explicitly “opt in” to the use of their data for broader feature development.
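The isolation model described here can be sketched as a registry that routes each request to a tenant-specific adapter, so one customer's data never reaches another customer's model or the shared base model. The class and method names below are hypothetical, a minimal illustration of the pattern rather than Cisco's implementation:

```python
from dataclasses import dataclass, field


@dataclass
class TenantModel:
    """A per-customer 'mini model': a private adapter over a shared base LLM."""
    tenant_id: str
    training_examples: list = field(default_factory=list)

    def fine_tune(self, examples: list) -> None:
        # Only this tenant's data ever reaches this adapter; nothing is
        # fed back into the shared base model.
        self.training_examples.extend(examples)


class ModelRegistry:
    """Routes each request to the requesting tenant's own adapter."""

    def __init__(self) -> None:
        self._models: dict = {}

    def get(self, tenant_id: str) -> TenantModel:
        # Lazily create one isolated adapter per tenant.
        if tenant_id not in self._models:
            self._models[tenant_id] = TenantModel(tenant_id)
        return self._models[tenant_id]
```

The design choice worth noting is that isolation is structural, not policy-based: because each tenant's fine-tuning data lives only in that tenant's adapter object, cross-tenant leakage would require explicitly passing one tenant's data to another's adapter.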

The training of company-specific, internal models using opt-in customer data likely will help the models become more effective and accurate. My sense is that customers—particularly large enterprises with a range of use cases—will largely be willing to share this data with platforms such as Cisco, due to the desire to “push the envelope” in terms of what can be accomplished with generative AI, while understanding that there still need to be guardrails around the technology.

Low-Latency Summary Generation Permits Seamless Handoffs During Interactions

One of the other interesting features in Webex Contact Center is conversation summaries. Cisco says its approach to summarization is designed to improve efficiency and collaboration during engagements. The conversation summaries are created using generative AI models that quickly summarize interactions with “sufficiently low latency,” ensuring that neither the customer nor another agent needs to wait to see a summary of previous interactions. This facilitates smooth handoffs between self-service, previous human agents, and dropped and reconnected calls.
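One common way to achieve that kind of low handoff latency is to maintain a running summary incrementally as each message arrives, so a handoff only reads an already-computed summary instead of summarizing the full transcript on demand. The sketch below is an assumption about the general technique, not Cisco's architecture, and the summarization step is a trivial stand-in for an LLM call:

```python
class RollingSummary:
    """Maintains a handoff-ready summary that is updated per message."""

    def __init__(self, max_points: int = 5) -> None:
        self.max_points = max_points
        self.points: list = []

    def add_message(self, speaker: str, text: str) -> None:
        """Fold each new message into the summary as it arrives.

        A real system would call an LLM here; this stand-in just keeps a
        truncated note per message.
        """
        self.points.append(f"{speaker}: {text[:60]}")
        # Keep only the most recent points so the summary stays short.
        self.points = self.points[-self.max_points :]

    def handoff_summary(self) -> str:
        """Already computed, so a handoff returns instantly with no model call."""
        return "\n".join(self.points)
```

The trade-off is that summarization cost is paid steadily during the conversation rather than in one burst at handoff time, which is exactly what keeps the customer and the receiving agent from waiting.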

This low-latency approach is a requirement for contact centers. In the past, handoffs between agents were accompanied by a delay, as one agent relayed the details of the interaction to another. This created additional wait time for the customer, and additional friction in the agent-to-agent process, particularly if the issue was significantly complex. The ability to minimize the elapsed time between handoffs, while improving the accuracy and completeness of interaction notes, will be a key factor in ensuring a good contact center experience.

Author Information

Keith has over 25 years of experience in research, marketing, and consulting-based fields.

He has authored in-depth reports and market forecast studies covering artificial intelligence, biometrics, data analytics, robotics, high performance computing, and quantum computing, with a specific focus on the use of these technologies within large enterprise organizations and SMBs. He has also established strong working relationships with the international technology vendor community and is a frequent speaker at industry conferences and events.

In his career as a financial and technology journalist he has written for national and trade publications, including BusinessWeek, CNBC.com, Investment Dealers’ Digest, The Red Herring, The Communications of the ACM, and Mobile Computing & Communications, among others.

He is a member of the Association of Independent Information Professionals (AIIP).

Keith holds dual Bachelor of Arts degrees in Magazine Journalism and Sociology from Syracuse University.
