Webex Contact Center Incorporates Generative AI Conversation Summaries

Low-Latency Summarization Capabilities to Improve Efficiency and Collaboration

The News:

Cisco announced at its Cisco Live customer event on June 7 the incorporation of generative AI summarization technology into Webex Contact Center, designed to improve worker efficiency and collaboration. The announcement was made in parallel with the preview of generative AI capabilities to simplify policy management and threat response within Cisco Security Cloud.

Cisco announced several summarization use cases in which generative AI can be deployed to help drive efficiency and collaboration between customers and employees within Webex.

  • Catch Me Up will allow users to quickly catch up on missed interactions, including meetings, calls, chats, and more.
  • Intelligent meeting summaries will let users opt in to automatically generate the most important elements of a Webex meeting, extract the key points, and capture action items with owners.
  • Summaries in Vidcast, the company’s video messaging tool, will produce highlights and chapters so viewers can quickly navigate to the most important parts of a video.

Meanwhile, generative AI-driven conversation summaries within Webex Contact Center will give agents a fast, automated way to consume long-form text from digital chats with customers, as well as facilitate post-call wrap-up and resolution tasks. These low-latency chat summaries are designed to give the agent a clear view of the issues and resolutions already explored via self-service and previous interactions, and to deliver a summary of the call to both the agent and the customer once it ends.

The Catch Me Up summaries, Intelligent meeting summaries, summaries in Vidcast, and conversation summaries in Webex Contact Center are not yet available in general release, but will be available by the end of 2023.

You can read the full press release here.

Analyst Take:

Cisco announced on June 7 at its Cisco Live customer event several use cases for its generative AI-powered summarization function, each designed to improve agent efficiency and collaboration with others. The key features include Catch Me Up, which uses summaries to let users quickly catch up on missed interactions, including meetings, calls, and chats; intelligent meeting summaries, which automatically generate the most important elements of a Webex meeting, extract the key points, and capture action items with owners; and summaries in Vidcast, the company’s video messaging tool, which produce highlights and chapters so viewers can quickly navigate to the most important parts of a video.

Large Language Model Selection

According to Cisco, Webex is using a range of state-of-the-art open-source and proprietary large language models (LLMs) in currently released and upcoming Webex products. The company explained in an email interview that it conducts prompt engineering for each use case and each model it integrates and deploys.
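To illustrate why per-model, per-use-case prompt engineering matters, the sketch below maintains a separate prompt template for each (use case, model) pair. This is the author's illustration only, not Cisco's implementation; the use-case names, model names, and template wording are all hypothetical.

```python
# Hypothetical registry of engineered prompts, keyed by (use_case, model).
# Different LLMs often respond best to differently structured instructions,
# so the same use case may need a distinct template per model.
PROMPT_TEMPLATES = {
    ("meeting_summary", "model_a"): (
        "Summarize the meeting transcript below. List the key points and "
        "action items with owners.\n\nTranscript:\n{transcript}"
    ),
    ("chat_wrapup", "model_a"): (
        "Summarize this customer chat for the agent. Note the issue, the "
        "resolutions already attempted, and next steps.\n\nChat:\n{transcript}"
    ),
    ("chat_wrapup", "model_b"): (
        "### Task: agent wrap-up summary\n### Chat log:\n{transcript}\n"
        "### Summary (issue, attempts, next steps):"
    ),
}

def build_prompt(use_case: str, model: str, transcript: str) -> str:
    """Look up the engineered template for this use case/model pair
    and fill in the conversation text."""
    template = PROMPT_TEMPLATES[(use_case, model)]
    return template.format(transcript=transcript)

prompt = build_prompt("chat_wrapup", "model_a", "Customer: my login fails.")
print(prompt.startswith("Summarize this customer chat"))  # True
```

In practice, each template would be tuned and evaluated against the specific model it targets, which is the per-pair work Cisco describes.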

A key question remains on the user expectations for generative AI functionality within CX platforms. Using smaller generative AI models is not terribly expensive, but more complex models that are more accurate and powerful can quickly escalate the cost of deploying generative AI within a platform that is used to manage hundreds or thousands of interactions per day, every day. As such, vendors are going to be faced with the challenge of assessing which generative AI models are used for which use cases, and how to appropriately cover the costs.

For its part, Cisco is still in the process of working out pricing for its generative AI summarization; Cisco indicated that the company is still developing the specific features and will be releasing further details later in the year. This approach mirrors that of many other vendors that have recently announced generative AI features, including Salesforce, Twilio, and others.

Training and Grounding Generative AI Models with Customer Data

Cisco says that it does not use customer data or agent interactions to train general-purpose AI models. Essentially, this means that customer data is not used to train larger generative AI models, such as those from OpenAI and others, as this data would then be available for use by any other entity that uses these models.

However, Cisco says it is experimenting with fine-tuning models for each individual customer using that customer’s own data, essentially creating mini models that are not fed back into the underlying LLM. This approach also ensures that no customer data is shared across models. The company also says it is working with a select group of customers who explicitly “opt in” to the use of their data for broader feature development.
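The isolation model described above can be sketched as a per-tenant registry: each customer's fine-tuned artifact is derived from a shared base model but never merged back into it, and no tenant ever resolves to another tenant's model. This is the author's illustration of the general pattern, not Cisco's architecture; the class and method names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class TenantModelRegistry:
    """Toy sketch of per-customer fine-tuned 'mini models' that are
    kept separate from the shared base model."""
    base_model: str
    _tenant_models: dict = field(default_factory=dict)

    def fine_tune(self, tenant_id: str, examples: list) -> str:
        # In a real system this would launch a fine-tuning job; here we
        # just record which tenant's data produced which artifact.
        artifact = f"{self.base_model}-ft-{tenant_id}"
        self._tenant_models[tenant_id] = {
            "artifact": artifact,
            "n_examples": len(examples),
        }
        return artifact

    def model_for(self, tenant_id: str) -> str:
        # A tenant gets its own artifact if one exists, otherwise the
        # shared base model; tenant data never flows back into the base.
        entry = self._tenant_models.get(tenant_id)
        return entry["artifact"] if entry else self.base_model

registry = TenantModelRegistry(base_model="llm-base")
registry.fine_tune("acme", ["chat 1", "chat 2"])
print(registry.model_for("acme"))    # llm-base-ft-acme
print(registry.model_for("globex"))  # llm-base
```

The key design point is that the mapping is one-way: per-tenant artifacts are derived from the base, but the base is never updated from tenant data, which is what prevents cross-customer leakage.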

The training of the company-specific, internal models using opt-in customer data likely will help the models become more effective and accurate. My sense is that customers—particularly large enterprises with a range of use cases—will largely be willing to share this data with platforms such as Cisco, due to the desire to “push the envelope,” in terms of what can be accomplished with generative AI, while understanding that there still need to be guardrails around the technology.

Low-Latency Summary Generation Permits Seamless Handoffs During Interactions

One of the other interesting features in Webex Contact Center is conversation summaries. Cisco says its approach to summarization is designed to improve efficiency and collaboration during engagements. The conversation summaries are created using generative AI models that quickly summarize interactions with “sufficiently low latency,” ensuring that neither the customer nor another agent needs to wait to see a summary of the previous interactions. This facilitates smooth handoffs between self-service, previous human agents, and dropped and reconnected calls.

This low-latency approach is a requirement for contact centers. In the past, handoffs between agents were accompanied by a delay, as one agent relayed the details of the interaction to another. This created additional wait time for the customer, and additional friction in the agent-to-agent process, particularly if the issue was significantly complex. The ability to minimize the elapsed time between handoffs, while improving the accuracy and completeness of interaction notes, will be a key factor in ensuring a good contact center experience.
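Why low latency matters for handoffs can be made concrete with a toy simulation: if the summary is streamed sentence by sentence, the receiving agent starts reading almost immediately instead of waiting for the whole summary to finish generating. This is the author's illustration of the general streaming pattern, not Cisco's design; the timings are invented.

```python
import time
from typing import Iterator

def simulated_model(sentences: list, per_sentence_s: float) -> Iterator[str]:
    """Pretend each summary sentence takes per_sentence_s to generate,
    yielding it as soon as it is ready (streaming) rather than at the end."""
    for s in sentences:
        time.sleep(per_sentence_s)
        yield s

def time_to_first_and_last(sentences, per_sentence_s=0.05):
    """Measure when the first sentence arrives vs. when the full
    summary is complete -- the gap is what streaming saves the agent."""
    start = time.monotonic()
    first = last = None
    for _ in simulated_model(sentences, per_sentence_s):
        now = time.monotonic() - start
        if first is None:
            first = now
        last = now
    return first, last

first, last = time_to_first_and_last(
    ["Issue: failed logins.", "Tried: password reset.", "Next: unlock account."]
)
print(f"first sentence after {first:.2f}s, full summary after {last:.2f}s")
```

With a blocking (non-streaming) design, the agent would wait the full `last` interval before seeing anything; streaming cuts the perceived wait to roughly `first`, which is the effect Cisco's "sufficiently low latency" framing targets.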

Author Information

Keith has over 25 years of experience in research, marketing, and consulting-based fields.

He has authored in-depth reports and market forecast studies covering artificial intelligence, biometrics, data analytics, robotics, high performance computing, and quantum computing, with a specific focus on the use of these technologies within large enterprise organizations and SMBs. He has also established strong working relationships with the international technology vendor community and is a frequent speaker at industry conferences and events.

In his career as a financial and technology journalist he has written for national and trade publications, including BusinessWeek, Investment Dealers’ Digest, The Red Herring, The Communications of the ACM, and Mobile Computing & Communications, among others.

He is a member of the Association of Independent Information Professionals (AIIP).

Keith holds dual Bachelor of Arts degrees in Magazine Journalism and Sociology from Syracuse University.

