ServiceNow Vancouver Embeds Generative AI Into Now Assist Workflows

The News: On September 20, ServiceNow announced major enhancements to the Now Assist family of solutions in its Now Platform Vancouver release. Now Assist for IT Service Management (ITSM), Customer Service Management (CSM), HR Service Delivery (HRSD), and Creator are each embedded with generative AI technology across all workflows, with the goal of increasing productivity, reducing costs, and improving employee experience (EX) and customer experience (CX). You can read the press release highlighting the enhancements in the Now Platform Vancouver release on ServiceNow’s website.

Analyst Take: ServiceNow announced the availability of generative AI technology across all of its workflows within its Now Platform Vancouver release, including Now Assist for ITSM, Now Assist for CSM, Now Assist for HRSD, and Now Assist for Creator. To facilitate the embedding of generative AI in these solutions, ServiceNow is releasing a domain-specific ServiceNow large language model (Now LLM) built for enterprises and optimized for productivity and data privacy.

According to ServiceNow, incorporating generative AI within Now Assist will help organizations drive growth and reduce costs by accelerating productivity, improving EX and CX, and increasing agility around organizational visibility and control, enabling faster and more informed decision-making.

The company’s Now Assist product strategy centers on a few core areas, including employee growth and development, HR service delivery enhancements, workplace space management, and legal service delivery improvements. Generative AI enables a variety of new functions and use cases within each of these areas, with the goal of making it easier, more efficient, and more intuitive to accomplish both basic and more advanced tasks.

ServiceNow’s Approach to Generative AI

ServiceNow’s generative AI strategy provides customers with broad and secure LLM support, through either general-purpose LLMs or ServiceNow‑developed models. The use of general‑purpose LLMs gives customers flexibility and currently includes access to the Microsoft Azure OpenAI Service LLM and the OpenAI API. To enable more specific functions, domain-specific LLMs have been built into the Now Platform that were created specifically to address ServiceNow’s workflows, use cases, and processes. They are tailored to agents, employees, customers, and IT administrators.

The combination of general-purpose and domain-specific LLMs will allow ServiceNow to ensure that generative AI tools can be optimized for key processes and users, particularly with respect to the creation of workflows within ServiceNow (which utilizes its own configuration management database and way of writing JavaScript), while maintaining an efficient structure that allows new public models to be incorporated where appropriate within the same framework.

Embedding Generative AI Across All Workflows

ServiceNow has focused on key areas of AI enablement including personalization, code generation, and language generation. Now Assist incorporates generative AI features such as case, incident, and agent chat summarization; virtual agent; and search capabilities that can be applied to a broad range of scenarios and functions so that every persona from employees to agents to developers can harness the power of generative AI. Key generative AI-powered features include:

  • Now Assist for ITSM provides summaries of incidents for improved IT team handoffs and problem resolutions and can generate and update work notes.
  • Now Assist for HRSD improves HR team productivity and efficiency and EX through instant summaries and conversational chats.
  • Now Assist for CSM reduces manual work and helps resolve customer issues by rapidly generating summaries for cases and chats.
  • Now Assist for Creator empowers developers of any skill level to build apps fast, supporting the development of code generation from text and code prompts.

Pricing of Generative AI Services

As with many other enterprise platform vendors, the use of generative AI technology comes at an additional price. Enterprises will need to spring for a Professional Plus, Enterprise Plus, or Creator Plus add-on pack for a given workflow. These packs will include a bundle of “assists,” which ServiceNow defines as individual calls to the generative AI functionality, regardless of whether the customer uses ServiceNow’s LLM, a hosted model such as OpenAI’s ChatGPT, or its own model.

Because pricing is based on the number of calls to the LLM, more complex interactions will likely eat up a greater number of assists than simpler ones. However, ServiceNow says that most customers will be fine with the number of assists that are bundled into the packs, and if necessary, additional packs of assists can be purchased.
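The consumption mechanics described above can be sketched in a few lines of code. Note that this is a hypothetical illustration of how call-based metering works in general: the pack size, the per-interaction call counts, and all names below are invented for illustration, not published ServiceNow figures.

```python
# Hypothetical sketch of consumption-based "assist" metering.
# Pack size and per-interaction call counts are invented examples;
# ServiceNow has not published these figures.

from dataclasses import dataclass


@dataclass
class AssistPack:
    """A bundle of 'assists' -- individual calls to generative AI features."""
    bundled_assists: int
    used: int = 0

    def consume(self, calls: int) -> None:
        """Record LLM calls; a complex interaction may issue several calls."""
        self.used += calls

    @property
    def remaining(self) -> int:
        """Assists left in the bundle (never negative)."""
        return max(self.bundled_assists - self.used, 0)

    @property
    def overage(self) -> int:
        """Assists beyond the bundle, which would require buying more packs."""
        return max(self.used - self.bundled_assists, 0)


# Example: a pack with a hypothetical 1,000 bundled assists.
pack = AssistPack(bundled_assists=1000)
pack.consume(1)  # simple chat summarization: one call
pack.consume(4)  # complex multi-step interaction: several calls
print(pack.remaining)  # 995
```

The key implication of this model is visible in the `consume` calls: because a complex interaction issues more calls than a simple one, two customers with identical user counts can drain their bundles at very different rates.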

ServiceNow Highlighting Its First-Mover Status

As of September 20, all the generative AI features discussed earlier are generally available within Now Assist, making ServiceNow one of the few organizations to give all customers access to the new technology. It is unclear whether there is a real benefit to either users or ServiceNow, beyond marketing hype, as the technology and field of generative AI are still quite nascent.

The level of functionality, types of models, and depth and breadth of generative AI use cases will continue to expand as the technology matures, and it is unlikely that enterprise organizations will simply standardize on a single vendor and its approach to generative AI. Further, other vendors have been making their technology available to select customers over the past few months and will likely release their approaches publicly by the end of 2023, so any first-mover advantages will be short-lived.

The advanced functionality offered by generative AI embedded into Now Assist should help enterprises get more out of the software. The open question, as with all of these generative AI platforms, is whether the productivity value outweighs the additional cost of deploying generative AI functionality.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

ServiceNow Announces Expanded GenAI Tools

ServiceNow Announces AI Lighthouse Program to Fast-Track GenAI Adoption

ServiceNow Revenue Up in Q2 2023 to $2.15 Billion, Beating Estimates

Author Information

Keith has over 25 years of experience in research, marketing, and consulting-based fields.

He has authored in-depth reports and market forecast studies covering artificial intelligence, biometrics, data analytics, robotics, high performance computing, and quantum computing, with a specific focus on the use of these technologies within large enterprise organizations and SMBs. He has also established strong working relationships with the international technology vendor community and is a frequent speaker at industry conferences and events.

In his career as a financial and technology journalist he has written for national and trade publications, including BusinessWeek, CNBC.com, Investment Dealers’ Digest, The Red Herring, The Communications of the ACM, and Mobile Computing & Communications, among others.

He is a member of the Association of Independent Information Professionals (AIIP).

Keith holds dual Bachelor of Arts degrees in Magazine Journalism and Sociology from Syracuse University.
