
ServiceNow Announces AI Lighthouse Program to Fast-Track GenAI Adoption


The News: ServiceNow in late July announced the launch of AI Lighthouse, a novel program designed to fast-track the development and adoption of enterprise generative AI capabilities by letting customers collaborate in the creation of custom generative AI large language models (LLMs) and applications. AI Lighthouse brings together ServiceNow’s enterprise automation platform and engine, NVIDIA AI supercomputing and software, and Accenture AI transformation services to help organizations quickly prepare for and deploy enterprise-grade generative AI tools and features.

According to the release announcing the AI Lighthouse program, NVIDIA accelerated computing and software, including NVIDIA DGX AI supercomputing and NVIDIA DGX Cloud, as well as NVIDIA NeMo LLM software, will provide full-stack computing for model training and tuning; ServiceNow will be the front-end workflow automation and intelligence platform; and Accenture will leverage its deep functional and industry knowledge and its generative AI strategy, design, and delivery experience to bring use cases to life for customers.

For more information on the launch of AI Lighthouse, being developed by ServiceNow, NVIDIA, and Accenture, please see the press release.


Analyst Take: ServiceNow announced AI Lighthouse, a program designed to fast-track the development and adoption of enterprise generative AI capabilities by letting customers collaborate to develop generative AI LLMs and applications using ServiceNow’s enterprise automation platform and engine, NVIDIA’s AI supercomputing and software, and Accenture’s AI transformation services. It is a collaboration designed to leverage each participant’s strengths to gain traction in the nascent yet growing generative AI market.

Partnership Leverages Strengths of ServiceNow, NVIDIA, and Accenture

ServiceNow currently offers an automation platform and engine to help develop LLMs. But the AI Lighthouse program takes this to the next step, allowing ServiceNow customers to leverage the power of NVIDIA DGX AI supercomputers, the DGX Cloud platform, and NeMo software, which ServiceNow says will provide AI Lighthouse users with “full-stack computing for model training and tuning.”

The third leg of the AI Lighthouse program is provided by Accenture, which will offer design and engineering services for apps within the ServiceNow platform. Accenture, which has committed $3 billion to AI investment, will leverage its expertise to assist with the design and engineering of domain-specific LLMs and generative AI capabilities within the ServiceNow platform to make functional and industry workflows more intelligent and efficient.

Focusing on Efficiency and Productivity via Generative AI

ServiceNow has launched several powerful generative AI capabilities since May 2023, which are purpose-built for the Now Platform, and it has been engaged with large pharmaceutical, financial services, manufacturing, and health care companies to test them in enterprise environments. According to ServiceNow, AI Lighthouse will build on that early progress to collaborate on designing, developing, and implementing new generative AI use cases with a group of customers across IT service management (ITSM), customer service management (CSM), and employee experience (EX).

This approach appears to dovetail with the most popular use strategies for generative AI, which revolve around using the technology to eliminate tedious or repetitive tasks so that workers can be redeployed to higher-value interactions. Accenture’s significant experience working with clients across a range of industries, combined with ServiceNow’s extensive expertise handling ITSM and CSM use cases – as well as the development of domain-specific LLMs – will likely enable customers to create more refined, purpose-built generative AI tools that can be quickly scaled.

Could AI Lighthouse Be a “Generative AI-in-a-Box” Solution?

Perhaps most interestingly, AI Lighthouse solves one of the key challenges faced by any organization seeking to deploy its own LLMs: finding compute that can be right-sized to handle the training and tuning of the models.

Allowing customers to deploy generative AI functionality without making them do the heavy lifting appears to be the thinking behind the AI Lighthouse program. This announcement illustrates ServiceNow’s commitment to helping its customers roll out generative AI technology quickly while minimizing the tasks and challenges of cobbling together compute, application development, and AI transformation resources on their own.

While pricing has yet to be disclosed, the real value to customers lies in having all the resources and expertise needed to develop generative AI tools in one place.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

ServiceNow Revenue Up in Q2 2023 to $2.15 Billion, Beating Estimates

ServiceNow, Cognizant Team to Push AI Innovation

Juniper and ServiceNow Deliver AI-powered Automation Now

Author Information

Keith Kirkpatrick is VP & Research Director, Enterprise Software & Digital Workflows for The Futurum Group. Keith has over 25 years of experience in research, marketing, and consulting-based fields.

He has authored in-depth reports and market forecast studies covering artificial intelligence, biometrics, data analytics, robotics, high performance computing, and quantum computing, with a specific focus on the use of these technologies within large enterprise organizations and SMBs. He has also established strong working relationships with the international technology vendor community and is a frequent speaker at industry conferences and events.

In his career as a financial and technology journalist he has written for national and trade publications, including BusinessWeek, CNBC.com, Investment Dealers’ Digest, The Red Herring, The Communications of the ACM, and Mobile Computing & Communications, among others.

He is a member of the Association of Independent Information Professionals (AIIP).

Keith holds dual Bachelor of Arts degrees in Magazine Journalism and Sociology from Syracuse University.

