ServiceNow Advances Enterprise-Grade Generative AI Through Expanded Partnership With NVIDIA

ServiceNow partners with NVIDIA to access NIM inference microservices for faster large language model development. The collaboration aims to enhance generative AI capabilities, optimize inference in LLMs, and scale AI across new use cases.

The integration of NVIDIA NIM inference microservices into ServiceNow's platform represents a significant advancement in the field of generative AI, particularly for large language models (LLMs). By leveraging NVIDIA's technology, ServiceNow aims to enhance the efficiency and scalability of its AI-driven features within its digital workflow products.

From a technical standpoint, the use of NIM microservices is expected to improve the performance of ServiceNow's Now LLMs, which are specialized for domain-specific tasks. This improvement can lead to faster response times and more accurate AI-generated content, both critical to maintaining a competitive edge in the SaaS industry. As businesses increasingly rely on AI to automate and optimize operations, ServiceNow's early adoption of these microservices could position it as a leader in AI-powered enterprise solutions.

Moreover, the partnership with NVIDIA may also provide ServiceNow with a strategic advantage by potentially reducing the costs associated with developing and deploying LLMs. Cost efficiency in AI deployment can be a decisive factor for enterprises when choosing a platform provider, potentially driving more customers to ServiceNow's offerings. It's important to monitor the adoption rate and customer feedback as these technologies are rolled out, as they will provide tangible evidence of the partnership's success.

ServiceNow's announcement is likely to have positive implications for its financial performance. The projected millions of dollars in annual savings from the use of GenAI for incident deflection and a significant code generation acceptance rate indicate substantial operational efficiencies. These efficiencies could translate into improved profit margins and a stronger value proposition for potential and existing customers.

Investors should note the potential for increased revenue through the expansion of ServiceNow's generative AI portfolio. By enabling enterprises to scale AI across various departments quickly, ServiceNow is addressing a growing demand for AI solutions that can drive business transformation. The ability to scale and the acceleration of value from AI investments may lead to increased customer acquisition and retention, contributing to ServiceNow's top-line growth.

It is important to consider the competitive landscape as well. NVIDIA's partnership with ServiceNow could be seen as an endorsement of ServiceNow's platform, possibly influencing stock market sentiment and investor confidence. However, investors should also be aware of the risks associated with the rapid deployment of new technologies, including potential technical challenges and market adoption hurdles.

The strategic partnership between ServiceNow and NVIDIA taps into the burgeoning market for enterprise-grade generative AI. According to industry research, the AI market is expected to grow significantly in the coming years, with generative AI playing a pivotal role. ServiceNow's move to incorporate NVIDIA's NIM inference microservices positions it to capitalize on this trend.

By focusing on domain-specific LLMs, ServiceNow is targeting a niche yet expanding segment of the AI market that requires tailored solutions. This approach can lead to a differentiated product offering that caters to specific industry needs, such as telecommunications service management. The collaboration with entities like Hugging Face and the Big Code Community on open-access LLMs for code generation further emphasizes ServiceNow's commitment to innovation and community-driven development.

It is essential to observe how ServiceNow's competitors respond to this development. The company's ability to deliver new GenAI-powered industry innovations could set a new standard for digital workflow solutions. Market research will be key in understanding how ServiceNow's generative AI capabilities are received by the market and how they impact customer decision-making processes.

ServiceNow becomes one of the first platform providers to access NVIDIA NIM inference microservices, enabling faster, scalable, and more cost-effective large language model development and deployment

SAN JOSE, Calif.--(BUSINESS WIRE)-- NVIDIA GTC — ServiceNow (NYSE: NOW), the leading digital workflow company making the world work better for everyone, today announced an expansion of its partnership with NVIDIA to advance the use of enterprise-grade generative AI (GenAI). ServiceNow is one of the first platform providers to access NVIDIA NIM inference microservices, enabling faster, scalable, and more cost-effective large language model (LLM) development and deployment.

Announced today by NVIDIA at GTC, NIM microservices are part of a new set of enterprise-grade GenAI microservices created to optimize inference in LLMs. ServiceNow is using NIM to serve its Now LLMs – domain-specific LLMs that power capabilities within Now Assist, ServiceNow’s generative AI experience. The NIM-deployed Now LLMs will allow ServiceNow customers to scale generative AI across new use cases.
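For readers unfamiliar with the serving model: NVIDIA documents NIM microservices as containers that expose an OpenAI-compatible HTTP API. The sketch below assembles a chat-completion request in that style; the base URL, port, and model name are hypothetical placeholders for illustration, not ServiceNow's actual integration.

```python
import json

# Sketch of a chat-completion request to a NIM-style, OpenAI-compatible
# endpoint. The base URL and model name below are hypothetical placeholders;
# NIM containers conventionally serve on port 8000 under /v1.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }

payload = build_request("hypothetical/now-llm", "Summarize my open incidents.")
print(json.dumps(payload, indent=2))
# The payload would be POSTed to BASE_URL with Content-Type: application/json,
# e.g. requests.post(BASE_URL, json=payload).
```

Because the API surface follows the OpenAI convention, a platform can swap the model behind the endpoint without changing client code, which is the kind of portability the microservice packaging is meant to provide.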

“ServiceNow and NVIDIA are building a future where businesses can break through every barrier,” said ServiceNow Chairman and CEO Bill McDermott. “GenAI is unlocking a new era of growth, completely reimagining digital experiences at scale. This is a once-in-a-generation opportunity, with ServiceNow and NVIDIA fueling technology breakthroughs.”

“Generative AI is driving a transformative leap that is shaping the future of technology and business,” said Jensen Huang, founder and CEO of NVIDIA. “Together, NVIDIA and ServiceNow are helping enterprises everywhere embrace generative AI within the platforms they use to serve customers, manage employees, enhance their operations, and transform their industries.”

NVIDIA and ServiceNow announced their initial partnership to develop powerful enterprise-grade generative AI capabilities in May 2023. Since then, the companies have launched programs such as AI Lighthouse to fast-track the development and adoption of GenAI; delivered new GenAI-powered industry innovations like Now Assist for Telecommunications Service Management (TSM); and collaborated with technology leaders like Hugging Face and the Big Code Community on StarCoder2, a family of open‑access LLMs for code generation that sets new standards for performance, transparency, and cost‑effectiveness. NVIDIA also uses ServiceNow Now Assist to streamline its IT operations and improve employee experience with conversational capabilities, empowering employees to self-solve.

ServiceNow continues to improve its rapidly expanding generative AI portfolio so enterprises can bring the power of GenAI to any department, scale to other parts of the business quickly, and accelerate value from AI spend. Internal GenAI use cases at ServiceNow are delivering cost savings and increased productivity. For example, in just the first 120 days using Now Assist, ServiceNow projects it will save millions of dollars per year with increased case deflection through improved self-service and is realizing a 54% incident deflection rate with GenAI for employee issues. Additionally, ServiceNow has seen a 48% code generation acceptance rate internally with GenAI on the ServiceNow platform.
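To make the deflection figure concrete, the back-of-envelope model below shows how a deflection rate translates into annual savings. Only the 54% deflection rate comes from the announcement; the ticket volume and per-ticket cost are hypothetical assumptions chosen for illustration.

```python
# Illustrative model of savings from GenAI incident deflection.
# Only the 54% deflection rate is from the announcement; ticket volume
# and per-ticket handling cost are hypothetical assumptions.

def annual_deflection_savings(tickets_per_year: int,
                              cost_per_ticket: float,
                              deflection_rate: float) -> float:
    """Savings = tickets deflected * cost a human-handled ticket would incur."""
    deflected = tickets_per_year * deflection_rate
    return deflected * cost_per_ticket

# Hypothetical inputs: 200,000 employee IT tickets/year at $20 each.
savings = annual_deflection_savings(200_000, 20.0, 0.54)
print(f"${savings:,.0f}")  # 200,000 * 0.54 * $20 = $2,160,000
```

Even with modest per-ticket costs, a deflection rate above 50% scales quickly with ticket volume, which is consistent with the "millions of dollars per year" projection in the release.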

NVIDIA NIM inference microservices are already integrated within the Now LLM and available to all ServiceNow customers with Now Assist installed.

About ServiceNow

ServiceNow (NYSE: NOW) makes the world work better for everyone. Our cloud-based platform and solutions help digitize and unify organizations so that they can find smarter, faster, better ways to make work flow. So employees and customers can be more connected, more innovative, and more agile. And we can all create the future we imagine. The world works with ServiceNow™. For more information, visit:

© 2024 ServiceNow, Inc. All rights reserved. ServiceNow, the ServiceNow logo, Now, and other ServiceNow marks are trademarks and/or registered trademarks of ServiceNow, Inc. in the United States and/or other countries. Other company names, product names, and logos may be trademarks of the respective companies with which they are associated.

Jacqueline Velasco


Source: ServiceNow


What partnership did ServiceNow announce with NVIDIA?

ServiceNow announced an expansion of its partnership with NVIDIA to access NIM inference microservices for large language model development.

What are NIM inference microservices used for by ServiceNow?

ServiceNow is using NIM inference microservices to deploy Now LLMs, which power capabilities within Now Assist, the company's generative AI experience.

How does ServiceNow benefit from the NIM inference microservices?

ServiceNow benefits from faster, scalable, and more cost-effective large language model development and deployment through NIM inference microservices.

What are the key advantages of the collaboration between ServiceNow and NVIDIA?

The collaboration aims to unlock a new era of growth, reimagine digital experiences at scale, and fuel technology breakthroughs for businesses.

Which industry innovations have been launched by ServiceNow and NVIDIA?

ServiceNow and NVIDIA have launched industry innovations like Now Assist for Telecommunications Service Management (TSM) powered by GenAI.


About NOW

ServiceNow is an American software company based in Santa Clara, California that develops a cloud computing platform to help companies manage digital workflows for enterprise operations.