
Datadog Expands LLM Observability with New Capabilities to Monitor Agentic AI, Accelerate Development and Improve Model Performance

Rhea-AI Impact: Moderate
Rhea-AI Sentiment: Neutral
Tags: AI
Datadog (NASDAQ: DDOG) has unveiled new AI monitoring capabilities to help organizations better manage and evaluate their AI investments. The company introduced three key features: AI Agent Monitoring, LLM Experiments, and AI Agents Console. AI Agent Monitoring, now generally available, provides interactive mapping of agent decision paths and debugging tools. LLM Experiments, in preview, enables testing and validation of LLM application changes. AI Agents Console, also in preview, offers visibility into both in-house and third-party agent behavior. These tools address a critical market need, as only 25% of AI initiatives currently deliver promised ROI. The new capabilities aim to provide comprehensive observability, testing, and governance for AI systems, helping organizations measure performance, optimize costs, and ensure security compliance.

Positive

  • Addresses a critical market gap in AI observability and monitoring
  • Enables organizations to measure and optimize ROI on AI investments
  • Provides comprehensive visibility into both in-house and third-party AI agents
  • Partnerships with major AI companies like Mistral AI and Anthropic demonstrate market validation

Negative

  • Products like LLM Experiments and AI Agents Console are still in preview phase
  • Enters a highly competitive market space in AI monitoring

News Market Reaction

-1.07% News Effect

On the day this news was published, DDOG declined 1.07%, reflecting a mild negative market reaction.

Data tracked by StockTitan Argus on the day of publication.

AI Agent Monitoring, LLM Experiments and AI Agents Console help organizations measure and justify agentic AI investments

New York, New York--(Newsfile Corp. - June 10, 2025) - Datadog, Inc. (NASDAQ: DDOG), the monitoring and security platform for cloud applications, today announced new agentic AI monitoring and experimentation capabilities to give organizations end-to-end visibility, rigorous testing capabilities, and centralized governance of both in-house and third-party AI agents. Presented at DASH, Datadog's annual observability conference, the new capabilities include AI Agent Monitoring, LLM Experiments and AI Agents Console.

The rise of generative AI and autonomous agents is transforming how companies build and deliver software. But with this innovation comes complexity. As companies race to integrate AI into their products and workflows, they face a critical gap. Most organizations lack visibility into how their AI systems behave, what agents are doing and whether they are delivering real business value.

Datadog is addressing this gap by bringing observability best practices to the AI stack. Part of Datadog's LLM Observability product, these new capabilities allow companies to monitor agentic systems, run structured LLM experiments, and evaluate usage patterns and the impact of both custom and third-party agents. This enables teams to deploy quickly and safely, accelerate iteration and improvements to their LLM applications, and prove impact.

"A recent study found only 25 percent of AI initiatives are currently delivering on their promised ROI—a troubling stat given the sheer volume of AI projects companies are pursuing globally," said Yrieix Garnier, VP of Product at Datadog. "Today's launches aim to help improve that number by providing accountability for companies pushing huge budgets toward AI projects. The addition of AI Agent Monitoring, LLM Experiments and AI Agents Console to our LLM Observability suite gives our customers the tools to understand, optimize and scale their AI investments."

Now generally available, Datadog's AI Agent Monitoring instantly maps each agent's decision path—inputs, tool invocations, calls to other agents and outputs—in an interactive graph. Engineers can drill down into latency spikes, incorrect tool calls or unexpected behaviors like infinite agent loops, and correlate them with quality, security and cost metrics. This simplifies the debugging of complex, distributed and non-deterministic agent systems, resulting in optimized performance.
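The release includes no sample code, but instrumenting an agent so that its decision path shows up as a trace might look roughly like the sketch below. It assumes Datadog's Python ddtrace LLM Observability SDK with an LLMObs.enable() call and agent/tool decorators; the "weather-agent" app name, the get_forecast tool and the agent logic are all hypothetical, and configuration such as API keys is assumed to come from the environment.

```python
# Hypothetical sketch: instrumenting a simple agent so each step of its
# decision path (input, tool call, output) is captured as a trace.
# Assumes Datadog's ddtrace LLM Observability SDK; the "weather-agent"
# ml_app name and the get_forecast tool are illustrative only.
from ddtrace.llmobs import LLMObs
from ddtrace.llmobs.decorators import agent, tool

LLMObs.enable(ml_app="weather-agent")  # API key/site assumed to be set via env vars

@tool
def get_forecast(city: str) -> str:
    # Stand-in for a real tool invocation (API call, database lookup, ...)
    return f"Sunny in {city}"

@agent
def run_agent(user_query: str) -> str:
    # Each decorated call is recorded as a span, so the decision path
    # (input -> tool invocation -> output) can be inspected in the trace view.
    forecast = get_forecast("Paris")
    answer = f"Based on the forecast: {forecast}"
    LLMObs.annotate(input_data=user_query, output_data=answer)
    return answer

if __name__ == "__main__":
    print(run_agent("What's the weather in Paris?"))
```

If the SDK's decorator names differ, the same idea applies: wrap each agent step so its inputs, tool calls and outputs land in the trace that AI Agent Monitoring visualizes as an interactive graph.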

"Agents represent the evolution beyond chat assistants, unlocking the potential of generative AI. As we equip these agents with more tools, comprehensive observability is essential to confidently transition use cases into production. Our partnership with Datadog ensures teams have the visibility and insights needed to deploy agentic solutions at scale," said Timothée Lacroix, Co-founder & CTO at Mistral AI.

In preview, Datadog launched LLM Experiments to test and validate the impact of prompt changes, model swaps or application changes on the performance of LLM applications. The tool works by running and comparing experiments against datasets created from real production traces (input/output pairs) or uploaded by customers. This allows users to quantify improvements in response accuracy, throughput and cost—and guard against regressions.
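The release does not detail the Experiments API, but the underlying idea it describes—replaying a dataset of input/output pairs against two variants and comparing a metric before promoting a change—can be sketched generically as follows. The call_llm stub, the prompt variants and the accuracy metric are illustrative assumptions, not Datadog's SDK.

```python
# Hypothetical sketch of the "experiments against a dataset" workflow:
# run two prompt variants over recorded input/expected-output pairs and
# compare a simple accuracy metric. Not Datadog's actual API.

# Dataset of (input, expected output) pairs, e.g. exported from
# production traces or uploaded by hand.
DATASET = [
    ("What is 2 + 2?", "4"),
    ("Capital of France?", "Paris"),
]

def call_llm(prompt_template: str, user_input: str) -> str:
    """Stand-in for a real model call; returns a canned answer here."""
    prompt = prompt_template.format(input=user_input)  # what would be sent
    canned = {"What is 2 + 2?": "4", "Capital of France?": "Paris"}
    return canned.get(user_input, "unknown")

def run_experiment(name: str, prompt_template: str) -> float:
    """Score one prompt variant over the whole dataset."""
    hits = sum(
        call_llm(prompt_template, x) == expected for x, expected in DATASET
    )
    accuracy = hits / len(DATASET)
    print(f"{name}: accuracy={accuracy:.2f}")
    return accuracy

baseline = run_experiment("baseline", "Answer briefly: {input}")
candidate = run_experiment("candidate", "Answer in one word: {input}")
# Guard against regressions before promoting the candidate prompt.
assert candidate >= baseline, "candidate regressed vs. baseline"
```

The product presumably handles dataset capture from production traces and the metric bookkeeping itself; the sketch only illustrates the compare-before-promote workflow the paragraph describes.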

"AI agents are quickly graduating from concept to production. Applications powered by Claude 4 are already helping teams handle real-world tasks in many domains, from customer support to software development and R&D," said Michael Gerstenhaber, VP of Product at Anthropic. "As these agents take on more responsibility, observability becomes key to ensuring they behave safely, deliver value, and stay aligned with user and business goals. We're very excited about Datadog's new LLM Observability capabilities that provide the visibility needed to scale these systems with confidence."

Moreover, as organizations embed external AI agents—such as OpenAI's Operator, Salesforce's Agentforce, Anthropic's Claude-powered assistants or IDE copilots—into critical workflows, they need to understand their behavior, how they're being used, and what permissions they have across multiple systems to better optimize their agent deployments. To meet this need, Datadog unveiled AI Agents Console in preview, which allows organizations to establish and maintain visibility into in-house and third-party agent behavior, measure agent usage, impact and ROI, and proactively check for security and compliance risks.

To learn more about Datadog's latest AI Observability capabilities, please visit: https://www.datadoghq.com/product/llm-observability/.

AI Agent Monitoring, LLM Experiments and AI Agents Console were announced during the keynote at DASH, Datadog's annual conference. The replay of the keynote is available here. During DASH, Datadog also announced launches in Applied AI, AI Security and Log Management, and released its Internal Developer Portal.

About Datadog

Datadog is the observability and security platform for cloud applications. Our SaaS platform integrates and automates infrastructure monitoring, application performance monitoring, log management, user experience monitoring, cloud security and many other capabilities to provide unified, real-time observability and security for our customers' entire technology stack. Datadog is used by organizations of all sizes and across a wide range of industries to enable digital transformation and cloud migration, drive collaboration among development, operations, security and business teams, accelerate time to market for applications, reduce time to problem resolution, secure applications and infrastructure, understand user behavior and track key business metrics.

Forward-Looking Statements

This press release may include certain "forward-looking statements" within the meaning of Section 27A of the Securities Act of 1933, as amended, or the Securities Act, and Section 21E of the Securities Exchange Act of 1934, as amended, including statements on the benefits of new products and features. These forward-looking statements reflect our current views about our plans, intentions, expectations, strategies and prospects, which are based on the information currently available to us and on assumptions we have made. Actual results may differ materially from those described in the forward-looking statements and are subject to a variety of assumptions, uncertainties, risks and factors that are beyond our control, including those risks detailed under the caption "Risk Factors" and elsewhere in our Securities and Exchange Commission filings and reports, including the Annual Report on Form 10-K filed with the Securities and Exchange Commission on May 6, 2025, as well as future filings and reports by us. Except as required by law, we undertake no duty or obligation to update any forward-looking statements contained in this release as a result of new information, future events, changes in expectations or otherwise.

Contact
Dan Haggerty
press@datadoghq.com

To view the source version of this press release, please visit https://www.newsfilecorp.com/release/255068

FAQ

What new AI monitoring features did Datadog (DDOG) announce in June 2025?

Datadog announced three new features: AI Agent Monitoring (generally available), LLM Experiments (in preview), and AI Agents Console (in preview) to provide end-to-end visibility and testing capabilities for AI systems.

How does Datadog's AI Agent Monitoring work?

AI Agent Monitoring maps each agent's decision path, including inputs, tool invocations, calls to other agents, and outputs in an interactive graph, allowing engineers to debug issues and optimize performance.

What problem is Datadog's new AI monitoring solution addressing?

It addresses the fact that only 25% of AI initiatives deliver promised ROI, helping organizations measure, optimize, and justify their AI investments through comprehensive monitoring and testing capabilities.

What is the purpose of Datadog's LLM Experiments feature?

LLM Experiments allows users to test and validate the impact of prompt changes, model swaps, or application changes on LLM performance using real production traces or customer-uploaded datasets.

How does Datadog's AI Agents Console benefit organizations?

AI Agents Console helps organizations maintain visibility into both in-house and third-party agent behavior, measure usage and ROI, and monitor security and compliance risks across their AI deployments.