Q4 2025 Datadog Inc Earnings Call
Operator: Good day and welcome to the Q4 2025 Datadog Earnings Conference Call. At this time, all participants are in listen-only mode. After the speaker's presentation, there will be a question-and-answer session. To ask a question, please press star 11. If your question has been answered and you'd like to remove yourself from the queue, please press star 11 again. As a reminder, this call may be recorded. I'm now going to turn the call over to Yuka Broderick, Senior Vice President of Investor Relations. Please go ahead.
Yuka Broderick: Thank you, Michelle. Good morning, and thank you for joining us to review Datadog's fourth quarter 2025 financial results, which we announced in our press release issued this morning. Joining me on the call today are Olivier Pomel, Datadog's co-founder and CEO, and David Obstler, Datadog's CFO. During this call, we will make forward-looking statements, including statements related to our future financial performance, our outlook for the first quarter and fiscal year 2026, and related notes and assumptions, our product capabilities, and our ability to capitalize on market opportunities. The words "anticipate," "believe," "continue," "estimate," "expect," "intend," "will," and similar expressions are intended to identify forward-looking statements or similar indications of future expectations. These statements reflect our views today and are subject to a variety of risks and uncertainties that could cause actual results to differ materially.
Yuka Broderick: For a discussion of the material risks and other important factors that could affect our actual results, please refer to our Form 10-Q for the quarter ended September 30, 2025. Additional information will be made available in our upcoming Form 10-K for the fiscal year ended December 31, 2025, and other filings with the SEC. This information is also available on the Investor Relations section of our website, along with a replay of this call. We will discuss non-GAAP financial measures, which are reconciled to their most directly comparable GAAP financial measures in the tables in our earnings release, which is available at investors.datadoghq.com. With that, I'd like to turn the call over to Olivier.
Olivier Pomel: Thanks, Yuka, and thank you all for joining us this morning to go over what was a very strong Q4 and overall a really productive 2025. Let me begin with this quarter's business drivers. We continue to see broad-based positive trends in the demand environment. With the ongoing momentum of cloud migration, we experienced strength across our business, across our product lines, and across our diverse customer base. We saw a continued acceleration of our revenue growth. This acceleration was driven in large part by the inflection of our broad-based business outside of the AI-native group of customers we discussed in the past. And we also continued to see very high growth within this AI-native customer group as they go into production and grow in users, tokens, and new products. Our go-to-market teams executed to a record $1.63 billion in bookings, up 37% year-over-year.
Olivier Pomel: This included some of the largest deals we've ever made. We signed 18 deals over $10 million in TCV this quarter, of which two were over $100 million, and one was an eight-figure land with a leading AI model company. Finally, churn has remained low, with gross revenue retention stable in the mid to high 90s, highlighting the mission-critical nature of our platform for our customers. Regarding our Q4 financial performance and key metrics, revenue was $953 million, an increase of 29% year-over-year and above the high end of our guidance range. We ended Q4 with about 32,700 customers, up from about 30,000 a year ago. We also ended Q4 with about 4,310 customers with an ARR of $100,000 or more, up from about 3,610 a year ago. These customers generated about 90% of our ARR.
Olivier Pomel: We generated free cash flow of $291 million, with a free cash flow margin of 31%. Turning to product adoption, our platform strategy continues to resonate in the market. At the end of Q4, 84% of customers use two or more products, up from 83% a year ago. 55% of customers use four or more products, up from 50% a year ago. 33% of our customers use six or more products, up from 26% a year ago. 18% of our customers use eight or more products, up from 12% a year ago. And as a sign of continued penetration of our platform, 9% of our customers use 10 or more products, up from 6% a year ago. During 2025, we continue to land and expand with larger customers. As of December 2025, 48% of the Fortune 500 are Datadog customers.
Olivier Pomel: We think many of the largest enterprises are still very early in their journey to the cloud. The median Datadog ARR for our Fortune 500 customers is still less than half a million dollars, which leaves a very large opportunity for us to grow with these customers. So we're landing more customers and delivering more value, and we also see that with the ARR milestones we're reaching with our products. We continue to see strong growth dynamics with our core three pillars of observability: Infrastructure Monitoring, APM, and Log Management, as customers are adopting the cloud, AI, and modern technologies. Today, Infrastructure Monitoring contributes over $1.6 billion in ARR. This includes innovations that deliver visibility and insights across our customers' environments, whether they are on-prem, virtualized servers, containerized hosts, serverless deployments, or parallelized GPU fleets. Meanwhile, Log Management is now over $1 billion in ARR.
Olivier Pomel: This includes continued rapid growth with Flex Logs, which is nearing $100 million in ARR. Our third pillar, the end-to-end suite of APM and DEM products, also crossed $1 billion in ARR. This includes an acceleration of our core APM products into the mid-30s percent year-over-year, currently our fastest-growing core pillar. We have now enabled our customers with the easiest onboarding and implementation in the market while delivering unified, deep end-to-end visibility into their applications. Now, remember that even with these three pillars, we're still just getting started, as about half of our customers do not buy all three pillars from us, or at least not yet. Moving on to R&D and what we built in 2025. We released over 400 new features and capabilities this year. That's too much for us to cover today, but let's go over just some of our innovations.
Olivier Pomel: We are executing relentlessly on our very ambitious AI roadmap, and I will split our AI efforts into two buckets: AI for Datadog and Datadog for AI. So first, let's look at AI for Datadog. These are AI products and capabilities that make the Datadog platform better and more useful for customers. We launched Bits AI SRE Agent for general availability in December to accelerate root cause analysis and incident response. Over 2,000 trial and paying customers have run investigations in the past months, which indicates significant interest and great outcomes with Bits AI SRE. And we're well on our way with Bits AI Dev Agent, which detects code-level issues, generates fixes with production context, and can even help release and monitor a fix, and Bits AI Security Agent, which autonomously triages SIEM signals, conducts investigations, and delivers recommendations.
Olivier Pomel: The Datadog MCP Server is being used by thousands of customers in preview. Our MCP Server responds to AI agent and user prompts and uses real-time production data and rich Datadog context to drive troubleshooting, root cause analysis, and automation. We're seeing explosive growth in MCP usage, with the number of tool calls growing 11-fold in Q4 compared to Q3. Second, let's talk about Datadog for AI. This includes capabilities that deliver end-to-end observability and security across the AI stack. We are seeing an acceleration in growth for AI observability. Over 1,000 customers are using the product, and the number of spans sent has increased 10 times over the last six months. In 2025, we broadened this product to better support application development and iteration, adding capabilities such as LLM experiments, LLM playground, LLM prompt analysis, and custom LLM as a judge.
Olivier Pomel: We will soon release our AI Agents console to monitor usage and adoption of AI agents and coding assistants. We're working with design partners on GPU monitoring, and we are seeing GPU usage increase in our customer base overall. We're building into our products the ability to secure the AI stack against prompt injection attacks, model hijacking, and data poisoning, among many other risks. Overall, we continue to see increased interest among our customers in next-gen AI. Today, about 5,500 customers use one or more Datadog AI integrations to send us data about their machine learning, AI, and LLM usage. In 2025, our observability platform delivered deeper and broader capabilities for our customers. We reached a major milestone of more than 1,000 integrations, making it easy for our customers to bring in every type of data they need and engage with the latest technologies from cloud to AI.
Olivier Pomel: In Log Management, we're seeing success with our consolidation motions. During 2025, we saw increasing demand to replace a large legacy vendor, with takeouts in nearly 100 deals for tens of millions of dollars of new revenue. We improved Log Management with notebooks, reference tables, log patterns, calculated fields, and an improved Live Tail, among many other innovations. We launched Data Observability for general availability. Data is becoming even more critical in the AI era. With Data Observability, we are enabling end-to-end visibility across the entire data lifecycle. We launched Storage Management last month, providing granular insights into cloud storage and recommendations to reduce spend. We delivered Kubernetes Autoscaling so users can quickly identify over-provisioned clusters and deployments and right-size their infrastructure. In the Digital Experience Monitoring area, we launched Product Analytics to help product designers make better design decisions with clear data about user experience and behavior.
Olivier Pomel: And we delivered RUM Without Limits, giving front-end teams full visibility into user traffic and performance while dynamically choosing the most useful sessions to retain. In security, we're seeing increasing traction and are actively displacing existing market-leading solutions with Cloud SIEM in large enterprises. This year, our engineers shipped many new capabilities, including a tripling of the number of content packs built into the product and, most importantly, tight integration with the Bits AI Security Agent, which has already shown promise as a strong differentiator in the market. We launched Code Security, enabling customers to detect and remediate vulnerabilities in their code and open-source libraries from development to production. And we continue to advance our cloud security offering, adding Infrastructure as Code, or IaC, Security, which detects and resolves security issues with Terraform. And we launched our Security Graph to identify and evaluate attack paths.
Olivier Pomel: In software delivery, in January we launched Feature Flags. They combine with our real-time observability to enable canary rollouts so teams can deploy new code with confidence. We expect them to gain importance in the future as they serve as a foundation for automating the validation and release of applications in an AI-agentic development world. We're also building out our internal developer portal, which includes a software catalog and scorecards to help developers navigate infrastructure and application complexity, provide rich context to AI development agents, and ultimately enable a faster release cadence. In cloud service management, we launched On-Call and now support over 3,000 customers with their incident response processes. I already mentioned the Bits AI SRE Agent, which pairs with On-Call to accelerate our customers' incident resolution. As you can tell, we've been very busy, and I want to thank our engineers for a very productive 2025.
Olivier Pomel: Most importantly, I'm even more excited about our plans for 2026. Let's move on to sales and marketing. I want to highlight some of the great deals we closed this quarter. First, we landed an eight-figure annualized deal, our biggest new logo deal to date, with one of the largest AI financial model companies. This customer had a fragmented observability stack and cumbersome monitoring workflows, leading to poor productivity. This is a consolidation of more than five open-source, commercial, hyperscaler, and in-house observability tools into the unified Datadog platform, which has returned meaningful time to developers and has enabled a more cohesive approach to observability. This customer is experiencing very rapid growth. Datadog allows them to focus on product development and supporting their users, which is critical to their business success.
Olivier Pomel: Next, we welcomed back a customer, a European data company, in a nearly seven-figure annualized deal. This customer's legacy, log-focused observability solution had poor user experience and integrations, which led to limited user adoption and gaps in coverage. By returning to Datadog and consolidating seven observability tools, they expect to reduce tooling overhead and improve engineering productivity with faster incident resolution. They will adopt nine Datadog products at the start, including some of our newer products such as Flex Logs, Observability Pipelines, Cloud Cost Management, Data Observability, and On-Call. Next, we signed an eight-figure annualized expansion with a leading e-commerce and digital payments platform. This customer's products have an enormous reach. Its commercial APM solution had scaling issues, lacked correlation across silos, and had a pricing model that was difficult to understand or predict.
Olivier Pomel: With this expansion, they are standardizing on Datadog APM using OpenTelemetry so their teams can correlate metrics, traces, and logs to detect and resolve issues faster. And they have already seen meaningful impact, with a 40% reduction in resolution times by their own estimates. This customer has adopted 17 products across the Datadog platform. Next, we signed a seven-figure annualized expansion for an eight-figure annualized deal with a Fortune 500 food and beverage retailer. This longtime customer uses the Datadog platform across many products but still has over 30 other observability tools, and embarked on consolidating for cost savings and better outcomes. With this expansion, Datadog Log Management and Flex Logs will replace their legacy logging product for all ops use cases, with expected annual savings in the millions of dollars. This customer is expanding to 17 Datadog products.
Olivier Pomel: Next, we signed a seven-figure annualized expansion with a leading healthcare technology company. This company was facing reliability issues, impacting clinicians during critical workflows and putting customer trust at risk. The customer will consolidate six tools and adopt seven Datadog products, including LLM Observability, to support their AI initiatives as well as Bits AI SRE agents to further accelerate incident response. Next, we signed an eight-figure annualized expansion, more than quadrupling their annualized commitment with a major Latin American financial services company. Given its successful tool consolidation projects and rapid adoption of Datadog products across all of its teams, this customer renewed early with us while expanding to additional products, including Data Observability, CI Visibility, Database Monitoring, and Observability Pipelines. With Datadog, this customer showed measurable improvements in cost, efficiency, customer experience, and conversion rates across multiple lines of business.
Olivier Pomel: That proof of value led them to broaden their commitment with us and firmly established Datadog as their mission-critical observability partner. Last but not least, we signed a seven-figure annualized expansion for an eight-figure annualized deal with a leading fintech company. With this expansion, the customer is moving their log data onto our unified platform so teams can correlate telemetry in one place and save between hours and weeks in time to resolution for incidents. This customer has adopted 19 Datadog products across the platform, including all three pillars as well as digital experience, security, software delivery, and service management. And that's it for our wins. Congratulations to our entire go-to-market organization for a great 2025 and a record Q4. It was inspiring to see the whole team at our sales kickoff last month, and it's really exciting to embark on a very ambitious 2026.
Olivier Pomel: Before I turn it over to David for financial review, I want to say a few words on our longer-term outlook. There is no change to our overall view that digital transformation and cloud migration are long-term secular growth drivers for our business. So we continue to extend our platform to solve our customers' problems from end to end across their software development, production, data stack, user experience, and security needs. Meanwhile, we're moving fast in AI by integrating AI into the Datadog platform to improve customer value and outcomes and by building products to observe, secure, and act across our customers' AI stacks. In 2025, we executed very well and delivered for our customers against their most complex mission-critical problems. Our strong financial performance is an output of that effort.
Olivier Pomel: And we're even more excited about 2026 as we are starting to see an inflection in AI usage by our customers into their applications and as our customers begin to adopt our AI innovations such as our Bits AI SRE Agent. To hear about all that in detail and much more, I welcome you all to join us at our next investor day this Thursday in New York between 1:00 PM and 5:00 PM. I'll be joined by our product and go-to-market leaders to share how we are serving our customers, how we innovate to broaden our platform, and how we are delivering greater value with AI. For more details, please refer to the press release announcing the event or head to investors.datadoghq.com. And with that, I will turn it over to our CFO, David.
David Obstler: Thanks, Olivier. Our Q4 revenue was $953 million, up 29% year-over-year and up 8% quarter-over-quarter. Now, to dive into some of the drivers of our Q4 revenue growth, first, overall, we saw robust sequential usage growth from existing customers in Q4. Revenue growth accelerated with our broad base of customers, excluding the AI natives, to 23% year-over-year, up from 20% in Q3. We saw strong growth across our customer base with broad-based strength across customer size, spending bands, and industries. We have seen this trend of accelerated revenue growth continue in January. Meanwhile, we are seeing continued strong adoption among AI-native customers with growth that significantly outpaces the rest of the business. We see more AI-native customers using Datadog with about 650 customers in this group.
David Obstler: We are seeing these customers grow with us, including 19 customers spending $1 million or more annually with Datadog. Among our AI customers are the largest companies in this space; as of today, 14 of the top 20 AI-native companies are Datadog customers. Next, we also saw continued strength from new customer contribution. Our new logo bookings were very strong again this quarter, our go-to-market teams converted a record number of new logos, and average new logo land sizes continue to grow strongly. Regarding retention metrics, our trailing 12-month net revenue retention percentage was about 120%, similar to last quarter. Our trailing 12-month gross revenue retention percentage remains in the mid to high 90s. Now moving on to our financial results. First, billings were $1.21 billion, up 34% year-over-year. Remaining Performance Obligations, or RPO, was $3.46 billion, up 52% year-over-year.
David Obstler: Current RPO growth was about 40% year-over-year. RPO duration increased year-over-year as the mix of multi-year deals increased in Q4. We continue to believe revenue is a better indication of our business trends than billing and RPO. Now, let's review some of the key income statement results. Unless otherwise noted, all metrics are non-GAAP. We have provided a reconciliation of GAAP to non-GAAP financials in our earnings release. First, our Q4 gross profit was $776 million with a gross margin percentage of 81.4%. This compares to a gross margin of 81.2% last quarter and 81.7% in the year-ago quarter. Q4 OpEx grew 29% year-over-year versus 32% last quarter and 30% in the year-ago quarter. We continue to grow our investments to pursue our long-term growth opportunities, and this OpEx growth is an indication of our successful execution on our hiring plans.
David Obstler: Our Q4 operating income was $230 million for a 24% operating margin compared to 23% last quarter and 24% in the year-ago quarter. Now, turning to the balance sheet and cash flow statements, we ended the quarter with $4.47 billion in cash, cash equivalents, and marketable securities. Cash flow from operations was $327 million in the quarter. After taking into consideration capital expenditures and capitalized software, free cash flow was $291 million for a free cash flow margin of 31%. And now for our outlook for Q1 and the full fiscal year 2026. Our guidance philosophy overall remains unchanged. As a reminder, we base our guidance on trends observed in recent months and apply conservatism on these growth trends. For Q1, we expect revenues to be in the range of $951 to $961 million, which represents a 25 to 26% year-over-year growth.
David Obstler: Non-GAAP operating income is expected to be in the range of $195 to $205 million, which implies an operating margin of 21%. Non-GAAP net income per share is expected to be in the $0.49 to $0.51 per share range based on approximately 367 million weighted average diluted shares outstanding. For the full fiscal year 2026, we expect revenues to be in the range of $4.06 to $4.10 billion, which represents 18 to 20% year-over-year growth. This includes modeling within our guidance that our business, excluding our largest customer, grows at least 20% during the year. Non-GAAP operating income is expected to be in the range of $840 to $880 million, which implies an operating margin of 21%. Non-GAAP net income per share is expected to be in the range of $2.08 to $2.16 per share based on approximately 372 million weighted average diluted shares.
David Obstler: Finally, some additional notes on our guidance. First, we expect net interest and other income for the fiscal year 2026 to be approximately $140 million. Next, we expect cash taxes in 2026 to be about $30 to $40 million, and we continue to apply a 21% non-GAAP tax rate for 2026 and beyond. Finally, we expect capital expenditures and capitalized software together to be in the 4% to 5% of revenue range in fiscal year 2026. To summarize, we are pleased with our strong execution in 2025. Thank you to the Datadog teams worldwide for a great 2025, and I'm very excited about our plans for 2026. And finally, we look forward to seeing many of you on Thursday for our investor day. And now with that, we will open our call for questions. Operator, let's begin the Q&A.
Operator: Thank you. As a reminder, to ask a question, please press star 11. Our first question comes from Sanjit Singh with Morgan Stanley. Your line is open.
Sanjit Singh: Thank you for taking the questions and congrats on a strong close to the year and a successful 2025. Olivier, I wanted to get your updated views in terms of where observability is headed in the context of a lot of advancements when it comes to agentic frameworks, agentic deployments, the stuff that we've seen from Anthropic and the new frontier models from OpenAI. Just in terms of what this means for observability as a category, the defensibility of it in terms of can customers use these tools to build homegrown solutions for observability? So just get your latest comments on the defensibility of the category and how Datadog may potentially have to evolve in this new sort of agentic era.
Olivier Pomel: Yeah. I mean, look, there's a few different ways to look at it. One is there's going to be many more applications than there were before. People are building much more. They're building much faster. We've covered that in previous calls, but we think that this is nothing but an acceleration of the increase of productivity for developers in general. So you can build a lot faster. As a result, you create a lot more complexity because you build more than you can understand at any point in time.
Olivier Pomel: And you move a lot of the value from the act of writing the code, which now you actually don't do yourself anymore, to validating, testing, making sure it works in production, making sure it's safe, making sure it interacts well with the rest of the world, with the end users, and making sure it does what it's supposed to do for the business, which is what we do with observability. So we see a lot more volume there, and we see that as what we do, basically, where observability can help. The other part that's interesting is that a lot more happens within these agents and these applications. And a lot of what we do as humans now starts to look like observability. Basically, we try to understand what the machine does. We try to make sure it's aligned with us.
Olivier Pomel: We try to make sure the output is what we expected when we started and that we didn't break anything. And so we think it's going to bring observability more widely into domains that it didn't necessarily cover before. So we think that these are accelerants. And I mean, obviously, we have a horse in this one, but we think that observability and the contact between the code, the applications, and the real world, production environments, real users, and the real business is the most interesting, the most important part of the whole AI development lifecycle today.
Sanjit Singh: Maybe just one follow-up on that line of thinking. In a world where there's a greater mix between human SREs and agentic SREs, is there any sort of evolution that we need to think about in terms of whether it's UI or how workflows work in observability and how maybe Datadog sort of tries to align with that evolution that's likely to come in the next couple of years?
Olivier Pomel: Yeah. There's going to be an evolution, that's certain. There's going to be a lot more automation. All the signs we see today point to everything moving faster: more data, more interactions, more systems, more releases, more breakage, more resolutions of those breakages, more bugs, more vulnerabilities, everything. So we see an acceleration there. At the end of the day, humans will still have some form of UI to interact with all that, and a lot of the interaction will be automated by agents. So we're building the product to satisfy both conditions. We have a lot of UIs, and we are able to present humans with UIs that represent how the world works and what their options are, give them familiar ways to go through problems and to model the world. And we are also exposing a lot of our functionality to agents directly.
Olivier Pomel: As we mentioned on the call, we have an MCP server that is currently in preview and that is really seeing explosive growth of usage from our customers. And so it's a very likely future that part of our functionality is delivered to agents through MCP servers or the like, part of our functionality is directly implemented by our own agents, and part of our functionality is delivered to humans with UIs.
Sanjit Singh: Understood. Thank you, Olivier.
Operator: Thank you. Our next question comes from Raimo Lenschow with Barclays. Your line is open.
Raimo Lenschow: Thank you. Congrats from me as well. Staying a little bit on that AI theme, Olivier, the eight-figure deal with a model company is really exciting. I assume they tried to do it with some open-source tooling, etc., and actually went from paying you almost nothing to paying you more money. What drove that thinking? What do you think they saw that convinced them to do that? And it's now the second one, after the other very big model provider. So clearly, that whole debate in the market of "Oh, you can do that on the cheap somewhere" is not quite valid. Could you speak to that, please? Thank you.
Olivier Pomel: I mean, the situation is just very similar to every single customer we've had. Every customer we've had has had some homegrown tooling. They have some open source; they might still run some open source. That's typically what we see everywhere. The "it's cheaper to do it yourself" argument is usually not the case. Your engineers typically are very well compensated and a big part of the spend in these companies, and their velocity is what gates just about anything else in the business. And so usually, when we come in, when customers start engaging with us, we can very quickly show value that way. So it's not any different from what we see with any other customer. And also within the AI cohort, it's not original at all.
Olivier Pomel: Our AI cohort in general is a who's who of the companies that are growing very fast and that are shaping the world in AI. They're all adopting our product for the same reasons, sometimes with different volumes because those companies have different scales, but the logic is the same.
Raimo Lenschow: That's perfect. Thank you. Well done.
Operator: Thank you. Our next question comes from Gabriela Borges with Goldman Sachs. Your line is open.
Gabriela Borges: Hi. Good morning. Congratulations on the quarter, and thank you for taking my question. Olivier, I wanted to follow up on Sanjit's question on how to think about where the line is between what an LLM can do longer-term and the domain experience that you have in observability. If I think about some of Anthropic's recent announcements, they're talking about LLMs as a broader anomaly detection type tool, for example, on the security vulnerability management side. How do you think about the limiting factor to using LLMs as an anomaly detection tool that could potentially take share from observability over time in the category? And how do you think about the moat that Datadog has that offers customers a better solution relative to where the roadmap and LLMs can go long-term? Thank you.
Olivier Pomel: Yeah. So that's a very good question. We definitely see that LLMs are getting better and better, and we'll bet on them getting significantly better every few months, as we've seen over the past couple of years. And as a result, they're very, very good at looking at broad sets of data. So if you feed a lot of data to an LLM and ask for an analysis, you're very likely to get something that is very good, and that is going to get even better. So when you think of what we have that is fundamentally a moat here, there are two parts. One is how we are able to assemble that context so we can feed it into those intelligence engines. That's how we aggregate all the data we get; we parse out the dependencies.
Olivier Pomel: We understand how everything fits together, and we can feed that into the LLM. That's in part what we do. For example, today, we expose these kinds of functionality behind our MCP server, and so customers can recombine that in different ways using different intelligence tools. But the other part, where we think the world is going for observability, is that right now the SDLC is accelerating a lot, but it's still somewhat slow. And so it's okay to have incidents, run post-hoc analysis on those incidents, and maybe use some outside tooling for that. Where the world is going is you're going to have many more changes, many more things. You cannot actually afford to have incidents to look at for everything that's happening in your system. So you'll need to be proactive. You'll need to run analysis in-stream as all the data flows through.
Olivier Pomel: You'll need to run detection and resolution before you actually have outages materialized. And for that, you'll need to be embedded into the data plane, which is what we run. And we also need to be able to run specialized models that can act on that data as opposed to just taking everything and summarizing everything after the fact 10, 15 minutes later. And that's what we're uniquely positioned to do. We're building that. We're not quite there yet, but we think that a few years from now, that's how the world's going to run, and that's what makes us significantly different in terms of how we can apply anomaly detection, intelligence, and preemptive resolution into our systems.
Gabriela Borges: That makes a lot of sense. Thank you. My follow-up.
Olivier Pomel: By the way, the data planes we're talking about are very real-time, and they are many orders of magnitude larger in terms of data flows and volumes than what you're typically feeding to an LLM. So it's a bit of a different problem to solve.
Gabriela Borges: Yeah. Super interesting. Thank you. My follow-up for both you, Olivier, and David. You've mentioned a couple of times now some of the conversations you have with customers about value creation within the Datadog platform. Tell us a little bit about how some of those conversations evolve when the customer sees that in order to do observability for more AI usage, perhaps that Datadog bill is going up. What are some of the steps that you can take to make sure the customer still feels like they're getting a ton of value out of the Datadog platform? Thank you.
Olivier Pomel: Well, there's a few things. I mean, first, again, the rule of software always applies. There's only two reasons people buy your product: it's to make more money or to save money. So whatever you do, when customers use a new product, they need to see a cost saving somewhere, or they need to see that they are going to get to customers they wouldn't get to otherwise. So we have to prove that. We always prove that. Anytime a customer buys a product, that's what is happening behind the scenes. In general, when customers add to our platform as opposed to bringing another vendor in or another product in, they also spend less by doing it on our platform.
Gabriela Borges: I appreciate the color. Thank you very much.
Operator: Thank you. Our next question comes from Ittai Kidron with Oppenheimer & Co. Your line is open.
Ittai Kidron: Thanks, and congrats. Quite an impressive finish for the year. David, I wanted to dig in a little bit into your 2026 guide. Just want to make sure I understand some of your assumptions. So maybe you could talk about the level of conservatism that you've built into the guide for the year. And also, you've talked about at least 20% growth for the core, excluding the largest customer. But what is it that we should assume for the large customer? And now when you look at the AI cohort excluding this large customer, are there any concentrations evolving over there, given your strong success there?
David Obstler: Yeah. There are three questions. The first is overall on guidance. Except what we're going to speak about next, we took the same approach as we looked at the organic growth rates, the attach rates, and the logo accumulation rates and discounted that. So for the overall business, which is quite diversified, we talked about diversification by industry, by geography, by SMB, mid-market, and enterprise. We took the same approach. We noted that with the guidance being 18% to 20% and the non-AI or heavily diversified business being 20%+, that would imply that the growth rate of that core business assumed in the guidance is higher than the growth rate of the large customer. That doesn't mean the large customer is growing any which way. It's just that in our consumption model, we essentially don't control that. So we took a very conservative assumption there.
David Obstler: The last point I think you mentioned is the highly diversified. We said 650 names in the AI cohort is quite diversified. Essentially, it would be very similar to our overall business, in which we have a range of customers, but not the concentration level. What we're seeing there is significant growth, but, like our overall distributed customer base, a range of growth, and then potentially some working on how the product's being used. But nothing out of the ordinary relative to the overall customer base in the very diversified AI set of customers outside the largest customer.
Ittai Kidron: Okay. That's great. Yeah. Can you give us the percent of revenue of the AI cohort this quarter?
David Obstler: We haven't put it in there.
Ittai Kidron: Thank you.
Operator: Thank you. Our next question comes from Todd Coupland with CIBC. Your line is open.
Todd Coupland: Olivier, thank you and good morning. I wanted to ask you about competition and how the LLM rise is impacting share shifts. Just talk about that and how Datadog will be impacted. Thanks a lot.
Olivier Pomel: Yeah. I mean, in the market with customers, there hasn't been any particular change in competition, in that we see the same kind of folks and the positions are relatively similar, and we are pulling away. We're taking share from anybody who has scale. And I know there's been noise. There were a couple of M&A deals that came up, and we got some questions about that. The companies in there were not particularly winning companies, not companies that we saw in deals, not companies that had a large market impact. And so we don't see that as changing the competitive dynamics for us in the near future. We also know that competing in observability is a very, very full-time job. It's a very innovative market.
Olivier Pomel: And we know exactly what it is we had to do and have to do to keep pulling away the way we are. And so we're very confident in our approach and what we're going to do in the future there. With the rise of LLM, there's clearly more functionality to build, and there are new ways to serve customers. We mentioned our LLM Observability product. There are a few other products on the market for that. I think it's still very early for that part of the market, and that market is still relatively undifferentiated in terms of the kinds of products there are. But we expect that to shake out more into the future. We think, in the end, there's no reason to have observability for your LLM that is different from the rest of your system, in great part because your LLMs don't work in isolation.
Olivier Pomel: The way they implement their smarts is by using tools, the tools in your applications, your existing applications, or new applications you build for that purpose. And so you need everything to be integrated in production, and we think we stand on a very strong footing there.
Todd Coupland: Thank you.
Operator: Thank you. Our next question comes from Mark Murphy with JP Morgan. Your line is open.
Mark Murphy: Thank you, Olivier. Amazon is targeting $200 billion in CapEx this year. If you include Microsoft and Google, CapEx is going to exceed $500 billion this year for the big three hyperscalers. It's growing 40% to 60%. I'm wondering if you've collected enough signal from the last couple of years of CapEx, that trend, to estimate how much of that is training-related and when it might convert to inferencing where Datadog might be required. In other words, are you looking at this wave of CapEx and able to say it's going to create a predictable ramp in your LLM Observability revenue? Maybe what inning of that are we in? And then I have a follow-up.
Olivier Pomel: Well, I think it's more interesting than that. I think it's probably too reductive to peg that on LLM Observability. I think it points to way more applications, way more intelligence, way more of everything into the future. Now, it's kind of hard to directly map the CapEx from those companies into what part of the infrastructure is actually going to be used to deliver value two or three or four years from now. So I think we'll have to see what the conversion rate is on that. But look, it definitely points to very, very, very large increases in the complexity of the systems, the number of systems, and the reach of the systems in the economy. And so we think it's going to be of great help to our business, let's put it this way.
Mark Murphy: Yeah. Great help. Okay. And then as a quick follow-up, there is an expectation developing that OpenAI is going to have a very strong competitor, which is Anthropic, kind of closing the gap, producing nearly as much revenue as OpenAI in the next one to two years. You mentioned an eight-figure land with an AI model company. I'm wondering, if we step back, do you see an opportunity to diversify that AI customer concentration, whether sometimes it might be a direct customer relationship there? Or it could be some of the products like Claude Code being adopted globally, just kind of creating more surface area to drive business to Datadog. Can you comment on maybe what is happening there among the larger AI providers and whether you can diversify that out?
Olivier Pomel: Yeah. I mean, look, we've never been, and we're not built as, a business to be concentrated on a couple of customers. That's not how we've become successful. That's probably not how we'll be successful in the long term. So yes, I mean, at the end of the day, it should be irrational for customers, for all customers in the AI cohort, not to use our product. So we have some great successes with the customers currently in that cohort. We see more, by the way; we have more inbound there and more customers that are talking to us, from the largest, even hyperscaler-level AI labs. And we expect to drive more business there in the future. I think there's no question about that.
David Obstler: You're seeing that in some of the metrics we've been giving in terms of the number of AI-native customers, the size of some of these customers. To echo what Olivier said, we are essentially selling to many of the largest players, which results in greater size of the cohort and more diversification.
Mark Murphy: Thank you.
Operator: Thank you. Our next question comes from Matt Hedberg with RBC. Your line is open.
Matt Hedberg: Great. Thanks for taking my question, guys. Congrats from me as well. David, a question for you. Your prior investments are clearly paying off with another quarter of acceleration, and it seems like you're going to continue to invest in front of the future opportunity. I think op margins are down maybe 100 basis points on your initial guide. I'm curious if you can comment on gross margin expectations this year, and how you also might realize incremental opex synergies by using even more AI internally.
David Obstler: Yeah. On the gross margin, I think what we said is plus or minus the 80% mark. We try to engineer; when we see opportunities for efficiency, we've been quite good at being able to harvest them. At the same time, we want to make sure we're investing in the platform. So I think where we are today is very much in line with what we said we're targeting. There may be opportunities longer term, but we also are trying to balance those opportunities with investment in the platform. In terms of AI, to date, we are using it in our internal operations. So far, the first signs of what we're seeing are productivity and adoption. We will continue to update everybody as we see opportunities in terms of the cost structure. Olivier, anything else you want to go over?
Olivier Pomel: Yeah. I mean, look, the expectation in the short to mid-term, anyway, should be that we keep investing heavily in R&D. We're getting a lot; we see great productivity gains with AI there. But at this point, it helps us build more, faster, and get to solve more problems for our customers. And we're very busy adopting AI across the organization.
Matt Hedberg: Got it. Thanks, guys.
Operator: Thank you. Our next question comes from Koji Ikeda with Bank of America. Your line is open.
Koji Ikeda: Yeah. Hey, guys. Thanks so much for taking the question. Olivier, maybe a question for you. A year ago, you talked about how while some customers do want to take observability in-house, it's really a cultural choice. It may not be rational unless you have tremendous scale, access to talent, and growth is not limited by innovation bandwidth, which most companies do not. And so it is a year later, and it does seem like the industry and the ecosystem and everything has changed quite a bit. So I was hoping to get your updated views on these thoughts, if it has changed at all over the past year and why. Thank you.
Olivier Pomel: No. I mean, look, it's something that happens sometimes, but it's a small minority of the cases. The general motion is customers start with some homegrown attempts to do things themselves, then they move to our product, then they scale with our product. Sometimes they optimize a little bit along the way. But the general motion is they do more and more with us. They're relying on us for solving more and more of their problems, and they outsource the problem, and increasingly the outcomes, to us. So I don't think that's changing. Look, we'll still see customers here and there that choose to insource it and do it themselves, again, largely for cultural reasons. I would say economically, or from a focus perspective, it doesn't make sense for the very vast majority of companies.
Olivier Pomel: We even see teams at hyperscalers that have all the tooling in the world, all the money in the world, all the know-how in the world, and that still choose to use our products because it gives them a more direct path to solving their problems.
Koji Ikeda: Thank you.
Operator: Thank you. And our next question comes from Peter Weed with Bernstein Research. Your line is open. Peter, if your telephone's muted, please unmute. Our next question comes from Brad Reback with Stifel. Your line is open.
Brad Reback: Great. Thanks very much. Olivier, the sustained acceleration in the core business is pretty impressive. Obviously, you all have invested very aggressively in go-to-market over the last kind of 18 to 24 months. Can you give us a sense of where you are on that productivity curve and if there's additional meaningful gains you think, or is it incremental, and maybe where you see additional investments in the next 12 to 18 months? Thanks.
Olivier Pomel: Yeah. I mean, we feel good about the productivity. I think the main drivers for us in the future is we still need to scale, and we're still scaling the go-to-market team. We're not at the scale we need to be in every single market and segment we need to be in the world right now. And so we keep scaling there. So the focus now is not necessarily to improve productivity. It's to scale while maintaining productivity. And of course, there's still many, many things we can do. Actually, even though we love our performance, there's always a bunch of things that could be better, territories that could be better, productivity that could be better, things like that. So we have tons of work, tons of things we want to do, tons of things we want to fix, tons of things we want to improve.
Olivier Pomel: But overall, we feel good about what happened. We feel good about scaling, and you should expect more scaling for us on the go-to-market side in the year to come.
Brad Reback: Great. Thank you.
Operator: Thank you. Our next question comes from Howard Ma with Guggenheim. Your line is open.
Howard Ma: Great. Thanks for taking the question. I have one for Olivier. The core APM product growing in the mid-30% growth, that is pretty impressive, and I think better than maybe a lot of us expected. The question is, is that a reacceleration, and is the growth driven by AI-native companies that are using Datadog's Real User Monitoring and other DEM features as compared to, or as opposed to, rather, core enterprise customers that are building more applications?
Olivier Pomel: Yeah. I mean, look, APM, in general, I think has always been a bit of a steady Eddie in terms of the growth. It's a product that takes a little bit longer to deploy than others, because it goes further into the applications. And so it takes a bit longer to fully penetrate within the customer environment. That being said, there's a number of different things we did that helped with the growth there. One is we invested a lot in actually making that onboarding and deployment a lot simpler and faster. So we think we have the best in the market for that, and it shows. Second, we invested a lot in the digital experience side of it. It's very differentiated, something our customers love, and it's driving a lot of adoption of the broader APM suite. And we expect to see more of that in the future.
Olivier Pomel: And third, we made investments in go-to-market. We covered the market better. And so we're getting into more looks at more deals in more parts of the world. And so all of that combined helps that product reaccelerate growth quite a bit. And so we feel actually very, very good about it, which is why we keep investing. Overall, we still only have a small part of the pure APM market. That product is scaled at about $10 billion, including DEM, but the market is larger. And so we think there's a lot more we can do there.
David Obstler: Yeah. I want to add. We talked about, as Olivier just mentioned, that we're not penetrated across our customer base, and therefore, we're continuing to consolidate onto our platform. So we have quite a number of wins where we already have other products. We already have infra and logs, and we're consolidating APM.
Howard Ma: Thank you, guys. David, as a follow-up for you on margin, are the large AI-native customers significantly dilutive to gross margin? And when you think about the initial 2026 margin guide, how much of that reflects potentially lower gross margin tied to those customers versus incremental investments?
David Obstler: On a weighted average, they're not. As we've always said, for larger customers, it isn't about the AI natives or non-AI natives. It has to do with the size of the customer. We have a highly differentiated, diversified customer base. I would say we're essentially expecting a similar type of discount structure in terms of size of customer as we have going forward. There are consistent ongoing investments in our gross margin, including data centers and development of the platform. I think it's more or less what we've seen over the past couple of years, not really affected by AI or not AI native.
Howard Ma: Okay. Thank you. Great quarter.
David Obstler: Thank you.
Operator: Thank you. Our next question comes from Peter Weed with Bernstein Research. Your line is open.
Peter Weed: Hello. Can you hear me this time?
David Obstler: Yes, you're on.
Peter Weed: Okay. Thank you.
David Obstler: Yeah, you're on.
Peter Weed: Yep. Apologies for my last time. Great quarter. Looking forward, I think one of your most interesting, exciting opportunities really is around Bits AI. And I'd love to hear kind of how you think that opportunity shapes up. How do you get paid the fair value for the productivity you're bringing to the SRE and the broader operations team, and really how you see competition playing out in that space? Because obviously, we've seen startups coming in. There's questions about Anthropic and where they want to go. How does Datadog really capture this value and protect it for the business?
Olivier Pomel: Yeah. I mean, look, the way we currently sell a lot of these products is you show the difference in time spent. And when the alternative is you try and solve a problem yourself, and you have an outage, and you start a bridge, and you have 20 people on the bridge, and they look for three hours for the root cause, and you wake up people in the middle of the night for that. It's very expensive. It takes a lot of time. There's a lot of customer impact because the outages are long. And if the alternative is in five minutes, you have the answer, and you only get three people looped in that are the right folks, and you have a fix within 10 minutes, you have shorter impact on the customer, many, many fewer folks internally involved, lower cost.
Olivier Pomel: So it's fairly easy to make that case. And so that's how we sell the value there. Longer term, as I was saying earlier, I think right now, the state of the art for incident resolution is post hoc. You have an incident, and you look into it. And you diagnose it, and then you resolve it. So yeah, maybe you cut the customer impact from one hour to 15 minutes. But you still have an issue. You still have impact. You still distract the team. You still have humans working on that. I think longer term, what's going to happen is the systems will get in front of issues. They will auto-diagnose issues. They will help pre-mitigate or pre-remediate potential issues. And for that, the analysis will have to be run in-stream, which is a very different thing. You can massage data and give it to an LLM for post hoc analysis.
Olivier Pomel: And a lot of the value is going to be in gathering the data, but you also have quite a bit of value in the smarts that are done in the backend by the LLM for that. And that's something that is done by Anthropic, the OpenAIs of the world today. I think as you look at being in-stream, looking at three, four, five orders of magnitude more data, looking at this data in real time, and passing judgment in real time on what's normal, what's anomalous, and what might be going wrong doing that hundreds, thousands, millions of times per second, I think that's what is going to be our advantage and where it's going to be much harder for others to compete, especially general-purpose AI platforms.
Peter Weed: Thank you.
Operator: Thank you. Our next question comes from Brent Thill with Jefferies. Your line is open.
Brent Thill: Thanks, David. I think many gravitate back to that mid-20% margin you put up a couple of years ago. And I know the last couple of years, including the guide, are looking at low 20%. Can you talk to maybe your true north, how you're thinking about that? Obviously, growth being number one, but how are you thinking about the framework on the bottom line? Thanks.
David Obstler: Yep. Yeah. The framework is we try to plan with more conservative revenues, understanding that if revenues exceed the targets that we give, it's difficult in the short term to invest incrementally. So what we're trying to do is invest first in the revenue growth and then layer in additional investment if we see revenues exceed targets. So generally, it reflects, one, the continued investment, which we think is paying off, both in terms of the platform and R&D, as well as in go-to-market, including AI. And then, as we've seen over the years in our beat and raise, we've tended to have some of that flow through into the margin line and then re-up again for the next phase of growth.
Brent Thill: Any big changes in the go-to-market or big investments you need to make, David, this year to address what's happened in the AI cohort or not?
David Obstler: We're continuing very much with what we're doing, which is to try to work with clients to prove value over time. That manifests itself in our account management and our CS, as well as our enterprise. So no, I think for this year, we are looking at capacity growth, including geographic, deepening the ways we interact with customers, expanding channels, very much similar to what we've done in the previous years.
Brent Thill: Thanks.
Olivier Pomel: All right. That's going to be it for today. So on that, I'd like to thank all of you for listening to this call. And I think we'll meet many of you on Thursday for Investor Day. So thank you all. Bye.
David Obstler: Thank you.
Operator: Thank you for your participation. You may now disconnect. Everyone, have a great day.