Q3 2026 Elastic NV Earnings Call

Operator: Good afternoon, and welcome to the Elastic Third Quarter Fiscal 2026 Earnings Results Conference Call. All participants will be in listen-only mode. Should you need assistance, please signal a conference specialist by pressing the star key followed by zero. After today's presentation, there will be an opportunity to ask questions. To ask a question, you may press star, then one on your telephone keypad. To withdraw your question, please press star, then two. Please note this event is being recorded. I would now like to turn the conference over to Eric Prengel, Global Vice President of Finance. Please go ahead.

Eric Prengel: Thank you. Good afternoon, and thank you for joining us on today's conference call to discuss Elastic's Q3 fiscal 2026 financial results. On the call, we have Ash Kulkarni, Chief Executive Officer, and Navam Welihinda, Chief Financial Officer. Following their prepared remarks, we will take questions. Our press release was issued today after the close of market and is posted on our website. Slides, which are supplemental to the call, can also be found on the Elastic Investor Relations website at ir.elastic.co. Our discussion will include forward-looking statements which may include predictions, estimates, our expectations regarding demand for our products and solutions, and our future revenue and other information. These forward-looking statements are based on factors currently known to us, speak only as of the date of this call, and are subject to risks and uncertainties that could cause actual results to differ materially.

Eric Prengel: We disclaim any obligation to update or revise these forward-looking statements unless required by law. Please refer to the risks and uncertainties included in the press release that we issued earlier today, included in the slides posted on the Investor Relations website, and those more fully described in our filings with the Securities and Exchange Commission. We will also discuss certain non-GAAP financial measures. Disclosures regarding non-GAAP measures, including reconciliations with the most comparable GAAP measures, can be found in the press release and slides. Unless specifically noted otherwise, all results and comparisons are on a fiscal year-over-year basis. The webcast replay of this call will be available on our company website under the Investor Relations link. Our Q4 fiscal 2026 quiet period begins at the close of business on Thursday, 16 April 2026.

Eric Prengel: We will be participating in the Morgan Stanley Technology, Media and Telecom Conference on 2 March. With that, I'll turn it over to Ash.

Ash Kulkarni: Thank you, Eric, and good afternoon, everyone. Thank you for joining today's call. Elastic delivered yet another outstanding quarter, beating the high end of guidance across all key metrics and showcasing the power of the Elastic platform and our business model. Sustained platform demand, strong sales execution, and our relentless focus on customers drove Q3 momentum. As LLMs rapidly evolve their capabilities around inference and reasoning, it is becoming increasingly clear that context is the most important ingredient in making these models useful within an enterprise. With that backdrop, in Q3, we continued to see enterprises choose Elastic to power context for their most critical AI needs. Translating the success to our performance, we achieved 18% total revenue growth and an 18.6% non-GAAP operating margin.

Ash Kulkarni: Sales-led subscription revenue accelerated to 21% alongside our growing cohort of $100K ACV customers, which now exceeds 1,660. Q3 marked our sixth consecutive quarter of strong field execution, driving solid customer commitments and supporting healthy CRPO growth. That execution is also translating into a strong pipeline as we head into Q4. The lifeblood of organizations is the proprietary data that they create, manage, and analyze every day to drive business decisions and operations. This data is massive, often many petabytes in scale, and simply cannot be moved for cost and security reasons outside of the organization's control. For businesses to use agentic AI, the LLM needs to come to the data. This is where Elastic comes in.

Ash Kulkarni: We help organizations store and manage all of their data in very cost-effective ways, and we provide accurate, real-time context to AI by searching through all of this organizational data in real time. Furthermore, Elastic is capable of doing this consistently across cloud and self-managed environments. This hybrid flexibility allows sensitive data and workloads to remain in their preferred environments, eliminating the need for costly re-platforming. This unique flexibility is why we continue to displace legacy vendors and niche cloud-native players alike. The results are clear: the number of commitments for over $1 million in annual commitment value signed this quarter grew over 30% compared to the same period last year, driven by new logos and customer expansion. Consolidation and AI are powerful tailwinds.

Ash Kulkarni: As organizations manage exploding data volumes, they are turning to Elastic to drive both innovation and efficiency across their search, observability, and security needs. For example, we signed a seven-figure new logo deal with a Fortune 100 insurance institution for Elastic Security. Seeking to modernize their security operations, the company initiated a competitive process to replace a legacy SIEM solution that was plagued by slow query speeds, inefficient data retention, and rigid SOC workflows. By leveraging features like LogsDB and searchable snapshots, they are consolidating data into a single cyber data lake with integrated AI-powered SIEM workflows, all powered by Elastic and its capabilities, including AI Assistant, Attack Discovery, and AI-driven orchestration. This transition enables their analysts to achieve markedly faster cybersecurity detection and remediation outcomes while meeting strict regulatory requirements.
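[Editor's note] LogsDB, referenced here, is Elasticsearch's storage-optimized index mode for log data; it must be enabled when an index is created. A minimal sketch, assuming a recent Elasticsearch release — the index name and field set are hypothetical, and the request body is shown as a plain dict rather than sent to a live cluster:

```python
# Sketch of creating a log index in Elasticsearch's LogsDB mode.
# "index.mode": "logsdb" is a real index setting in recent releases;
# the index name and mapped fields below are illustrative only.

log_index = {
    "settings": {
        "index": {
            # Sorted, storage-optimized layout tuned for log workloads.
            "mode": "logsdb",
        }
    },
    "mappings": {
        "properties": {
            "@timestamp": {"type": "date"},      # required for log indices
            "message": {"type": "text"},
            "host.name": {"type": "keyword"},
        }
    },
}

# With the official Python client this would be applied as:
#   es.indices.create(index="app-logs", **log_index)
```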

Ash Kulkarni: In another large deal from the quarter, a global leader in data resiliency software chose Elastic Observability to power the monitoring layer for its new cloud offering. As they migrate their vast user base to the cloud, they are leveraging our full observability suite, including AI Assistant and LogsDB, to transform from reactive troubleshooting to intelligent semantic-aware analysis. By integrating OpenTelemetry and our vector search capabilities, the customer is now able to proactively detect anomalies and remediate issues using natural language queries, significantly reducing mean time to resolution. They chose Elastic over incumbents due to our deep integration flexibility, superior handling of unstructured data, and the ability to provide a single source of truth across the organization. Crucially, as companies navigate their cloud migrations, they require a platform that doesn't force them to choose between their existing data centers and the cloud.

Ash Kulkarni: Our asymmetric advantage in supporting modern cloud and hybrid environments drove a significant win with a global financial group. During the quarter, we closed a seven-figure expansion deal for Elasticsearch, which serves as the core of their online banking application for tens of millions of users. They needed a central data repository capable of supporting both cloud and self-managed architectures, allowing them to run mission-critical workloads in their preferred environment without compromising performance. Elastic succeeded where their existing MongoDB implementation failed to provide the scalable retrieval and precision necessary to move beyond simple search into production-grade context engineering. Moving forward, they are integrating semantic search and advanced AI features to further personalize the user experience through faster, more accurate retrieval. Central to these enterprise engagements is the rise of agentic AI. Customers are moving from passive Q&A to active agents that drive workflows. Precise action requires precise data.

Ash Kulkarni: The conversation has shifted from which model to use to how to feed it the most accurate context. Enterprises realize that to unlock the value of AI, they must bridge the gap between their LLMs and their proprietary, unstructured, and structured data. Elastic makes this AI work. We are the engine that allows enterprises to build production-grade AI systems that are actually worthy of their business. While others offer simple vector databases, we know that vectors alone are not enough. We deliver the full retrieval toolkit from hybrid search to advanced re-ranking, ensuring that agents have the relevant context they need to take precise actions. This ability to bridge enterprise data to the LLM with our platform is directly translating into expanded AI adoption.
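[Editor's note] The "full retrieval toolkit" described here — lexical search fused with vector search before re-ranking — maps onto Elasticsearch's retriever API. A minimal sketch, assuming a recent Elasticsearch release with the reciprocal rank fusion (RRF) retriever; the field names, query vector, and index name are hypothetical, and the request body is built as a plain dict rather than sent to a cluster:

```python
# Sketch of a hybrid (lexical + vector) retrieval request using
# Elasticsearch's RRF retriever. Field names and the query vector
# are illustrative only.

def hybrid_search_body(query_text, query_vector, k=10):
    """Build a hybrid search body: BM25 text match fused with
    approximate kNN vector search via reciprocal rank fusion."""
    return {
        "retriever": {
            "rrf": {
                "retrievers": [
                    # Lexical leg: classic BM25 full-text match.
                    {"standard": {"query": {"match": {"body": query_text}}}},
                    # Semantic leg: approximate kNN over dense vectors.
                    {"knn": {
                        "field": "body_vector",
                        "query_vector": query_vector,
                        "k": k,
                        "num_candidates": 5 * k,
                    }},
                ]
            }
        },
        "size": k,
    }

body = hybrid_search_body("reset my password", [0.12, -0.3, 0.7], k=5)
# With the official Python client this would be sent as:
#   es.search(index="support-docs", **body)
```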

Ash Kulkarni: In Q3, new customer commitments with AI continued to grow, and we now have over 2,700 customers on Elastic Cloud using us as a vector database, with additional customers using us for broader AI capabilities, including Agent Builder and Attack Discovery, bringing our total count of AI customers to over 3,000. We now have over 470 customers with an ACV of $100,000 or greater using us for AI. This includes more than 410 using us as a vector database. Cumulatively, AI use cases have now penetrated over a quarter of our 100K ACV customer cohort. We are seeing sustained demand from the largest companies in the world, alongside interest from the new wave of AI-native companies.

Ash Kulkarni: During the quarter, we closed multiple new logo and expansion deals with AI-first innovators, validating that our platform is the standard for both established enterprises and disruptors. A leading AI recruiting platform used by large enterprises and startups alike chose Elastic's vector database to power their core customer-facing software because our search performance at scale was better than competitors'. An AI-enabled driver and fleet safety company expanded their use of Elasticsearch in Q3 as they scale into new global regions. Elastic provides the real-time retrieval necessary to power their platform, ensuring they can manage increasing data volumes without sacrificing performance. A leading AI-native cybersecurity company focused on AI-automated penetration testing has integrated our SIEM solution into their product. Elastic centralizes all of their logs without complication, allowing them to effortlessly scale through their massive growth trajectory.

Ash Kulkarni: At the heart of these wins is the performance of our Search AI platform. We aren't just adding features, we are aggressively optimizing our engine, focusing our development efforts on delivering market-leading relevance, speed, and efficiency. In the last 18 months, we have driven two orders of magnitude less RAM required for vector search through innovations like Better Binary Quantization, or BBQ, DiskBBQ, and our ACORN-1 filtering algorithm, among other things. This investment makes Elasticsearch vector search up to eight times faster than OpenSearch. Our superior performance led to a seven-figure deal with a global heavy equipment manufacturer. The customer continues to migrate mission-critical workloads over to Elastic Cloud from OpenSearch to improve scalability and performance. They are relying on our platform to power their high-speed search for telemetry data collected via the Starlink network.
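[Editor's note] The RAM reduction described here comes from quantizing dense vectors. In recent Elasticsearch releases, BBQ is enabled per field through the `dense_vector` mapping's `index_options`. A minimal sketch — the index and field names are hypothetical, and the mapping is shown as a plain dict rather than applied to a cluster:

```python
# Sketch of an index mapping that opts a dense_vector field into
# Better Binary Quantization (BBQ). The "bbq_hnsw" index option is
# available in recent Elasticsearch releases; names are illustrative.

mapping = {
    "mappings": {
        "properties": {
            "title": {"type": "text"},
            "title_vector": {
                "type": "dense_vector",
                "dims": 384,            # must match the embedding model
                "index": True,
                "similarity": "cosine",
                # Store compact binary-quantized vectors (large RAM
                # savings) while keeping an HNSW graph for fast
                # approximate kNN search.
                "index_options": {"type": "bbq_hnsw"},
            },
        }
    }
}

# With the official Python client this would be applied as:
#   es.indices.create(index="products", **mapping)
```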

Ash Kulkarni: By leveraging LogsDB, they have achieved a significant reduction in cloud costs while managing seven years of historical customer data. Our focus on performance extends to our partnership with NVIDIA as well, where together we help enterprises deploy AI applications faster without draining IT infrastructure. We recently announced the technical preview of our Elasticsearch GPU plugin for a GPU-accelerated vector database, which allows for 12 times faster indexing. Additionally, the Dell AI Data Platform, now with NVIDIA and Elastic, delivers a tightly integrated AI stack that streamlines the ability to build, deploy, and scale AI. By making Elasticsearch a core component of the Dell and NVIDIA AI factories, we are meeting the critical demand for building AI on customer-controlled infrastructure. As we deepen these technical advantages, we strengthen our technical moat while removing friction from scaling AI.

Ash Kulkarni: This quarter, we reached several product milestones designed to simplify the path from data to action for our customers. We are providing an end-to-end framework for building the next generation of intelligent applications. First, we officially launched the general availability of Agent Builder. Agent Builder allows developers to build secure, context-driven AI agents in minutes. Unlike consumer apps that surf the web, our focus is on internal business applications using company data. We piloted Agent Builder with a global 100 financial group to investigate and troubleshoot its production infrastructure, demonstrating an order of magnitude improvement in performance for complex issues and democratizing the specialized expertise necessary for rapid troubleshooting. An international entertainment and media company created a chat interface for customer interactions. They found the Agent Builder results to be significantly more reliable and accurate than the other LLM-centric approaches they had tried.

Ash Kulkarni: Building an agent is only half the battle. The other half is ensuring that agent has the most relevant information at its fingertips. This quarter, we expanded our Elastic Inference Service to include Jina AI's multilingual re-ranking models. Jina AI delivers a best-in-class model for search accuracy, with Jina V3 currently the number one re-ranker in its model size category on the MTEB English retrieval benchmark, a gold standard for search and RAG relevance. Jina AI's V5 nano and V5 small models continue to outpace peers as well, scoring high in retrieval, re-ranking, and other tasks. By making these models available natively, we are allowing our customers to tune their AI applications for maximum precision and recall. Re-ranking is the critical next step in a context engineering pipeline that ensures the most relevant data is presented to the LLM. Jina's state-of-the-art models deliver superior performance across over 80 languages.
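[Editor's note: the two-stage pattern described here, a fast retriever narrowing the corpus before a more accurate re-ranker orders the survivors, can be sketched generically. The scorers below are toy stand-ins, not the Elastic Inference Service or Jina model APIs:]

```python
from typing import Callable

def two_stage_search(query: str,
                     docs: list[str],
                     retrieve_score: Callable[[str, str], float],
                     rerank_score: Callable[[str, str], float],
                     k_retrieve: int = 100,
                     k_final: int = 3) -> list[str]:
    """Stage 1: cheap retrieval over the whole corpus.
    Stage 2: expensive, precise re-ranking of the small candidate set
    before the survivors are handed to an LLM as context."""
    candidates = sorted(docs, key=lambda d: retrieve_score(query, d),
                        reverse=True)[:k_retrieve]
    return sorted(candidates, key=lambda d: rerank_score(query, d),
                  reverse=True)[:k_final]

# Toy scorer standing in for both a vector index and a re-ranking model.
def overlap(q: str, d: str) -> float:
    return len(set(q.lower().split()) & set(d.lower().split()))

docs = ["elastic vector search guide",
        "cooking pasta at home",
        "guide to search relevance tuning"]
print(two_stage_search("search relevance guide", docs, overlap, overlap, 2, 1))
# prints ['guide to search relevance tuning']
```

The design point is cost: the retriever touches every document, so it must be cheap; the re-ranker is accurate but expensive, so it only sees the short candidate list.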

Ash Kulkarni: While AI provides the reasoning, enterprises still require the reliability of rule-based automation for critical business tasks. This is why we introduced Elastic Workflows in technical preview. Workflows adds automation capability directly into our platform, allowing agents to orchestrate actions across internal and external systems like Slack or ServiceNow. It moves Elastic from being a search box to a complete system of action. Finally, we are delivering on our promise of hybrid flexibility with Cloud Connect for self-managed customers. We recognize that many of our largest customers, particularly in financial services and government, maintain data on premises for regulatory or sovereignty reasons. However, procuring and managing GPU hardware for AI is a massive hurdle for these teams. Cloud Connect allows customers to keep their data local while securely bursting to Elastic Cloud to leverage NVIDIA GPUs for high-performance inference.

Ash Kulkarni: This ability to bridge modern AI capabilities with rigorous enterprise requirements is exactly why we are winning large-scale displacements against legacy providers. As organizations prioritize both innovation and operational efficiency, they're moving away from fragmented legacy tools in favor of Elastic's unified search AI platform. The results of this quarter, accelerating growth, large deal momentum, and major competitive displacements, confirm that our strategy is resonating and that we are winning the race to become the essential infrastructure for the next generation of AI-powered businesses. I want to thank our customers for their partnership, our shareholders for their trust, and most importantly, our employees for their tireless spirit of innovation. With that, I will turn the call over to Navam to review our financial results in more detail.

Navam Welihinda: Thank you, Ash. Good afternoon, everyone. We delivered yet another outstanding quarter. We outperformed the high end of revenue and profitability guidance ranges, driven by another quarter of consistent execution, strong consumption, and strong customer commitments across search, security, and observability. The momentum in our performance throughout this fiscal year is a testament to our team's ability to deliver rapid innovation and sales execution consistently quarter-over-quarter. The ongoing market demand we see is translating to total revenue growth, sales-led subscription revenue growth, and healthy increases in pipeline generation to support our future growth. These factors together underscore our increasingly strategic value as a critical data platform in the age of AI. Our total revenue in Q3 was $450 million, representing growth of approximately 18% as reported and 16% on a constant currency basis.

Navam Welihinda: Sales-led subscription revenue in Q3 was $376 million, growing 21% as reported and 19% on a constant currency basis. We saw commitment contribution from both our self-managed and cloud offerings. Aggregate consumption trends in Q3 remained strong. Our current remaining performance obligations, or CRPO, which is the portion of RPO that we expect to recognize as revenue within the next 12 months, crossed the billion-dollar mark for the first time in Q3. CRPO accelerated to approximately $1.06 billion, growing 19% as reported and 15% on a constant currency basis. In our consumption business, we structure customer contracts based on their annual usage. Our CRPO gives us a very clear view into the revenue we will recognize in the next 12 months, giving us visibility and confidence in our business.

Navam Welihinda: As Ash mentioned, we saw deal momentum continue in Q3. This quarter's strength was balanced across all geographies, and we continued to see customers make multi-year commitments this quarter, which serves as a clear indicator of how our customers view the Elastic platform as a critical foundational element in their long-term data architectures. The positive momentum was reflected in our RPO, which saw strong growth of 22% in the quarter as reported, and 18% on a constant currency basis. Our deal momentum is also evident in the growth of the count of customers with over $100,000 in annual contract value. We ended Q3 with over 1,660 customers with ACV of more than $100,000, growing 14%. Quarter-over-quarter, we added approximately 60 net new $100K ACV customers.

Navam Welihinda: We saw strong field execution and healthy growth across our solutions, where search continues to see ongoing momentum from AI. This demand is benefiting both cloud and self-managed, where both form factors are relevant for AI use cases. We continue to see customers taking a self-managed license and deploying Elastic into their own modern cloud and hybrid environments. The demand reflects customers' preference for Elastic, which uniquely provides the necessary control and cost efficiency for AI initiatives. AI also continues to be a powerful catalyst for customer expansion. 28% of our greater than $100,000 ACV cohort now utilizes Elastic for AI, which includes incremental AI capabilities like Attack Discovery and Agent Builder.

Navam Welihinda: Today, we are still in the early stages of expansion. We see considerable opportunity for ongoing upside for both new and existing customers to accelerate their AI adoption in the years ahead, particularly as they scale into and within our 100K ACV cohort. Now turning to Q3 margins and profitability. I will discuss all measures on a non-GAAP basis. Our commitment to balancing growth with disciplined spending translated into robust operating leverage and strong bottom-line results. We continue to focus on costs and efficiency in our business. We recorded subscription gross margins of 82% and total gross margins of 78%, delivering an operating margin of 18.6%. The outperformance on Q3 operating margin was the result of our strong revenue performance, the sustained leverage in our model, as well as some Q3 expenses moving into Q4.

Navam Welihinda: Due to this outperformance, we now expect our full year margins to come in slightly ahead of what we previously anticipated, with updated FY 2026 operating margin guidance now at 16.3%. Regarding cash flow, adjusted free cash flow was approximately $54 million in Q3, representing a margin of approximately 12%. Our cash flows are expected to fluctuate on a quarterly basis based on the timing of bookings and collections related to enterprise booking seasonality. We continue to manage cash flow on a full year basis. For fiscal 2026, we do not see any change in our full year outlook, where we continue to expect to sustain the level of adjusted free cash flow margins that we achieved in fiscal 2025. We have made significant progress on the $500 million share repurchase program that we announced in October.

Navam Welihinda: During Q3, we returned approximately $186 million to shareholders, representing purchases of approximately 2.4 million shares. Cumulatively, we have repurchased 3.8 million shares. I mentioned at our Financial Analyst Day in October that we expect to use more than 50% of the $500 million authorized amount in fiscal 2026, and we have already exceeded this goal. As of the end of Q3, we have completed 60% of our repurchase program and we are continuing our repurchase program here in Q4. Let's move to our outlook for Q4 and the remainder of fiscal 2026.

Navam Welihinda: For Q4 of fiscal 2026, we expect total revenue in the range of $445 million to $447 million, representing 15% growth at the midpoint or 13% constant currency growth at the midpoint. We expect sales-led subscription revenue in the range of $371 million to $373 million, representing 18% growth at the midpoint or 15% constant currency growth at the midpoint. We expect non-GAAP operating margins to be approximately 14.5%. We expect non-GAAP diluted earnings per share in the range of $0.55 to $0.57, using between 105.5 million and 106.5 million diluted weighted average ordinary shares outstanding.

Navam Welihinda: Based on our Q4 guidance, we are raising our full-year total revenue and sales-led subscription revenue targets as well. We expect total revenue in the range of $1.734 billion to $1.736 billion, representing approximately 17% growth at the midpoint or 15% constant currency growth at the midpoint. We expect sales-led subscription revenue in the range of $1.434 billion to $1.436 billion, representing 20% growth at the midpoint or 18% constant currency growth at the midpoint. We expect non-GAAP operating margin for full fiscal 2026 to be approximately 16.3%.

Navam Welihinda: We expect non-GAAP diluted earnings per share in the range of $2.50 to $2.54, using between 107 million and 108 million diluted weighted average ordinary shares outstanding. A few other financial modeling points to keep in mind. The diluted weighted average shares outstanding reflect only share buybacks completed as of 31 January 2026. As you compare sequential quarters, keep in mind that Q4 has three fewer days than each of the first three quarters of the year, which creates a sequential headwind to revenue that we have accounted for in our guidance. Also, as is typical in prior Q4 periods, we expect to see seasonally higher expenses related to the timing of employee benefit costs.

Navam Welihinda: These expenses were already part of the guidance that we had initially laid out for the year. As in past years, we finalize our plans for the upcoming fiscal year during Q4. We will provide our initial FY 2027 guide during our earnings call in May. In summary, Q3 was another very strong quarter at Elastic. Consistent sales execution throughout FY 2026 continues to drive our sales-led subscription revenue growth expectations higher for the year, validating the durability of this business motion. As I said last quarter, while quarterly revenue can naturally vary in a consumption model, our strong customer commitments drive strong annual growth. Fueled by a highly differentiated platform and the expanding value we deliver to our customers, we remain on track to achieve our medium-term targets for both sales-led subscription revenue growth and adjusted free cash flow.

Navam Welihinda: Looking forward, we're confident in our ability to continue to drive profitable growth. We are the critical technology that accelerates data discovery, secures infrastructure, and maximizes application performance. With that, I'll open it up for Q&A.

Operator: We will now begin the question-and-answer session. To ask a question, you may press star, then 1 on your telephone keypad. If you are using a speakerphone, please pick up your handset before pressing the keys. To withdraw your question, please press star, then 2. Our first question today is from Sanjit Singh with Morgan Stanley. Please go ahead.

Sanjit Singh: Thank you for taking the questions and congrats on the stability that we're seeing across the business. Navam, I wanted to go back to some of the themes from the Investor Day a couple of months ago. There was a data point that you provided around the AI-native customers, or the AI customers, being a relatively small amount of the customers in fiscal year 2024 but driving an outsized degree of expansion, that sort of year one to year two expansion. The gist of this question is that as, you know, we get to like 25% penetration of your 100K customer cohort, is there an opportunity here for growth to not just be stable, but actually to accelerate on a more sustained basis as we hit those critical tipping points, if you will?

Navam Welihinda: Yeah. Thanks for the question, Sanjit. The trends that we laid out during the Financial Analyst Day for the generative AI cohort, they remain the same. We continue to perform well, and growth in those generative AI cohorts is as strong today as it was when we disclosed it to you during Financial Analyst Day. We're seeing these tailwinds right now, and we're seeing more of our customers reach the 100K mark. Now remember that each of these customers at the 100K mark is also early in their journey. There's this other dimension of additional penetration and maturation in their own AI journey, which will drive faster growth as well. We're seeing the tailwinds right now.

Navam Welihinda: We've seen tailwinds that average to 5%, but obviously there are some customers that have higher growth than that. To answer your question, yeah, absolutely, there is a possibility. The art of the possible is there for us to actually accelerate beyond that 5% that we laid out during Financial Analyst Day. The trends remain positive.

Ash Kulkarni: Sanjit, just to add to that, that is exactly why we are so focused on the penetration of AI within our customer base. Right now, every quarter, you're seeing us increase that penetration. The penetration, you know, initially starts with them using us in some small way, and as that usage grows, as you rightly pointed out, that's gonna add to the consumption, it's gonna add to the overall revenue, and that's gonna show in, you know, the continued strength and acceleration of the business.

Sanjit Singh: Understood. Ash, maybe a question for you. You made the point in your script about vector search and vector databases not being enough in terms of building a resilient and powerful AI application. I think a lot of people would agree with that statement. When you brand the company as a context engine, what are the core pieces that are mandatory to secure status as the leading provider of context for AI applications?

Ash Kulkarni: That's a great question. I think the most important thing to keep in mind is, you know, context is gonna change from task to task. The data platform, the context engineering platform that you provide, needs to be able to do a whole bunch of things all together in a very consistent way. The first is the ability to bring in any and all kinds of data. As you know, you know, we have some unique capabilities in our ability to bring in not just structured information, but also unstructured, really, really messy information. The second is to then take that data and convert it into vectors for vector search, which is a very powerful technique, especially in the AI world, for semantic search.

Ash Kulkarni: But then also to be able to mix it with hybrid search techniques, which includes textual search, and being able to re-rank against multiple techniques to get the most accurate context. The Jina AI embedding models, the Jina AI re-ranker models, those are a key part of that overall platform infrastructure. On top of it, you need something that will allow you to assemble agents using all of these capabilities. That's what Agent Builder was all about. As you know, it's a relatively new feature from us and a relatively new capability, but we are seeing great traction and adoption within our customer base. On top of it, you need workflows, because agents are not just about chat anymore. They're not just about conversations.

Ash Kulkarni: They're about taking precise actions, and that's where the workflow functionality that we released becomes really important. Lastly, the ability to monitor all of this, and that's where our LLM observability functionality becomes key. We believe that it's all of these capabilities, Sanjit, that taken together make the platform a very compelling platform for context engineering. On top of that, we've also added our Elastic Inference Service. You don't need to bring your own LLM. We can help you proxy to any LLM of choice that you might wanna use. We integrate with pretty much all of them.

Sanjit Singh: Appreciate the color, Ash. Thank you.

Operator: The next question is Rob Owens with Piper Sandler. Please go ahead.

Rob Owens: Great. Thank you very much for taking my question. I'll apologize up front for the flurry of questions in one here, but I will keep it to one question, but maybe three parts. Really want to focus on the outperformance in other subscription. I understand you're gonna meet customers where they want to buy. I guess upfront, was some of that strength potentially pushouts that you saw in the prior quarter? If I look at your sales-led subscription forecast for Q4 and the fact it's down quarter-over-quarter, which you haven't seen historically, it's usually a little bit up. Is that really a function of the strength you saw here in the January quarter or something else to be read into that?

Rob Owens: Lastly, when we think about monetization of self-managed versus cloud customers and your ability to expand them over the coming years, can you maybe articulate the difference between the two, if there's much there? Again, apologize for the three questions, but hopefully they're brief answers.

Ash Kulkarni: Yeah, Rob, let me start. This is Ash. Let me start and then pass it on to Navam. You know, in terms of our strength in self-managed, this is not just about pushouts or anything of that sort. We are continuing to see a lot of strength in our self-managed business. You know, at the end of the day, what we are seeing now, especially with AI, is a lot of customers are applying AI on data that they consider to be extremely critical, extremely sensitive. This is not just with government customers, this is also in other regulated industries. For that reason, they're choosing or they're preferring to keep the data where it's within their control, within their environment. That doesn't always mean in their own data centers.

Ash Kulkarni: It might also mean within their own cloud VPCs, and we give them the flexibility to be able to do that. These are modern workloads that continue to grow as their usage of AI grows, and we are gonna continue to benefit from it, which is why we believe it is really important to not just look at cloud, but to look at the whole picture and take into account the strong growth that we are seeing even on self-managed. I'll let Navam address the other questions.

Ash Kulkarni: It might also mean within their own cloud VPCs, and we give them the flexibility to be able to do that. These are modern workloads that continue to grow as their usage of AI grows, and we are gonna continue to benefit from it, which is why we believe it is really important to not just look at cloud, but to look at the whole picture and take into account the strong growth that we are seeing even on self-managed. I'll let Navamad address the other questions.

Speaker #4: So these are modern workloads that continue to grow as their usage of AI grows. And we are going to continue to benefit from it, which is why we believe it is really important to not just look at cloud, but to look at the whole picture and take into account the strong growth that we are seeing even on self-managed.

Speaker #4: And I'll let Navam at the address the other questions.

Navam Welihinda: Yeah, Rob, I'll address your quarterly sequential question. Overall, I'll start with how the business is doing. We're continuing to execute very well on the sales-led motion. This is another quarter of good execution from the sales side, and we saw that play out in our CRPO and RPO numbers accelerating as well. If you're looking at commitments, we're seeing good commitment volume, and there's no deceleration there. On top of that, the pipeline is very healthy and growing each quarter. Overall, from a business perspective, we're very happy with where the quarter turned out and very positive about future quarters as well.

Navam Welihinda: That leads us to the guide. When you think about the guide, we always guide with an appropriate amount of prudence on what we can achieve, and we outperform every quarter. When you look at historical numbers versus actuals and guidance, you're comparing an actual number against a guidance number, and the guidance number has risk incorporated into that forward-looking projection. That's my first point. The second point I'd make is that Q4 has three fewer days, which translates to a 3% headwind, or a $14 million to $15 million headwind for us on a revenue basis, because there are simply fewer days of revenue to recognize. All of that is incorporated in the guide. And if you look at last year's Q4 guide, or Q4 guides in the past, there have been occasions where we've guided lower than the current quarter. Just keep that in mind. We remain well on track to achieve our midterm targets, and we feel very positive about the strength of the business itself.

Rob Owens: Great. Thank you.

Operator: The next question is from Matt Hedberg with RBC. Please go ahead.

Matt Hedberg: Great. Thanks for taking my question, guys. Ash, I wanted to ask you about AI. Obviously, we've all seen it pressure the software market, and I appreciate your comments at the start of the call; it was really helpful to get your perspective on AI, and it seems like there's a lot of great momentum from a customer perspective. I guess my question is, when we're looking at these frontier models, do you see them as future competition or more of a partnership opportunity?

Ash Kulkarni: You know, really, in our opinion, AI doesn't displace us; it depends on us. If you think about these frontier models, they're amazing reasoning engines. The way I think about them is that they are going to be the operating systems of tomorrow. But just as operating systems today require data systems to feed them appropriate data and context to actually build applications with, you're going to need the same thing going forward. Our role in this whole ecosystem is to make sure that we can very quickly, in real time, across all of the petabytes of data that every organization holds, give the right context to these LLMs so they can do their job. That's the reason why I believe that in the world of tomorrow, you're going to have agents talking to each other. You're going to have agents that you build with Elastic Agent Builder that are talking to Claude Cowork, that are talking to things you build with OpenAI Frontier. We already support the MCP and A2A protocols that allow for that kind of communication. So this is a world where we feel we have a tremendous position, with the capabilities of our vector database and our entire context engineering platform, to become a critical part of the infrastructure going forward. We're already partnering with the hyperscalers, and we already integrate with all of these frontier-class models today.

Matt Hedberg: It's a great perspective. Thank you for that. Then maybe just one quick follow-up about Elastic internally using AI. How are you seeing some of the tangible benefits, and how might that impact headcount in the future?

Ash Kulkarni: Yeah, look, we are all in on AI, not just in terms of what we are doing externally with the platform that we are building, but also in terms of how we are using AI internally. Just to give you some context on this: two years ago, we built our first agent, a support agent, within the company, and it's been in production for a long time now. It's what our customers first hit when they have support questions. The number of queries it's able to answer and the number of support tickets it's able to deflect have not only improved the overall experience for our customers when they come to us for support, but have also significantly reduced the demand on headcount on our side. So in the last two years, even as our business has been growing, and as you can imagine, support workloads typically grow with the business, we have been able to manage that workload growth without adding any headcount to the support team. In other parts of the business, whether it's HR, finance, or legal, we are heavily using AI tools. Some of these are built on our stack; some are external products we leverage. Even in engineering, we are finding tremendous value in using multiple different code generation tools within the company. Overall, we believe this is going to help us not just accelerate the pace of innovation, which we are already seeing, but also improve productivity and the overall efficiency of the business. That's what's exciting about this: we are able to help our customers with this, but we are also able to benefit from it ourselves.

Matt Hedberg: Got it. Thanks.

Operator: The next question is from Brian Essex with J.P. Morgan. Please go ahead.

Brian Essex: Hi, good afternoon. Thank you for taking the question. I appreciate your response to Matt's question with regard to your vector database capabilities and context engineering platform. As you look at the changing landscape, at different approaches and different ways to think about things, how should we think about the platform and its ability to adapt to some of those approaches? For example, the page-index approach to RAG. If they solve the cost and latency issues involved with that approach, are you well positioned to benefit from something like that and pivot with your approach?

Ash Kulkarni: Yeah, look, RAG, retrieval-augmented generation, has itself progressed a lot over the several years since it was first introduced as a concept. But fundamentally, this comes down to finding the most appropriate context that is relevant for the LLM to do its job. Sometimes that requires you to understand specific data relationships that might exist. Sometimes it requires you to just search through all of your data. Sometimes it requires you to understand specific things, like preferences and so on, that you might have captured in other data systems. It's an amalgamation of all of this. As RAG continues to evolve and these techniques become more and more sophisticated, we are on the leading front of capturing more than one single technique in our platform. We were one of the first to adopt hybrid search, and we were the first to talk about it. Since then, we have continued with that kind of momentum. So absolutely, I feel very confident that we are going to be on the bleeding edge. At the end of the day, this is what Elastic was born to do. We've always been in the business of relevance. Without relevance, you don't get good search. Without relevance, you don't get good, accurate AI.

Brian Essex: Great, that's super helpful. Maybe just one quick follow-up: any traction from the recent CISA win that you had? Are any federal agencies leveraging that for SIEM referenceability, and are you seeing better activity on the back of that win?

Ash Kulkarni: Yeah, thank you. It's been a great success for us already. I think we mentioned in our press release as well that SIEM as a service with CISA continues to grow, and we saw additional agencies coming on board even in Q3. I would expect that CISA win to be just the beginning of multiple agencies coming onto that service over the next several quarters. Fundamentally, CISA is considered the primary agency responsible for cybersecurity in the civilian government in the United States, and that kind of endorsement goes a long way. It's a very exciting win, and like I said, we are going to benefit from it for many quarters and many years to come.

Brian Essex: Great. Super helpful. Thank you very much, and congrats.

Ash Kulkarni: Thank you.

Operator: The next question is from Brent Thill with Jefferies. Please go ahead.

Brent Thill: Thanks. Hey, Ash. Just on the CRPO: 15% constant currency, 15 last quarter. Good mid-teens growth, but I think everyone is asking why we aren't seeing a faster inflection. I know you have a true north of 20. It seems like the numbers support that you can accelerate to 20, but I'm just curious how you bridge to 20, and perhaps why you're not seeing a little bit stronger AI tailwind in the near term.

Speaker #4: Yeah, thanks for the question, Brent. So I'll start. CRPOs crossed over a billion were at 19% growth right now. RPOs at 22% growth. That's the best we've seen in two years and we're very happy with the progress that we're making.

Ash Kulkarni: Yeah, thanks for the question, Brent. I'll start. you know, CRPOs crossed over $1 billion. we're at 19% growth right now. RPO is at 22% growth. That's the best we've seen in 2 years, and we're very happy with the progress that we're making. If you just look at the absolute dollar additions that we added in the quarter, it's progressing very, very well. That's all pointing to the core things that are driving that CRPO growth, which is

Ash Kulkarni: Yeah, thanks for the question, Brent. I'll start. you know, CRPOs crossed over $1 billion. we're at 19% growth right now. RPO is at 22% growth. That's the best we've seen in 2 years, and we're very happy with the progress that we're making. If you just look at the absolute dollar additions that we added in the quarter, it's progressing very, very well. That's all pointing to the core things that are driving that CRPO growth, which is

Speaker #4: And if you just look at the absolute dollar additions that we added in the quarter, it's progressing very, very well. So that's all pointing to the core things that are driving that CRPO growth, which is strong customer commitments, which now we've been talking about for a couple of quarters now.

Navam Welihinda: Strong customer commitments, which now we've been talking about for a couple of quarters now, and it's been yet another quarter of very good sales execution leading to strong customer commitments. The AI tailwinds we talked about during Financial Analyst Day, we're seeing them right now, and they are continuing to grow as we see more and more of the 100 K have or adopt AI workloads from us. We think that there's a good, strong trajectory from this point ahead as we see more AI penetration among our 100 K customer base.

Navam Welihinda: Strong customer commitments, which now we've been talking about for a couple of quarters now, and it's been yet another quarter of very good sales execution leading to strong customer commitments. The AI tailwinds we talked about during Financial Analyst Day, we're seeing them right now, and they are continuing to grow as we see more and more of the 100 K have or adopt AI workloads from us. We think that there's a good, strong trajectory from this point ahead as we see more AI penetration among our 100 K customer base.


Ash Kulkarni: The other thing that I will say to this, Brent, is that if you look at the full-year guide for sales-led subscription revenue, you can see that the strength in our business continues. Look, for us, the midterm guide that we laid out is not the place where we end up; that's the place we believe we can go beyond. If you remember, we talked about 20-plus, and really that's the way we see it. As more and more customers adopt our AI functionality, given the fact that those cohorts tend to grow and expand faster, we feel very, very good about how we are tracking to that midterm. And as that traction continues, we feel good about even exceeding what we've talked about in the past.

Operator: The next question is from Howard Ma with Guggenheim. Please go ahead.

Howard Ma: Hey, great. Thanks. I wanted to ask about cloud, and I guess this one's for Navam. I want to throw out a caveat first, which is that I appreciate your deployment agnosticism and the fewer days in Q4. When I look at cloud revenue in Q4 versus Q3 in FY22 and earlier, there was more of a sequential step-up than in FY23 through FY25, which were obviously impacted by industry-wide cloud optimizations, and Elastic also had company-specific go-to-market issues. But now that go-to-market execution has improved significantly, and given the visibility that you now have into how large customers ramp consumption relative to their commits, including some of the $10 million-plus TCV contracts that you signed last quarter, the question is: is there any reason why the sequential cloud growth in Q4 would not be more in line with the earlier years?

Navam Welihinda: I'll start off with what I always start off on, which is that sales-led subscription revenue growth is the right metric for you to focus on as the barometer of the success of the company. I talked about this during our prepared remarks as well. There are multiple examples, including this quarter, of AI workloads being sold as self-managed and deployed either in the customer's cloud or in their hybrid environments. Sales-led subscription grew a healthy 21% this year. If you look at just cloud, and the number there again is the sales-led cloud number, that grew 27% year-over-year this quarter. So we're seeing very good traction on the metric that matters to us, which is sales-led subscription revenue, and the annual cloud number this quarter was very good as well at 27%. On the forward quarter, number one, you have three fewer days. And the forward quarter is a risk-adjusted number, so you can't really compare an actual to a guidance number. But the point I'd like to make is that we're seeing very strong commitments and very strong performance on sales-led.

Howard Ma: Okay, thanks.

Operator: The next question is from Ryan MacWilliams with Wells Fargo. Please go ahead.

Deshawn: Hey, team. This is Deshawn on for Ryan MacWilliams. I wanted to ask: based on some of the work we've been doing, it really seems that the number of agents and AI services in production has increased over the past couple of months, and I wanted to hear what you're seeing within your customers. Are you seeing the types of AI use cases broaden out compared to what you were seeing maybe two quarters ago, and how is that impacting usage and spend among those customers?

Ash Kulkarni: Yes, we are seeing the usage broaden out in the sense that we are seeing more and more variety of use cases that involve AI. Eight quarters ago, the bulk of what we were seeing was only around vector databases, vector search, hybrid search, semantic search. It was mostly around chat-style interface work. Now we are seeing agentic workflows being put together, not just around what you would typically think of as search-related workflows, but also around security workflows and observability workflows. That was the reason we gave the stat around our total count of customers using us for various AI use cases beyond just the vector database. That includes things like Agent Builder; that includes things like Attack Discovery. In these kinds of scenarios, people are trying to automate their SOC workflows, their cybersecurity workflows, for detection, for remediation. They're trying to do the same for SRE workflows around observability. So the variety of use cases is growing, and as that grows, we see an opportunity not just in our core search business, but also in the work that we are doing in security and observability.

Deshawn: Thanks, guys. Appreciate it.

Operator: The next question is from Miller Jump with Truist Securities. Please go ahead.

Miller Jump: Hey, great. Thank you for taking the question, and congrats on the sales-led momentum. Ash, you mentioned a MongoDB competitive win in the prepared remarks. We hadn't heard as much about this head-to-head between the two of you until fairly recently. So are you seeing MongoDB increasingly in bake-offs as customers look to build AI apps, or is that more of a one-off?

Ash Kulkarni: No, this was a situation where the customer had started to use that technology for a basic search application. They had some issues scaling it, and as they were trying to build a more scalable solution, especially for hybrid search, they realized they needed something that could perform, and that was the customer win I talked about. At the end of the day, where we typically play is in the area of unstructured data. We don't tend to see them as much, but from time to time you do see these kinds of situations.

Miller Jump: Thanks. And if I could just ask a quick follow-up for Navam. As large deals become more of a contributor in your go-to-market strategy moving upmarket, can you just remind us how you're handling those large deals in your guidance process, and any considerations around seasonality there? Thanks.

Navam Welihinda: Yeah. Seasonality-wise, I think it just follows the typical enterprise seasonality pattern, where they end up being more tail-end weighted in Q3 and Q4. We talked about large deals last quarter; they happen every quarter, it's just that the volume of bookings is bigger toward the tail end of the year. In terms of how we handle it, I think this is a natural byproduct of being successful with our customers, particularly the larger customers within the G2K, so we welcome it. When we look at our guidance and what we expect the full year to be, we naturally take a haircut on specific deals that could move from one quarter to another. That's how we incorporate it into our guidance: a risk-adjusted number, not counting on everything going our way.

Miller Jump: Thanks very much.

Operator: The next question is from Koji Ikeda with Bank of America. Please go ahead.

George McGreehan: Hi, this is George McGreehan on for Koji. I appreciate you guys taking our questions today. I wanted to ask: in the conversations that you have with customers about their strategy around adopting AI, how would you say the tone of those conversations differs versus a year ago? And what inning are they in today, versus maybe a year ago, in their adoption journey with Elastic? Thank you.

Ash Kulkarni: The general tone is definitely one of greater enthusiasm for AI. I think there have been enough proof points now of AI helping in all kinds of use cases, whether around code development, customer support, or legal e-discovery; lots and lots of use cases across all functions. So we are seeing the conversations be less about evangelism and more about helping them put together these kinds of sophisticated agentic applications. There's definitely been maturity. In terms of the total number of agents that people have within their organization, that number is still in the early days. If you think about the total number of business processes and workflows that can be automated by AI, I think you have to be realistic that we are still in the early days, because AI is a pretty powerful and transformative capability. What you can do with these LLMs in terms of reasoning can be applied to many different functions and work processes. So we believe the opportunity is still very significant and still ahead of us.

Operator: The next question is from Mike Cikos with Needham. Please go ahead.

Matt Kutrian: Hey, guys. This is Matt Kutrian on for Mike Cikos over at Needham. Thanks for taking our questions. With all the advancements you're making to search with things like the Jina re-ranking models, are you able to charge customers more, or is the improved speed and accuracy more of an acquisition vehicle?

Ash Kulkarni: We do charge in terms of consumption, right? We have a consumption model, as you know. Pretty much everything that you do on our platform is metered, effectively based on compute, based on storage, and so on. For anything that's LLM- or model-related, it's based on tokens. All of our pricing is public on our pricing pages. But yes, with these newer models we are monetizing everything, and as usage continues to grow, as customers do more and more on our platform, that is what drives revenue for us.

Matt Kutrian: Got it. Very helpful. Thanks. And then maybe just taking a different slice at the guidance question here. You beat on the Q3 guide in constant currency, and then you raised the constant-currency guide for sales-led subscription revenue, but you left constant currency unchanged for the full-year guide. I can appreciate the three fewer days and the risk adjustment, but that would have been baked into the prior guide. Can you just help walk through the mechanics of why that wouldn't have increased?

Navam Welihinda: Yeah, I mean, it's quite simple. The number that we care about is sales-led subscription revenue. We handily beat that number this quarter, and we raised more than we beat. That's a reflection of what we think is happening with the business and the positive momentum that we're seeing on the sales line. Overall, we're not thinking about it much more than that: we feel good about the forward momentum of sales-led subscription revenue, we beat the number, and we're raising more than we beat.

Speaker #6: Got it. Thank you.

Matt Kutrian: Got it. Thank you.

Matt Calkins: Got it. Thank you.


Operator: The next question is from Eric Heath with KeyBanc Capital Markets. Please go ahead. Mr. Heath, your line is open on our end. Perhaps it's muted on yours. Showing no further questions, this concludes our question and answer session. I would like to turn the conference back over to Ash Kulkarni for any closing remarks.


Ash Kulkarni: Thank you all for joining us today. We here at Elastic are very proud of our business results and excited about the opportunity ahead. Thank you.


Operator: The conference is now concluded. Thank you for attending today's presentation. You may now disconnect.


ESTC — Thursday, February 26th, 2026 at 10:00 PM