Q4 2025 Blaize Holdings Inc Earnings Call

Operator: Good day, and thank you for standing by. Welcome to the Blaize Q4 2025 Earnings Conference Call. At this time, all participants are in a listen-only mode. Please be advised that today's conference is being recorded. After the speaker's presentation, there will be a question-and-answer session. To ask a question, please press star one one on your telephone and wait for your name to be announced. To withdraw your question, please press star one one again. I would now like to hand the conference over to your speaker today, Lana Adair, Investor Relations.


Lana Adair: Good afternoon, everyone. Thank you for joining Blaize Holdings, Inc.'s Q4 2025 Earnings Call. Before we begin the prepared remarks, we would like to remind you that earlier today, Blaize Holdings, Inc. issued a press release announcing its Q4 and full year results. Earnings materials are available on the investor relations section of Blaize Holdings, Inc.'s website. Today's earnings call and press release reflect management's view as of today only and include statements related to our 2026 financial guidance, revenue, gross margin, competitive position, anticipated industry trends, market opportunities, products, and financing opportunities, all of which constitute forward-looking statements under the federal securities laws. Actual results may differ materially from those contained in or implied by these forward-looking statements due to risks and uncertainties associated with our business.


Lana Adair: For a discussion of material risks and other important factors that could impact our actual results, please refer to the company's Form 10-K for the year ended December 31, 2025, including the Risk Factors section therein, and today's press release, both of which can be found on our investor relations website. Any forward-looking statements that we make on the call are based on assumptions as of today, and, other than as may be required by law, we undertake no obligation to update these statements as a result of new information or future events. Information discussed on this call concerning Blaize Holdings, Inc.'s industry, competitive position, and the markets in which it operates is based on information from independent industry and research organizations, other third-party sources, and management's estimates.


Lana Adair: These estimates are derived from publicly available information released by independent industry analysts and other third-party sources, as well as data from Blaize Holdings, Inc.'s internal research. These estimates are based on reasonable assumptions and computations made upon reviewing such data and Blaize Holdings, Inc.'s experience and knowledge of such industry and markets. By definition, assumptions are subject to uncertainty and risk, which could cause results to differ materially from those expressed in the estimates. During this call, we will discuss certain non-GAAP financial measures. These non-GAAP financial measures should be considered as a supplement to, and not a substitute for, measures prepared in accordance with GAAP. For a reconciliation of non-GAAP financial measures discussed during this call to the most directly comparable GAAP measure, please refer to today's press release. Now I'd like to introduce Dinakar Munagala, the CEO of Blaize Holdings, Inc.


Dinakar Munagala: Good afternoon. Over the course of 2025, we grew our revenue from approximately $1 million in Q1 to $23.8 million in Q4. We exceeded the upper end of our revenue guidance, representing approximately 20 times growth over the year. This reflects strong momentum across inference infrastructure, sovereign AI, and public safety applications. Customers today evaluate AI infrastructure on three things: cost per inference, power efficiency, and revenue per rack. At the same time, many enterprise inference workloads do not require the largest models. We are seeing increasing adoption of smaller, task-specific models that deliver strong results with far greater efficiency and faster time to real business outcomes. That is where Blaize is focused. Over the past several months, we strengthened our execution.


Dinakar Munagala: We brought on a Chief Revenue Officer, Stephen Patak, to scale our commercial efforts globally. In January, we signed an MOU with Nokia's Asia Pacific division, and we are now advancing that collaboration through an innovation hub in Singapore to build and validate our combined AI platform. We will launch this at GITEX Asia in Singapore, where we'll present to enterprises, governments, cloud providers, and data center operators across Asia Pacific. We are already seeing early traction taking shape across the region, spanning cloud infrastructure, sovereign AI, and real-world applications. Many of these opportunities follow multi-phase models where our systems expand as workloads grow. One of the most concrete examples is in India, where we signed an MOU with the government of Telangana supporting its AI cloud innovation hub.


Dinakar Munagala: This foundational platform spans mining safety, smart cities, and agriculture, where we jointly enable real-time intelligence across worker safety, equipment operations, and environmental conditions. In China, we are also expanding our footprint with regional solution providers focused on AI data center build-out, driving assisted living and smart community solutions in patient safety and remote monitoring, with enterprise engagements underway. In Korea, we are working with solution partners like GSIL, specializing in factory safety and industrial monitoring. Across Southeast Asia and Australia, we're working with Nokia and vertical systems integrators to explore AI use cases in urban safety, retail analytics, maritime infrastructure, and airport security. In the US, Europe, and Latin America, we're expanding engagements across enterprise and data center environments focused on AI infrastructure, public safety, industrial robotics, and autonomous operations.


Dinakar Munagala: For the third consecutive year, we will showcase our solutions at ISC West, the largest converged security trade show. The Middle East and North Africa continues to be a strong growth market. Governments and enterprises are investing in security, visibility, and sovereign infrastructure. In Saudi Arabia, we support energy and urban city use cases. In the UAE, we support civil defense, aerial monitoring, and drone detection. In North Africa, we support large-scale industrial ecosystems. Blaize enables real-time detection and monitoring, supporting infrastructure security across energy, industrial, and transportation environments. Our capabilities extend into robotics and autonomous systems. These systems require low latency and efficient inference where hybrid architectures become essential. This shifts AI from centralized data centers to distributed infrastructure. AI infrastructure is no longer limited to hyperscalers. It is now distributed across regional cloud providers, data center operators, and sovereign programs. AI environments today remain highly fragmented.


Dinakar Munagala: Thousands of vendors deliver narrow AI capabilities focused on vision, documents, identity, or automation. Organizations are left integrating multiple systems before they can deliver real outcomes. The opportunity is to move from fragmented tools to integrated services. These capabilities are consolidating into platforms, and that transition is happening now. What ties all of this together is the underlying economics. At scale, this is about cost, efficiency, and utilization. In our analysis, GPU-only infrastructure can scale revenue but remains constrained by high and recurring compute costs. By contrast, the Blaize model is designed to be cash flow efficient from the start, driven by lower silicon cost and power efficiency. A hybrid configuration combining GPUs and Blaize inference acceleration can deliver roughly 50% lower infrastructure cost with approximately 60% lower power consumption, or more than a 2x improvement in efficiency.


Dinakar Munagala: To support this model, we are progressing toward the initial release of the Blaize AI services platform in Q2. This is not just about cost. It brings fragmented AI capabilities into a unified services layer and enables customers to move faster from infrastructure to real-world outcomes. The platform combines inference silicon, intelligent software, and API-based AI services. For AI providers, instead of relying on GPU rental, Blaize enables operators to monetize AI outcomes. Revenue comes from inference transactions, AI events, and application services. As services scale, revenue grows faster than cost, driving operating leverage and margin expansion. In our analysis, traditional infrastructure models remain cost-constrained over time. The Blaize AI services model enables more efficient scaling of revenue with improving economics as services grow. This is the difference between scaling compute and scaling a business. AI infrastructure investment continues to expand globally.


Dinakar Munagala: This phase of the industry is no longer defined by larger models. It is defined by monetizing inference at scale. Platforms that combine efficient architecture with AI services are defining how AI operates today. Blaize is built for that model. Our focus remains on expanding commercial activities, scaling AI services, and converting pipeline into revenue. Thank you. I will now hand this off to our CFO, Harminder Sehmi.


Harminder Sehmi: Thank you, Dinakar, and good afternoon, everyone. I'm pleased to share our Q4 and full year 2025 results today. I'd like to begin with a few highlights. This is the fourth consecutive quarter where we exceeded our revenue guidance range since we became a public company in January 2025. Revenue of $38.6 million for the full year 2025 outperformed the upper end of our guidance by $600,000. Revenue for Q4 doubled to $23.8 million from the prior quarter. Adjusted EBITDA loss was $50.5 million, an improvement of $4.5 million from the lower end of our guidance range for the year. This includes a $1 million benefit from higher gross margin and $3.5 million in lower OpEx and deferred technology costs.


Harminder Sehmi: Focusing on revenue and gross margin. In Q4, we delivered revenue of $23.8 million, exceeding the upper end of our guidance by $700,000. This performance was driven by customer deployments of servers in the Asia Pacific region, supporting AI solutions into the smart health space. We are seeing continued demand as customers expand into AI data center infrastructure build-outs, which we expect to contribute to future revenue growth. Turning to the full year, 2025 marked an important milestone as our first full year of operations as a public company. We're pleased with the progress we achieved. Revenue of $38.6 million was up significantly from $1.6 million in the prior year. This reflects our success in laying the foundations to meet the rising demand for AI solutions across high growth markets.

Harminder Sehmi: Our growing partnerships with systems integrators and software providers are key to accelerating and streamlining the adoption of AI solutions powered by Blaize hardware and software. Let me now address the gross margin trends. Gross margin for Q4 was 11%, and it was 16% for the full year. In prior updates, I have indicated that this approach has been important to our strategic plans, as we've been able to more rapidly seed substantial commercial relationships. I expect the quarterly trend to continue through H1 2026 as we adapt to the global memory constraints. Blaize hardware and software are expected to form a higher mix in our AI solutions from H2 2026. This should result in gross margins of between 30% and 35% in Q4.

Harminder Sehmi: Turning to our Q4 and full year net loss and operating expenses. The GAAP net loss for the full year was $206.9 million, compared to a GAAP net loss of $61.2 million in 2024. I'd like to spend a few moments breaking these numbers down to provide clarity on the underlying results. Key line items in our 2025 financials were a non-cash $226 million charge arising from the change in fair value of legacy Blaize convertible notes and warrants, a non-cash $37.5 million share-based compensation charge, and transaction expenses of $12 million related to going public.

Harminder Sehmi: These were offset by a $123.2 million credit, inclusive of both cash and non-cash components, primarily driven by the change in value of warrants and earn-out shares, among other items. The adjusted EBITDA loss for fiscal 2025 was thus $50.5 million, up from a loss of $42.7 million in the prior year. The key reasons for the year-on-year increase were $3.2 million in building our teams, $2.4 million of investment in our technology roadmap, an increase of $1.5 million in marketing, and $5 million in new expenses related to our preparations to operate as a public company, some of which are not expected to recur in 2026. I will now review operating expenses on a sequential basis, Q4 versus Q3 2025.

Harminder Sehmi: In Q4 2025, total operating expenses of $14.5 million, excluding $9.5 million in stock-based compensation, were largely flat versus the $14.9 million in Q3, excluding the stock-based compensation also of $9.5 million. Research and development expenses and sales, general and administrative costs were similarly flat quarter on quarter. We will invest prudently in people in line with growing revenue opportunities in 2026. Our engineers continue to develop the next generation products, and we expect related external costs to kick in in H2. Our adjusted EBITDA loss in Q4 2025 was $11.1 million, unchanged from Q3. We ended fiscal 2025 with $46 million in cash and cash equivalents.

Harminder Sehmi: The available funds under our committed equity facility are $15.6 million. Now I'd like to spend a few moments talking about our recently announced shelf before moving ahead to guidance for 2026. As is common with companies that become eligible and meet the criteria to file a shelf S-3 registration statement, we took the opportunity to do so on the first-year anniversary of our merger. This shelf allows us to raise up to $250 million through a broad range of securities over the next three years, on an as-needed basis. The shelf offers broad flexibility to raise capital quickly when market conditions are favorable, and we believe it is helpful for strategic positioning. It will support working capital needs, fund field trials, and enable continued investment in new product development. Moving to our guidance for the current year.

Harminder Sehmi: We operate in a dynamic environment that now includes global memory supply constraints and geopolitical tensions. We continue to monitor the supply chain closely and will invest prudently in research and development and go-to-market capability. We see demand across both edge and data center deployments. This creates an opportunity for recurring revenue as we expand our AI services platform. We're continuing our current partnerships as well as adding new customers. We believe new partnerships with recognized names like Nokia should lead to additional strategic opportunities in areas where we have not yet developed traction. With that, our 2026 fiscal guidance is as follows: revenue of $130 million remains unchanged, with H1 expected to be lighter than H2; gross margins flat through H1 2026, expected to average between 30% and 35% by Q4;

Harminder Sehmi: and adjusted EBITDA loss of between $45 million and $50 million. In closing, we delivered strong revenue growth in H2 2025 and continued to build momentum across our customer base. We remain focused on disciplined cost management and operational execution. Our outlook for 2026 remains consistent with what we have previously shared. With that, I'll turn it back over to the operator for questions.

Operator: Thank you. As a reminder, to ask a question, please press star one one on your telephone and wait for your name to be announced. To withdraw your question, please press star one one again. One moment for questions. Our first question comes from Gil Luria with D.A. Davidson. You may proceed.

Gil Luria: Good afternoon. In your press release, and as we've talked about a lot, there are different types of applications in front of you. I think in the release you referred to public safety, retail, smart cities, aerial robotics, and I know there's auto coming down the pike. How would you prioritize them in terms of what you're going to have this year, and how do those opportunities play out over the next three or four years?

Dinakar Munagala: Hi, Gil. I think the commonality is AI inference. This is where we're seeing momentum, encompassing our full stack: the silicon, the systems and servers, and our software on top.

Dinakar Munagala: Specific use cases where we're seeing momentum are a combination of smart health, factory automation, and industrial use cases, but we're also part of relationships around drones and similar use cases. I don't know if you want to add any further, Sameer.

Harminder Sehmi: No. So, Gil, the pipeline that we've got includes all of those. How we set priorities is really the pace at which any POCs or pilots are getting concluded with those customers. As you know, inference solutions require access to data. So in short, the near-term priority is just converting the pipeline where we've got access to those customers and data. In the medium term, it is how we expand more business into some of those customers.

Gil Luria: Thank you. Then the second question is about gross margins. I appreciate the visibility into the end of this year, but longer term and at scale, what do we expect the long-term model to look like in terms of gross margins on the hardware side and on the software side? With more of a push to services, do we still expect software and services to be about a quarter of the mix in a longer-term model? That'll help us get the full picture.

Harminder Sehmi: Yes, the longer term for us is 55%-plus, and that's going to be a blend of hardware and software. I think that's what we're observing now with the remarks that Dinakar went through on the AI services platform, which essentially becomes a combination of hardware and software; you're not distinguishing between the two. And there is a revenue-share-type model that we can see coming our way. So, 55%-plus as a blend. I think software and recurring revenue, if I can put it that way, could become a larger portion of the mix, but it's too early to say just yet, and we'll continue to make announcements as and when some of those deployments become public.

Gil Luria: Appreciate it. Thank you very much.

Operator: Thank you. Our next question comes from Craig Ellis with B. Riley Securities. You may proceed.

Craig Ellis: Yeah, thanks for taking the question. Guys, congratulations on hitting the strong revenue on ramp in Q4. I wanted to start the line of inquiry following up on the $130 million revenue guide for calendar 2026. Can you help us understand the extent to which Starshine and Yotta are driving that versus other things like maybe converting the Nokia MOU into revenue or maybe even getting traction on some of the new capabilities that we identified in the press release that you've talked about, the services platform and AI application delivery?

Harminder Sehmi: Let me start, and then Dinakar can come in. Yes, Yotta and Starshine are partnerships that we developed late last year, and they remain important to us. The pace at which we deliver products to them is largely driven by their end-user needs. There are other partners that we introduced in the back end of last year, and over the next three to six months we expect to add maybe one or two more. When I stand back and look at it, the revenue guidance is really supported by some of the engagements we've had and what we expect to close during the year.

Harminder Sehmi: An important point to make is that the AI services platform and the relationship with partners like Nokia are expected to start to feature in Q4, towards the end of next year. Really, when we look at the business going forward, it falls into three big buckets. Number one is system revenue, which is a combination of mainly hardware, but could be Blaize and third party. Number two is an attach rate of Blaize software, which gets monetized. And overall, when you look at the system, particularly when you're applying it to cloud service providers, tier-two cloud service providers, it's giving them an opportunity to start to monetize the infrastructure that they've invested in.

Dinakar Munagala: Yeah. Just to add, I guess, where we were with a couple of key relationships, that is actually growing in two dimensions. One is within anchor customers, there is a land-and-expand that typically starts with one use case. Once we establish credibility, that leads to additional use cases and additional opportunity there. Also, once we've developed a certain use case, it is relevant to a larger market. We're witnessing that momentum as well, where a solution that we developed with a certain partner is required in a different geography, by a different customer, and so on. The common theme is, of course, our combination of the three components that Harminder mentioned: hardware, software, and API revenue, which we expect will kick in with the launch of our AI services platform.

Craig Ellis: Got it. That's very helpful. Then the follow-up is really a two-parter. In past calls and conversations, we've quantified the opportunity pipeline at about $725 million. Can you give us an update on whether that's still the right way to look at the opportunity pipeline, or has it changed? Then, on the adjusted EBITDA guidance for the year, can you clarify the extent to which mask set costs are included? Will there be any chip-related mask set costs that we should be incorporating into our OpEx modeling? Thanks, guys.


Harminder Sehmi: Sure. Every pipeline is dynamic, and we've seen meaningful traction in the Asia Pacific region in particular. As Dinakar mentioned, Stephen Patak has joined us as CRO, and he's got very good visibility of what's going to support 2026 revenue. Whether we use pipeline as a public measure, for us it's really about getting the contract signed and converted. Having said all of that, the pipeline is still significant. It's very significant. It does change. The geopolitical tensions have had a bit of an impact on some deployments where we-


Craig Ellis: Yeah.

Harminder Sehmi: We're not quite sure when, you know, they will come back in. The Nokia partnership and the AI services platform will add to our pipeline, which isn't in the numbers today. Let me leave that there. Then the second point was about the adjusted EBITDA. As you know, the core design of our chip is common across the roadmap. The good thing is that our in-house engineers, whose costs are really in the payroll, continue to work on adding features and reacting to what's happening in the marketplace. The external costs, so when you're talking about mask sets, that's tape-out. That'll be in 2027 and beyond.


Harminder Sehmi: The early part of the third-party external costs, I see some of those kicking in towards the second half, and that's generally going to be third-party IP that we buy and some of the professional services that we pay to the third-party physical design companies.


Craig Ellis: Very helpful. Thanks, Harminder.


Operator: Thank you. Our next question comes from Richard Shannon with Craig-Hallum Capital Group. You may proceed.


Richard Shannon: Thanks a lot, guys, for letting me ask a couple of questions. Maybe a follow-up on one of the prior questions here, looking at it from a different angle on calendar 2026. I would love to get a sense of, relative to this $130 million guidance for the year, how much of this is in backlog or some sort of commitment. I think last call you talked about visibility of $160 million between your biggest customers. Obviously, we're one quarter through that, but I'd love to get a sense of what that support looks like. Then, to ask more specifically, how do we think about customer concentration or mix this year within that? Thank you.


Harminder Sehmi: Yes, we do. We announced those two large contracts. They still remain; we're still delivering against those. As I mentioned in my prepared remarks, Richard, the pace at which those purchase orders come in is kind of determined by the end users, the customers, and what they're going to deploy. We have added new customers into our pipeline, and they too have backlog, which may be a different way to look at it. We'd like to think that if you've got a design win and a customer has a need for either edge or, as we'll find out over the coming weeks and months, AI services, customers draw down by issuing purchase orders on us.


Harminder Sehmi: We stay close to them in order to manage our own supply chain and so that we can deliver those. I hope that helps a little bit more in explaining why we are comfortable with our $130 million guidance.


Harminder Sehmi: Yeah. You asked about customer concentration. I don't know, Dinakar, if you want to take it.


Dinakar Munagala: Thank you. On customer concentration, we're actually moving beyond our initial customers. As I mentioned previously, once a use case is perfected, it's relevant to more customers, so we're getting that pull. AI inference is growing very rapidly. I mean, there was a point in time when, for every one training chip, there were like four or eight; now we're hearing numbers like 16, right? This is rapidly changing. So the key message is that having a hybridized platform with our AI services that we can deliver into these use cases, within the same customer as well as across other customers, is seeing quite a bit of momentum.


Richard Shannon: Okay. My follow-up question here is regarding Nokia. Very interesting and powerful press release you had earlier this year about an MOU. I would love to understand what the next steps are here, especially announceable steps in this relationship, and when you ultimately expect it to contribute to backlog and eventually revenues. Dinakar, or excuse me, I think Harminder, I think you mentioned in one of your replies sometime toward the end of next year, which seemed like kind of a long process. So I just wanted to clarify if that's what you meant there. Thank you.


Dinakar Munagala: Right. Let me start, and Harminder can add. We are actually quite excited about the whole Nokia relationship. This started almost six months ago on a visit to Singapore, where we met with their Asia Pacific leadership. We showed them our platform, and they got visibly excited, and they saw how this allows for a collaboration for them to participate in the AI infrastructure build-out. The tangible next steps are: we are building a joint solution, an AI platform focused on inference needs, into their customers, as well as customers that we can bring to the table. You know, their networking stack plus our AI system stack, software stack.


Dinakar Munagala: This is the joint solution that we will actually demonstrate and launch at GITEX Asia in maybe less than a couple of weeks. There is going to be a joint go-to-market, co-selling into their customers, system integrators, cloud service providers, and enterprises. You know, those are the near-term next steps.


Harminder Sehmi: Yeah. Sorry, Richard, I don't know whether I misspoke or maybe you misheard. No, the revenues from the AI services platform generally, of which of course Nokia will be a part as a partner, are towards the end of this year. As Dinakar mentioned, we're launching certain aspects of the platform sooner, and as more and more APIs are developed, that just allows us to expand the population that can start to utilize these services and, of course, pay us for them.


Richard Shannon: Okay. Perfect. That timeframe made a lot more sense. I'm glad I asked about that. That is all my questions, guys. Thank you.


Operator: Thank you. As a reminder, to ask a question, please press star one one on your telephone. Our next question comes from Kevin Cassidy with Rosenblatt Securities. You may proceed.


Chris Myers: Hi, this is Chris Myers on for Kevin Cassidy. I think you guys already answered my question. It was gonna be about the revenue timing on the Nokia MOU. I guess, in general, if you could just talk a little bit more about that broader opportunity set and if there's similar infrastructure wins that could come up that are, I guess, along the lines of this deal.


Dinakar Munagala: Yeah. We do have similar opportunities that we are working on on other continents, and as they materialize, we'll be sure to update you. There is of course Asia, where we are witnessing quite a bit of momentum. Africa is another place where we have seen some initial traction. Of course the US as well; there is massive infrastructure build-out happening, and they do want hybridized AI to serve business outcomes, right? It's less to do with what's under the hood and more about, "Hey, can you solve my business outcome within a certain CapEx and OpEx spend?" These are the kinds of questions that our team gets asked. Our solutions are a perfect fit, also because we're seeing that the average model size is dramatically shrinking, right?


Dinakar Munagala: These smaller models rival the larger models, and they still achieve the same business outcome. This is a perfect fit for our graph streaming architecture. In combination with GPUs, we're able to deliver this outcome. We're seeing that kind of momentum across the board, right, wherever our sales teams are present.


Chris Myers: Awesome, guys. Thank you so much, and congratulations on the great progress.


Dinakar Munagala: Thank you.


Operator: Thank you. I would now like to turn the call back over to Dinakar Munagala for any closing remarks.


Dinakar Munagala: Thank you, operator. Before we close, let me briefly recap. We delivered strong revenue growth and expanded our global footprint, driven by key partnerships including Nokia and cloud service providers in Asia Pacific, as well as our work with state government initiatives in India. We're preparing to launch our Blaize AI Platform in Q2, positioning us to capture the next phase of AI monetization while improving our revenue mix and margin profile. At the same time, we're seeing a clear shift towards smaller task-specific models that deliver strong performance with far greater efficiency. This aligns directly with our Graph Streaming Processor architecture and strengthens our position as AI infrastructure build-out continues to scale. I also wanted to acknowledge the situation in the Middle East.


Dinakar Munagala: Our priority remains the safety of our employees, partners, and customers in the region, and we are committed to maintaining continuity and stability in our operations. Thank you to our analysts and investors, as well as our customers and partners, for your continued support. We look forward to updating you next quarter.


Operator: Thank you. This concludes the conference. Thank you for your participation. You may now disconnect.



BZAI

Tuesday, March 24th, 2026 at 9:00 PM
