Q4 2025 Blaize Holdings, Inc. Earnings Call
Operator: Good day, and thank you for standing by. Welcome to the Blaize Q4 2025 Earnings Conference Call. At this time, all participants are in a listen-only mode. Please be advised that today's conference is being recorded. After the speaker's presentation, there will be a question-and-answer session. To ask a question, please press star one one on your telephone and wait for your name to be announced. To withdraw your question, please press star one one again. I would now like to hand the conference over to your speaker today, Lana Adair, Investor Relations.
Lana Adair: Good afternoon, everyone. Thank you for joining Blaize Holdings, Inc. Q4 2025 Earnings Call. Before we begin the prepared remarks, we would like to remind you that earlier today, Blaize Holdings, Inc. issued a press release announcing its results. Earnings materials are available on the investor relations section of Blaize Holdings, Inc.'s website. Today's earnings call and press release reflect management's view as of today only and includes statements related to our 2026 financial guidance, revenue, gross margin, competitive position, anticipated industry trends, market opportunities, products, and financing opportunities, all of which constitute forward-looking statements under the federal securities laws. Actual results may differ materially from those contained in or implied by these forward-looking statements due to risks and uncertainties associated with our business.
Lana Adair: For a discussion of material risks and other important factors that could impact our actual results, please refer to the company's Form 10-K for the year ended December 31, 2025, including the Risk Factors section therein and today's press release, both of which can be found on our investor relations website. Any forward-looking statements that we make on the call are based on assumptions as of today and other than as may be required by law. We undertake no obligation to update these statements as a result of new information or future events. Information discussed on this call concerning Blaize Holdings, Inc.'s industry, competitive position, and the markets in which it operates is based on information from independent industry and research organizations, other third-party sources, and management's estimates.
Lana Adair: These estimates are derived from publicly available information released by independent industry analysts and other third-party sources, as well as data from Blaize Holdings, Inc.'s internal research. These estimates are based on reasonable assumptions and computations made upon reviewing such data and Blaize Holdings, Inc.'s experience and knowledge of such industry and markets. By definition, assumptions are subject to uncertainty and risk, which could cause results to differ materially from those expressed in the estimates. During this call, we will discuss certain non-GAAP financial measures. These non-GAAP financial measures should be considered as a supplement to, and not a substitute for, measures prepared in accordance with GAAP. For a reconciliation of non-GAAP financial measures discussed during this call to the most directly comparable GAAP measure, please refer to today's press release. Now I'd like to introduce Dinakar Munagala, the CEO of Blaize Holdings, Inc.
Dinakar Munagala: Good afternoon. Over the course of 2025, we grew our revenue from approximately $1 million in Q1 to $23.8 million in Q4. We exceeded the upper end of our revenue guidance, representing approximately 20 times growth over the year. This reflects strong momentum across inference infrastructure, sovereign AI, and public safety applications. Customers today evaluate AI infrastructure on three things: cost per inference, power efficiency, and revenue per rack. At the same time, many enterprise inference workloads do not require the largest models. We are seeing increasing adoption of smaller, task-specific models that deliver strong results with far greater efficiency and faster time to real business outcomes. That is where Blaize is focused. Over the past several months, we strengthened our execution. We brought on a Chief Revenue Officer, Stephen Patak, to scale our commercial efforts globally.
Dinakar Munagala: In January, we signed an MoU with Nokia's Asia Pacific division, and we are now advancing that collaboration through an innovation hub in Singapore to build and validate our combined AI platform. We will launch this at GITEX Asia in Singapore, where we'll present to enterprises, governments, cloud providers, and data center operators across Asia Pacific. We are already seeing early traction taking shape across the region, spanning cloud infrastructure, sovereign AI, and real-world applications. Many of these opportunities follow multi-phase models where our systems expand as workloads grow. One of the most concrete examples is in India, where we signed an MoU with the government of Telangana supporting its AI cloud innovation hub. This foundational platform spans mining safety, smart cities, and agriculture, where we jointly enable real-time intelligence of worker safety, equipment operations, and environmental conditions.
Dinakar Munagala: In China, we are also expanding our footprint with regional solution providers focused on AI data center build-out, driving assisted living and smart community solutions in patient safety and remote monitoring with enterprise engagements underway. In Korea, we are working with solution partners like GSiL, specializing in factory safety and industrial monitoring. Across Southeast Asia and Australia, we're working with Nokia and vertical systems integrators to explore AI use cases in urban safety, retail analytics, maritime infrastructure, and airport security. In the US, Europe, and Latin America, we're expanding engagements across enterprise and data center environments focused on AI infrastructure, public safety, industrial robotics, and autonomous operations. For the third consecutive year, we will showcase our solutions at ISC West, the largest converged security trade show. The Middle East and North Africa continues to be a strong growth market.
Dinakar Munagala: Governments and enterprises are investing in security, visibility, and sovereign infrastructure. In Saudi Arabia, we support energy and urban city use cases. In the UAE, we support civil defense, aerial monitoring, and drone detection. In North Africa, we support large-scale industrial ecosystems. Blaize enables real-time detection and monitoring, supporting infrastructure security across energy, industrial, and transportation environments. Our capabilities extend into robotics and autonomous systems. These systems require low latency and efficient inference where hybrid architectures become essential. This shifts AI from centralized data centers to distributed infrastructure. AI infrastructure is no longer limited to hyperscalers. It is now distributed across regional cloud providers, data center operators, and sovereign programs. AI environments today remain highly fragmented. Thousands of vendors deliver narrow AI capabilities focused on vision, documents, identity, or automation. Organizations are left integrating multiple systems before they can deliver real outcomes.
Dinakar Munagala: The opportunity is to move from fragmented tools to integrated services. These capabilities are consolidating into platforms, and that transition is happening now. What ties all of this together is the underlying economics. At scale, this is about cost, efficiency, and utilization. In our analysis, GPU-only infrastructure can scale revenue but remains constrained by high-end recurring compute cost. By contrast, the Blaize model is designed to be cash flow efficient from the start, driven by lower silicon cost and power efficiency. A hybrid configuration combining GPUs and Blaize inference acceleration can deliver roughly a 50% lower infrastructure cost with approximately 60% lower power consumption or more than 2x improvement in efficiency. To support this model, we are progressing toward the initial release of the Blaize AI services platform in Q2. This is not just about cost.
Dinakar Munagala: It brings fragmented AI capabilities into a unified services layer and enables customers to move faster from infrastructure to real-world outcomes. The platform combines inference silicon, intelligent software, API-based AI services. For AI providers, instead of relying on GPU rental, Blaize enables operators to monetize AI outcomes. Revenue comes from inference transactions, AI events, and application services. As services scale, revenue grows faster than cost, driving operating leverage and margin expansion. In our analysis, infrastructure models remain cost-constrained over time. The Blaize AI services model enables more efficient scaling of revenue with improving economics as services grow. This is the difference between scaling compute and scaling a business. AI infrastructure investment continues to expand globally. This phase of the industry is no longer defined by larger models. It is defined by monetizing inference at scale. Platforms that combine efficient architecture with AI services are defining how AI operates today.
Dinakar Munagala: Blaize is built for that model. Our focus remains on expanding commercial activities, scaling AI services, and converting pipeline into revenue. Thank you. I will now hand this off to our CFO, Harminder Sehmi.
Harminder Sehmi: Thank you, Dinakar, and good afternoon, everyone. I'm pleased to share our Q4 and full-year 2025 results today. I'd like to begin with a few highlights. This is the fourth consecutive quarter where we exceeded our revenue guidance range since we became a public company in January 2025. Revenue of $38.6 million for the full year 2025 outperformed the upper end of our guidance by $600,000. Revenue for Q4 doubled to $23.8 million from the prior quarter. Adjusted EBITDA loss was $50.5 million, an improvement of $4.5 million from the lower end of our guidance range for the year. This includes a $1 million benefit from higher gross margin and $3.5 million in lower OpEx and deferred technology costs.
Harminder Sehmi: Focusing on revenue and gross margin. In Q4, we delivered revenue of $23.8 million, exceeding the upper end of our guidance by $700,000. This performance was driven by customer deployments of servers in the Asia Pacific region, supporting AI solutions into the smart health space. We are seeing continued demand as customers expand into AI data center infrastructure build-outs, which we expect to contribute to future revenue growth. Turning to the full year, 2025 marked an important milestone as our first full year of operations as a public company. We're pleased with the progress we achieved. Revenue of $38.6 million was up significantly from $1.6 million in the prior year. This reflects our success in laying the foundations to meet the rising demand for AI solutions across high growth markets.
Harminder Sehmi: Our growing partnerships with systems integrators and software providers are key to accelerating and streamlining the adoption of AI solutions powered by Blaize hardware and software. Let me now address the gross margin trends. Gross margin for Q4 was 11%, and it was 16% for the full year. In prior updates, I have indicated that this approach has been important to our strategic plans, as we've been able to more rapidly seed substantial commercial relationships. I expect the quarterly trend to continue through H1 2026 as we adapt to the global memory constraints. Blaize hardware and software is expected to form a higher mix in our AI solutions from H2 2026. This should result in gross margins of between 30% and 35% in Q4.
Harminder Sehmi: Turning to our Q4 and full-year net loss and operating expenses. The GAAP net loss for the full year was $206.9 million, compared to a GAAP net loss of $61.2 million in 2024. I'd like to spend a few moments breaking these numbers down to provide clarity on the underlying results. Key line items in our 2025 financials were a non-cash $226 million charge arising from the change in fair value of legacy Blaize convertible notes and warrants, a non-cash $37.5 million share-based compensation charge, and transaction expenses of $12 million related to going public.
Harminder Sehmi: These were offset by a $123.2 million credit, inclusive of both cash and non-cash components, primarily driven by the change in value of warrants and earnout shares, among other items. The adjusted EBITDA loss for fiscal 2025 was thus $50.5 million, up from a loss of $42.7 million in the prior year. The key reasons for the year-on-year increase were $3.2 million in building our teams, investment in our technology roadmap of $2.4 million, an increase of $1.5 million in marketing, and $5 million in new expenses related to our preparations to operate as a public company, some of which are not expected to recur in 2026. I will now review operating expenses on a sequential basis, Q4 versus Q3 2025.
Harminder Sehmi: In Q4 2025, total operating expenses of $14.5 million, excluding $9.5 million in stock-based compensation, were largely flat versus the $14.9 million in Q3, which also excluded $9.5 million of stock-based compensation. Research and development expenses and sales, general, and administrative costs were similarly flat quarter on quarter. We will invest prudently in people in line with growing revenue opportunities in 2026. Our engineers continue to develop the next-generation products, and we expect related external costs to kick in in H2. Our adjusted EBITDA loss in Q4 2025 was $11.1 million, unchanged from Q3. We ended fiscal 2025 with $46 million in cash and cash equivalents.
Harminder Sehmi: The available funds under our committed equity facility are $15.6 million. Now I'd like to spend a few moments talking about our recently announced shelf before moving ahead to guidance for 2026. As is common with companies that become eligible and meet the criteria to file a Form S-3 registration statement, we took the opportunity to do so on the first-year anniversary of our merger. This shelf allows us to raise up to $250 million through a broad range of securities over the next three years on an as-needed basis. Our shelf offers broad flexibility to raise capital quickly when market conditions are favorable. We believe the shelf is helpful for strategic positioning. It will support working capital needs, fund field trials, and enable continued investment in new product development. Moving to our guidance for the current year.
Harminder Sehmi: We operate in a dynamic environment that now includes global memory supply constraints and geopolitical tensions. We continue to monitor the supply chain closely and will invest prudently in research and development and go-to-market capability. We see demand across both edge and data center deployments. This creates an opportunity for recurring revenue as we expand our AI services platform. We're continuing our current partnerships as well as adding new customers. We believe new partnerships with recognized names like Nokia should lead to additional strategic opportunities in areas where we have not yet developed traction. With that, our 2026 fiscal guidance is as follows. Revenue of $130 million remains unchanged. I expect H1 to be lighter than H2. Flat gross margins for H1 2026, expected to average between 30% and 35% by Q4.
Harminder Sehmi: Adjusted EBITDA loss of between $45 million and $50 million. In closing, we delivered strong revenue growth in the second half of 2025 and continued to build momentum across our customer base. We remain focused on disciplined cost management and operational execution. Our outlook for 2026 remains consistent with what we have previously shared. With that, I'll turn it back over to the operator for questions.
Operator: Thank you. As a reminder, to ask a question, please press star one one on your telephone and wait for your name to be announced. To withdraw your question, please press star one one again. One moment for questions. Our first question comes from Gil Luria with D.A. Davidson. You may proceed.
Gil Luria: Good afternoon. When you talk in your press release, and we've talked a lot about the different types of applications that are in front of you. I think in the release you referred to public safety, retail, smart cities, aerial robotics, and auto coming down the pipe. How would you prioritize them in terms of what you're going to have this year and how those opportunities play out over the next three or four years?
Stephen Patak: Hi, Gil. So I think the commonality is AI inference. This is where we're seeing momentum, and it comprises our full stack: the silicon, the systems, you know, servers, and our software on top. The specific use cases where we're seeing momentum are around a combination of smart health, factory automation, and industrial use cases, but we're also part of relationships with drones and similar use cases. I don't know if you want to add any further, Dinakar.
Dinakar Munagala: No, Gil, the pipeline that we've got includes all of those. How we set priorities is really the pace at which any POCs or pilots are getting concluded with those customers. As you know, inference solutions require access to data. In short, the near-term priority is just converting a pipeline where we've got access to those customers and data. In the medium term, it is how do we expand more business into some of those customers.
Gil Luria: Thank you. The second question is about gross margins. Appreciate the visibility into the end of this year. Longer term and at scale, what do we expect our long-term model to look for in terms of gross margin on the hardware side and on the software side? With more of a push to services, do we still expect software and services to be about a quarter of the mix in a longer term model? That'll help us get the full picture.
Harminder Sehmi: Yes, in the longer term for us, it's 55% plus, and that's going to be a blend of hardware and software. I think what we're observing now, with the remarks that Dinakar went through on the AI services platform, is that it essentially becomes a combination of hardware and software. You're not distinguishing between the two. There is a revenue-share-type model that we can see coming our way. So 55% plus as a blend. I think software and recurring revenue, if I can put it that way, could become a larger portion of the mix, but it's too early to say just yet, and we'll continue to make announcements as and when some of those deployments get public.
Gil Luria: Appreciate it. Thank you very much.
Operator: Thank you. Our next question comes from Craig Ellis with B. Riley Securities. You may proceed.
Craig Ellis: Yeah, thanks for taking the question. Guys, congratulations on hitting the strong revenue on-ramp in Q4. I wanted to start the line of inquiry following up on the $130 million revenue guide for calendar 2026. Can you help us understand the extent to which Starshine and Yotta are driving that versus other things like maybe converting the Nokia MoU into revenue or maybe even getting traction on some of the new capabilities that we identified in the press release and that you've talked about, the services platform and AI application delivery?
Harminder Sehmi: Let me start, and then Dileepa can come in. Yes, you know, Yotta and Starshine are partnerships that we developed late last year. They still remain important to us. The pace at which we deliver products to them is largely driven by their end user needs. There are other partners that we have introduced in the back end of last year. Over the next maybe three to six months, we expect to add maybe one or two more partners. When I stand back and look at it, the revenue guidance is really supported by some of the engagements we've had and what we expect to close during the year.
Harminder Sehmi: An important point to make is that the AI services platform and the relationship with partners like Nokia are expected to start to feature towards the end of next year. Really, when we look at the business going forward, it falls into three big buckets. Number one is just system revenue, which is a combination of mainly hardware, but could be Blaize plus third parties. Number two is there's an attach rate of Blaize software which gets monetized. Overall, when you look at the system, particularly when you're applying it to tier-two cloud service providers, it's giving them an opportunity to start to monetize the infrastructure that they've invested in.
Dinakar Munagala: Yeah. Just to add, I guess, where we were with a couple of key relationships, that is actually growing in two dimensions. One is within anchor customers, there is a land-and-expand motion that typically starts with one use case. Once we establish credibility, that leads to additional use cases and additional opportunity there. Also, once we've developed a certain use case, it is relevant to a larger market. We're witnessing that momentum as well, where a solution that we developed with a certain partner is required in a different geography, with a different customer, and so on. We're witnessing that kind of demand as well.
Dinakar Munagala: The common theme is, of course, our combination of the three components that Harminder mentioned: hardware, software, and API revenue, which we expect will kick in with the launch of our AI services platform.
Craig Ellis: Got it. That's very helpful. Then the follow-up is really a two-parter. In past calls and conversations, we've quantified the opportunity pipeline at about $725 million. Can you give us an update on whether that's still the right way to look at the opportunity pipeline or has it changed? Then on the Adjusted EBITDA guidance for the year, can you clarify to the extent to which mask set costs are included? Will there be any chip-related mask set costs that we should be incorporating into our OpEx modeling? Thanks, guys.
Harminder Sehmi: Sure. Every pipeline is dynamic, and we've seen meaningful traction in the Asia Pacific region in particular. Dinakar talked about Stephen Patak joining us as CRO. He's working through it and has very good visibility of what's going to support 2026 revenue. Whether we use pipeline as a public measure, you know, for us it's really about trying to get the contracts signed and converted. Having said all of that, the pipeline is still significant. It's very significant. It does change. The geopolitical tensions have had a bit of an impact on some deployments where we-
Dinakar Munagala: Yeah
Harminder Sehmi: ... we're not quite sure when, you know, they will come back in. The Nokia partnership and the AI services platform will add to our pipeline, which isn't in the numbers today. Let me leave that there. The second point was about the Adjusted EBITDA. As you know, the core design of our chip is common across the roadmap. The good thing is that our in-house engineers, whose costs are really in the payroll, continue to work on adding features and reacting to what's happening in the marketplace. The external costs, so when you're talking about mask sets, you know, that's tape-out; that'll be in 2027 and beyond.
Harminder Sehmi: The early part of the third-party external costs, I see some of those kicking in towards the second half, and that's generally going to be third-party IP that we buy and some of the professional services that we pay for, the third-party physical design companies.
Craig Ellis: Very helpful. Thanks, Harminder.
Operator: Thank you. Our next question comes from Richard Shannon with Craig-Hallum Capital Group. You may proceed.
Richard Shannon: Thanks a lot, guys, for letting me ask a couple questions. Maybe a follow-up on one of the prior questions here, looking at it from a different angle on calendar 2026. I would love to get a sense, relative to this $130 million guidance for the year, of how much of this is in backlog or, you know, some sort of commitments. I think last call you talked about visibility of $160 million in two of your biggest customers. Obviously, we're one quarter through that, but I'd love to get a sense of what that support looks like. Maybe to ask more specifically, how do we think about customer concentration or mix this year within that? Thank you.
Harminder Sehmi: Yes, we do. We announced those two large contracts. They still remain, and we're still delivering against those. As I mentioned in my prepared remarks, Richard, the pace at which those purchase orders come in is kind of determined by the end users, the customers, and what they want to deploy. We have added new customers into our pipeline. Backlog may be a different way to look at it. We'd like to think that if you've got a design win and a customer has a need for either edge or, as we'll find out over the coming weeks and months, AI services, customers draw down by issuing purchase orders on us.
Harminder Sehmi: We stay close to them in order to manage our own supply chain and so that we can deliver those. I hope that helps a little bit more in understanding why we are comfortable with our $130 million guidance.
Harminder Sehmi: Yeah. You said about customer concentration. I don't know if you want to-
Dinakar Munagala: Actually, on customer concentration, we're moving beyond our initial customers. As I mentioned previously, once a use case is perfected, it's relevant to more customers, so we're getting that pull. AI inference is growing very rapidly. I mean, at one point in time, for every one training chip there were like four or eight; now we're hearing numbers like 16, right? This is rapidly changing. The key message is that having a hybridized platform with our AI services that we can deliver into these use cases, within the same customer as well as across other customers, is seeing quite a bit of momentum.
Richard Shannon: Okay. My follow-up question here is regarding Nokia. Very interesting and powerful press release you had earlier this year about an MOU. I would love to understand what the next steps are, especially announceable steps in this relationship, and when you ultimately look for it to be contributing to backlog and eventually revenues. Dinakar, I thought, or excuse me, Harminder, I think you mentioned in one of your replies maybe sometime toward the end of next year, which seemed like kind of a long process. So I just wanted to clarify if that's what you meant there. Thank you.
Dinakar Munagala: Right. Let me start, and we can add to it. We are actually quite excited about the whole Nokia relationship. This started almost six months ago on a visit to Singapore, where we met with their Asia Pacific leadership. We showed them our platform, and they got visibly excited, and then they saw how this allows for a collaboration for them to participate in the AI infrastructure build-out. The tangible next steps, right, are that we are building a joint solution, an AI platform focused on inference needs, into their customers as well as customers that we can bring to the table: you know, their networking stack plus our AI system stack and software stack.
Dinakar Munagala: This is a joint solution that we will actually demonstrate and launch at GITEX ASIA in maybe less than a couple of weeks. There is going to be a joint go-to-market, co-selling into their customers: system integrators, cloud service providers, and enterprises. You know, those are the near-term next steps.
Harminder Sehmi: Yeah. Sorry, Richard, I don't know whether I misspoke or maybe you misheard. No, the revenues from the AI services platform generally, of which of course Nokia will be a part as a partner, are towards the end of this year. As Dinakar mentioned, we're launching certain aspects of the platform sooner, and as more and more APIs are developed, that just allows us to expand the population that can start to utilize these services and, of course, pay us for them.
Richard Shannon: Okay. Perfect. That timeframe made a lot more sense. I'm glad I asked about that. That is all my questions, guys. Thank you.
Operator: Thank you. As a reminder, to ask a question, please press star one one on your telephone. Our next question comes from Kevin Cassidy with Rosenblatt Securities. You may proceed.
Chris Myers: Hi, this is Chris Myers on for Kevin Cassidy. I think you guys already answered my question. It was gonna be about the revenue timing on the Nokia MOU, but I guess in general, if you could just talk a little bit more about that broader opportunity set and if there's similar infrastructure wins that could come up that are, I guess, along the lines of this deal.
Dinakar Munagala: Yeah, we do have similar opportunities that we are working on in other continents. As they materialize, we'll be sure to update you. In Asia, of course, there's quite a bit of momentum we are witnessing. Africa is another place where we have seen some initial traction. Of course, the US as well. There is massive infrastructure build-out happening, and they do want hybridized AI to serve business outcomes, right? It's less to do with, you know, what's under the hood and more about, "Hey, can you solve my business outcome within a certain CapEx and OpEx spend?" These are the kinds of questions that our team gets asked. Our solutions are a perfect fit, also because we're seeing that the average model size is dramatically shrinking, right?
Dinakar Munagala: These models rival the larger models, and they still achieve the same business outcome. This is a perfect fit for our graph streaming architecture. In combination with GPUs, we're able to deliver to this outcome. We're seeing such kind of momentum. This is across the board, right? Wherever our sales teams are present.
Chris Myers: Awesome, guys. Thank you so much and, congratulations on the great progress.
Dinakar Munagala: Thank you.
Operator: Thank you. I would now like to turn the call back over to Dinakar Munagala for any closing remarks.
Dinakar Munagala: Thank you, operator. Before we close, let me briefly recap. We delivered strong revenue growth and expanded our global footprint, driven by key partnerships, including Nokia and cloud service providers in Asia Pacific, as well as our work with state government initiatives in India. We're preparing to launch our Blaize AI services platform in Q2, positioning us to capture the next phase of AI monetization while improving our revenue mix and margin profile. At the same time, we're seeing a clear shift towards smaller task-specific models that deliver strong performance with far greater efficiency. This aligns directly with our Graph Streaming Processor architecture and strengthens our position as AI infrastructure build-out continues to scale. I also wanted to acknowledge the situation in the Middle East.
Dinakar Munagala: Our priority remains the safety of our employees, partners, and customers in the region, and we are committed to maintaining continuity and stability in our operations. Thank you to our analysts, investors, as well as our customers and partners for your continued support. We look forward to updating you next quarter.
Operator: Thank you. This concludes the conference. Thank you for your participation. You may now disconnect.