Shares of memory giant Micron (MU 2.32%) fell more than 8% last Friday following its late Thursday earnings release.
But does the market have this one wrong?
Micron has become one of the most important players in the artificial intelligence (AI) race over the past two years. While some of Micron's legacy businesses are currently in a downturn, investors may be taking too short-term an approach, missing the AI forest for the legacy-market trees.
A look under the hood of Micron's recent earnings reveals several long-term positives that should continue to improve revenue and profits through the rest of this year and into 2026.
Micron’s beat and cautious guidance on gross margins
In its fiscal 2025 second quarter, Micron grew revenue 38% year over year to $8.05 billion, while adjusted (non-GAAP) earnings per share (EPS) grew 271% to $1.56. Both figures came in ahead of expectations.
On the back of the solid earnings beat, shares initially rose after hours but then plunged the next day. The sell-off appears to be due to guidance, specifically for gross margins in the current quarter. While revenue is projected to be up nearly 10% quarter over quarter, earnings are expected to be roughly flat, as adjusted gross margin is guided to fall 1.5 percentage points to 36.5% and operating expenses are expected to rise by about $100 million.
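For readers who want to sanity-check that guidance math, here is a rough back-of-the-envelope sketch using only the figures cited above. It assumes, purely for illustration and not as a company disclosure, that share count and tax rate hold roughly steady, so a small change in operating income translates into roughly flat earnings per share.

```python
# Back-of-the-envelope sketch using only figures cited in the article.
# Assumption (not from the release): share count and tax rate hold steady,
# so a small change in operating income implies roughly flat EPS.

q2_revenue = 8.05e9               # fiscal Q2 revenue: $8.05 billion
q2_gross_margin = 0.365 + 0.015   # implied Q2 adjusted gross margin: 38%

q3_revenue = q2_revenue * 1.10    # guidance: revenue up nearly 10% quarter over quarter
q3_gross_margin = 0.365           # guidance: adjusted gross margin of 36.5%
extra_opex = 100e6                # guidance: operating expenses up about $100 million

q2_gross_profit = q2_revenue * q2_gross_margin
q3_gross_profit = q3_revenue * q3_gross_margin

# Incremental gross profit minus the added operating expenses
delta_operating_income = (q3_gross_profit - q2_gross_profit) - extra_opex

print(f"Q2 gross profit: ${q2_gross_profit / 1e9:.2f} billion")                 # ~$3.06 billion
print(f"Q3 gross profit: ${q3_gross_profit / 1e9:.2f} billion")                 # ~$3.23 billion
print(f"Operating income change: ${delta_operating_income / 1e6:.0f} million")  # ~$73 million
```

In other words, the roughly $170 million of extra gross profit from higher revenue is mostly absorbed by the roughly $100 million of added operating expenses, leaving only a small residual gain, which is why guided earnings barely budge despite the revenue growth.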
Short-term traders appear overly concerned with the near-term margin trajectory. But those concerns are likely overblown.
Management attributed the margin pressure to increased sales of lower-margin consumer electronics memory as those markets recover from a soft quarter. On top of that, management noted the NAND flash market continues to be weak, with NAND prices plunging by a high-teens percentage last quarter alone.
Still, the negativity appears concentrated in less important parts of Micron's business. NAND made up only 26% of revenue last quarter, and the consumer markets are all becoming less important as Micron's enterprise data center business continues to take off thanks to AI.
Those good trends around AI and DRAM should eventually overtake the bad later in 2025, and longer-term investors should be highly encouraged by three overwhelmingly positive trends described in the earnings release.
Positive No. 1: HBM for AI is going even better than previously thought
Perhaps the most consequential change to the memory industry in recent years has been the emergence of high-bandwidth memory (HBM) used in AI applications.
On its prior call in December, Micron indicated it saw the HBM market growing from $16 billion in 2024 to $30 billion in 2025, then to over $100 billion by 2030. That’s a massive, completely new segment that would exceed the entire pre-HBM DRAM industry by that time.
Meanwhile, rival Samsung is struggling with HBM yields, and Micron is rapidly catching up to HBM leader SK Hynix, even surpassing SK Hynix on HBM technology metrics while closing the gap in market share.
And Micron’s HBM news continues to get better. On the call with analysts, CEO Sanjay Mehrotra now sees an even higher $35 billion HBM market this year, up from the prior guidance of $30 billion. He also expects Micron to attain the same market share gains it forecast before.
Given that Micron had already said it was sold out of its HBM product for 2025, how is the company now able to exceed its prior targets? On the post-earnings call, management attributed the increase to strong manufacturing execution that has improved yields relative to prior plans, along with an earlier-than-planned shift from 8-high to 12-high HBM stacks. The 12-high product carries higher prices, so the move to the more advanced technology is translating into a revenue tailwind.
Micron also mentioned it had landed a third large customer for its HBM products as it continues to gain market share here. And quarterly HBM revenue exceeded $1 billion for the first time, or about 12.5% of total revenue.
As high-margin HBM continues to grow from 12.5% of revenue today to a much higher proportion, Micron’s revenue and earnings should continue to increase over the medium term.
Positive No. 2: A whole new kind of AI memory
Micron's AI innovation doesn't stop at HBM. The company also announced it has developed a new kind of low-power DRAM (LPDRAM) module for the data center, which it calls the Small Outline Compression Attached Memory Module, or "SOCAMM."
Previously, low-power DRAM had been used in smaller devices where power consumption was paramount, such as smartphones. However, Micron is the first to develop this new modular LPDRAM, which offers up to two-thirds power savings compared with standard DDR5 DRAM for the data center. Micron developed the memory in collaboration with AI leader Nvidia (NVDA 3.16%) specifically to support Nvidia's upcoming Grace-Blackwell 300 (or GB300) superchip.
Unlike in HBM, where Micron had to catch up with SK Hynix, here Micron is out front: it is the first to manufacture this type of data center LPDRAM in volume. Given the heat and power demands of AI data centers, this new kind of memory could also generate billions in new revenue alongside HBM in the AI era.
Positive No. 3: Data centers could switch to SSDs from HDDs
Finally, there is a potentially big opportunity looming for Micron in its biggest problem segment right now: NAND flash. While the overall NAND flash market is in a downturn, Micron has actually been scooping up market share in the best long-term NAND market, high-end data center solid-state drives (SSDs). Management noted Micron achieved a record-high market share in data center SSDs last year and should continue to take more share in that segment.
But a big opportunity may be on the horizon for SSDs in AI data centers if data center operators begin to use more NAND flash SSDs to replace hard disk drives (HDDs). Right now, the older HDD technology is cheaper for bulk storage. But latency, space, and power consumption issues in AI data centers are spurring operators to switch to SSDs. Management noted on the post-earnings call:
I think that attempt to dramatically reduce HDDs has started at the most leading customers. You have seen some announcements from Pure Storage, and we are suppliers to them, and we are also working with the end customers who are wanting to deploy very large capacity SSDs, and remove HDDs from their future infrastructure deployment. So it’s an exciting opportunity. It’ll take some time to develop, but once it gets going it will definitely have a snowball effect.
If AI data centers begin to switch en masse to SSDs from HDDs, it could really improve demand on the NAND side, which has been the segment weighing down Micron’s margins. Should that segment improve along with the booming AI DRAM industry, Micron’s profits could jump significantly, perhaps in 2026 or 2027.