5G Factor Video Research Note: Micron Delivers UFS, Memory Advances: Spurs Samsung Galaxy S24 AI Innovation

In this vignette of The 5G Factor, Ron Westfall and Olivier Blanchard share their views on Micron’s UFS and memory advances and how they offer the performance and power needed to store growing amounts of data in today’s AI-driven smartphones.

The discussion highlighted:

Micron Delivers UFS, Memory Advances: Spurs Samsung Galaxy S24 AI Innovation. Micron announced that it is delivering qualification samples of an enhanced version of its Universal Flash Storage (UFS) 4.0 mobile solution with breakthrough proprietary firmware features delivered in an ultra-compact UFS package at 9×13 millimeters (mm). Built on its 232-layer 3D NAND and offering up to 1 terabyte (TB) capacity, the UFS 4.0 solution provides advances in performance that can enable faster and more responsive experiences on flagship smartphones. Samsung is now incorporating Micron’s low-power double data rate 5X (LPDDR5X) memory and UFS 4.0 mobile flash storage into select devices in the Samsung Galaxy S24 series, which is introducing AI to mobile users worldwide. The Galaxy S24 series is underpinned by Samsung’s suite of generative AI tools, Galaxy AI, which helps amplify experiences from enabling barrier-free communication to maximizing creative freedom. We review and assess how Micron’s LPDDR5X is a mobile-optimized memory distinguished by offering the advanced capabilities of the 1β (1-beta) process node, while Micron’s UFS 4.0 offers the performance and power needed to store growing amounts of data in today’s AI-driven smartphones.

Watch The 5G Factor show here:

Watch the full episode here and be sure to subscribe to our YouTube channel.

Listen to the episode here:

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Transcript:

Ron Westfall: All right, well back to Mobile World Congress, which is now in our rearview mirror. However, we are going to spotlight the major announcements in the semiconductor and device segments that of course jumped out at us. And to start, let’s turn to the analyst round table conversation we had with Micron’s Mobile Business Unit executives, including Mark Montierth, the corporate VP and general manager of MBU, and Chris Moore, VP of marketing at Micron’s MBU.

Now, during the show, Micron announced that it’s delivering qualification samples of an enhanced version of its universal flash storage, or UFS, 4.0 mobile solution with breakthrough proprietary firmware delivered in an ultra-compact UFS package at 9 x 13 millimeters, which is very important in this space. And to complement that, it’s built on its 232-layer 3D NAND and offers up to one terabyte of capacity. So we’re talking terabytes in this area of the mobile ecosystem. And as such, the UFS 4.0 solution provides advances in performance as well as enabling faster and more responsive experiences on flagship smartphones, which certainly was a major point of interest at the show itself.

Now, Micron UFS 4.0 accelerates data-intensive experiences with up to 4,300 megabytes per second sequential read and 4,000 megabytes per second sequential write speeds, twice the performance of previous generations. With these speeds, users will be able to launch their favorite productivity and emerging AI apps more swiftly. Large language models and generative AI applications now can be loaded 40% faster, resulting in a smoother experience, I believe, when initializing conversations with AI digital companions. And I don’t think it’s been an adoption issue, but again, it’s improving that experience with the large language models, with being able to get more out of ChatGPT, for example, in a more user-friendly and efficient way.
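For a rough sense of what those sequential-read figures mean in practice, here is a minimal back-of-the-envelope sketch. The ~8 GB model file size is a hypothetical assumption for illustration, not a figure from Micron or Samsung, and the previous-generation speed is simply inferred from the “twice the performance” comparison above.

```python
# Back-of-the-envelope sketch of model load time from storage.
# Assumptions (not from Micron or Samsung): an ~8 GB on-device model file,
# and a previous-generation drive at half the quoted UFS 4.0 read speed,
# per the "twice the performance" comparison above.

UFS_4_0_READ_MB_S = 4300                      # quoted sequential read speed
PREV_GEN_READ_MB_S = UFS_4_0_READ_MB_S / 2    # inferred previous-generation speed
MODEL_SIZE_MB = 8 * 1024                      # hypothetical ~8 GB model file

def load_time_s(size_mb: float, read_mb_s: float) -> float:
    """Idealized load time: ignores decompression, DRAM copies, and thermal throttling."""
    return size_mb / read_mb_s

print(f"UFS 4.0:      {load_time_s(MODEL_SIZE_MB, UFS_4_0_READ_MB_S):.1f} s")   # ~1.9 s
print(f"Previous gen: {load_time_s(MODEL_SIZE_MB, PREV_GEN_READ_MB_S):.1f} s")  # ~3.8 s
```

In practice, real app-launch gains depend on much more than raw sequential throughput (decompression, memory copies, thermal limits), which is presumably why the load-time benefit is framed as roughly 40% rather than a straight doubling.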

Now, working with Micron is Samsung, which is incorporating Micron’s low-power double data rate 5X, or LPDDR5X, memory and UFS 4.0 mobile flash storage into select devices across the Samsung Galaxy S24 series, which is really introducing AI to mobile users worldwide. And we saw that the Galaxy S24 series is underpinned by Samsung’s suite of gen AI tools, Galaxy AI, which helps amplify experiences from enabling barrier-free communication to also, again, optimizing creative freedom for the users.

Now, as these data- and energy-intensive features push the limits of smartphone hardware capability, Micron’s LPDDR5X memory and UFS 4.0 storage provide critical high-performance capabilities and power efficiency to deliver these AI experiences at the edge. So edge AI, very much a hot topic, and I know we’ll talk about that more in detail. Now, select Samsung Galaxy S24 devices across the S24 Ultra, S24 Plus, and S24 models are shipping with both those capabilities. And that is demonstrating, from our perspective, how Micron is really driving innovation across the smartphone product segment specifically, but also across the overall mobile AI ecosystem.

What I think is also important is that Micron’s LPDDR5X is a mobile-optimized memory distinguished by offering the advanced capabilities of the one beta process node. So here we are talking semiconductors. We should of course mention the process node capability here. And while Micron’s UFS 4.0 offers the performance and power needed to store growing amounts of data in today’s AI-driven smartphones, I think it’s also important to note that Micron announced it has begun volume production of its HBM3E, which stands for high bandwidth memory 3E, solution. And why is that a big deal? Because Micron’s 24 gigabyte 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, which will begin shipping next quarter, in Q2 2024.

Now, we see this milestone as positioning Micron to further empower AI solutions with HBM3E’s performance and energy efficiency features. And of course, driving NVIDIA, we believe, will be important in how the market as well as the ecosystem perceives Micron and its strengthening position in terms of, for example, AI capabilities. And that certainly includes AI on the smartphone and devices. And with that, I’m going to stop talking about Micron for the introduction, but there was so much material. I know, Olivier, you’ll have plenty in mind. What were some of your key takeaways in our conversation with the Micron folks at the show?

Olivier Blanchard: And that was a really good introduction. So many acronyms and part numbers, it’s kind of hard to follow. So, if I could summarize and kind of focus on one thing that really struck me as not just relevant but especially relevant, it’s the overarching theme of today’s show and also of MWC, and I hate to steal your thunder ahead of when you’re going to talk about this, but it’s basically on-device AI, right? And I think that the top three topics at MWC were on-device AI, on-device AI, and on-device AI.

So up until now, what we’ve been talking about with on-device AI is the CPU, which we all understand: the central processing unit. You have the GPU, which is graphics, which is kind of like the workhorse for generative AI workloads and for a lot of camera and gaming applications. So basically, all the enhanced features of a phone tend to require an extremely sophisticated GPU. And we’re starting to see also neural processing units. So NPUs are now showing up in PCs, but they showed up on phones first, and they kind of help manage and accelerate these AI workloads. But so far, those have been kind of like the three categories of semiconductors that we’ve talked about.

And memory has, for the most part, seemed like this sort of commoditized thing where you get memory on your phone or on your device and you have these different tiers, you have 256 and whatever. But we don’t necessarily think about storage and memory as more than just storage. But actually, what’s happening with the new requirements of these extremely fast AI workloads is you need a special memory solution that is well-adapted to the power requirements, to the speed, to the processes, and that just kind of makes this all work.

And where Micron has positioned itself, and I think that their joint announcement with Samsung on the S24 was so critical and also so telling, is that they’re kind of like the fourth leg of that peg, of that AI peg: CPU, GPU, NPU, and memory. And so it’s a fantastic strategy, and hats off to Micron for making it really obvious, and having that conversation, and essentially just putting this forward in the eye of the market and saying, “No, no, no, memory is also part of this. It isn’t just these other chipsets or these other semiconductor solutions.”

But also what I thought was significant is that Samsung didn’t have to do this. Samsung could have continued to just say, “Hey look, best implementation, we have our own silicon, but we’re also using Qualcomm’s SoC, its Snapdragon flagship, plus, plus.” But they made a choice to showcase, to highlight their partnership on the memory side and to show, look, we’re thinking about this as well and this is what’s enabling this. And I can’t remember a time recently when a major handset OEM emphasized their memory partner as much as Samsung did today, and it was definitely different for Micron, which usually sort of hangs in the back doing its thing. And it’s indicative of, I think first of all, the need for handset OEMs to differentiate themselves in this new age of on-device generative AI, and we’re going to talk about what I think it’s going to do to the refresh cycles and resetting them.

But for Samsung to do this shows the importance of memory, the importance of having the right partnerships or the right partners in the ecosystem to sort of establish dominance, not just as a finished product but as an implementer of all of these different solutions together, and showing to the industry that, look, we have all the right pieces in place, we have the right partners in place to deliver the best product. So huge coup for Micron. And also, I’m smarter for it, because I’ll be completely honest, I was aware of this but not to that extent.

Ron Westfall: Yes.

Olivier Blanchard: And so it kind of opened my eyes to the importance of memory. And that’s cool, that’s unusual. Usually I’m ahead of the game with that and this time I wasn’t.

Ron Westfall: I agree wholeheartedly. It was a great conversation. And kudos to Samsung for elevating its memory partnership. If I recall correctly, the only other handset OEM that did that at the show was Honor, in terms of collaborating with Micron to highlight the importance of memory. So, Samsung at the forefront. Honor also getting honorable mention in this regard. And so, we’ll stay tuned. We could do the whole show just on Micron-

Olivier Blanchard: At some point we will, because I think Micron’s a company to watch.

Ron Westfall: Oh, yeah.

Olivier Blanchard: Now that they’re on my radar, they should be on everybody else’s radar too, I think. For good reason.

Ron Westfall: I think they’re on our radar and that, I think, will come to the forefront more during the course of this year, especially as AI drives a lot of smartphone innovation and device innovation.

Other insights from The Futurum Group:

5G Factor: MWC24 Preview – AI, Devices, and UX Shine

5G Factor: Key MWC24 Takeaways – The Cloud and Telcos

5G Factor: Key MWC24 Takeaways – Semis and Devices

Author Information

Ron is an experienced, customer-focused research expert and analyst, with over 20 years of experience in the digital and IT transformation markets, working with businesses to drive consistent revenue and sales growth.

He is a recognized authority at tracking the evolution of and identifying the key disruptive trends within the service enablement ecosystem, including a wide range of topics across software and services, infrastructure, 5G communications, Internet of Things (IoT), Artificial Intelligence (AI), analytics, security, cloud computing, revenue management, and regulatory issues.

Prior to his work with The Futurum Group, Ron worked with GlobalData Technology creating syndicated and custom research across a wide variety of technical fields. His work with Current Analysis focused on the broadband and service provider infrastructure markets.

Ron holds a Master of Arts in Public Policy from University of Nevada — Las Vegas and a Bachelor of Arts in political science/government from William and Mary.

Research Director Olivier Blanchard covers edge semiconductors and intelligent AI-capable devices for Futurum. In addition to having co-authored several books about digital transformation and AI with Futurum Group CEO Daniel Newman, Blanchard brings considerable experience demystifying new and emerging technologies, advising clients on how best to future-proof their organizations, and helping maximize the positive impacts of technology disruption while mitigating their potentially negative effects. Follow his extended analysis on X and LinkedIn.
