
DRAM

How does voltage scaling enable LPDDR5X to deliver efficient AI user experiences?

Rui Zhou | August 2025

We’re living in an AI-powered world — one where facial recognition, real-time translation and omniscient virtual assistants are no longer a novelty but an expectation. AI is transforming our everyday lives, especially the mobile landscape, where the push for more automation is only set to grow. AI-capable smartphone shipments are projected to reach 912 million units by 2028 [1], reflecting a remarkable compound annual growth rate (CAGR) of 78.4% from 2023, when shipments totaled just 50.5 million units [2]. By 2028, GenAI-capable smartphones are expected to account for over 54% of total smartphone shipments, with the global installed base surpassing 1 billion units [1].


Growing consumer demand for more automation and intelligence at the edge is driving the adoption of AI-capable smartphones, making edge AI a critical enabler of next-generation mobile experiences. Running AI on the device reduces the need for constant internet connectivity or reliance on centralized servers like the cloud, powering faster response times and more secure data processing. But despite the growing anticipation of AI, current hardware capabilities are struggling to keep pace. 

AI is pushing the limits of processing efficiency

In an AI-first economy, data is the elemental fuel — every insight, prediction, and decision stems from it. Advanced algorithms and large-scale models are only as powerful as the quality, volume, and accessibility of the data that feeds them.

Large language models (LLMs) rely on vast, complex data inputs that require substantial computational resources to process. Running these models directly on the device — known as on-device AI — demands high-performance hardware capable of handling intensive workloads. To meet user expectations for fast performance, seamless app switching, quick load times and extended battery life, mobile devices must support real-time AI processing without relying on the cloud. This is where memory becomes critical: it acts as the bridge between data and computation, enabling processors to access and manipulate large datasets instantly. However, tasks like AI processing and high-resolution video capture consume immense amounts of power, so delivering this level of responsiveness can come at the expense of battery life. That’s why power-efficient memory, or low-power DRAM (LPDRAM), is critical to delivering the speed AI needs while preserving energy.

[Figure: Evolution of LPDDR supply voltages, plotted by LPDDR generation (x-axis) against supply voltage (y-axis).]

Disclaimer: VDDQ can operate within two specification ranges — 0.5 volts (Spec range-1) and 0.3 volts (Spec range-2). For simplicity, this graph uses the 0.3 volt value, but the actual operating voltage may vary depending on system configuration and mode.

Driving performance with lower voltage 

Over the past several LPDDR (low-power double data rate) memory generations, the industry has consistently worked to push the boundaries of voltage scaling. Because power is the product of voltage and current, reducing voltage supply levels directly lowers power consumption. In high-speed memory systems where power demands are significant, even modest reductions in supply voltage yield meaningful energy savings. 
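The savings are even better than linear: under the standard CMOS switching-power model, dynamic power grows with the square of supply voltage. A minimal sketch of that relationship, using hypothetical capacitance and frequency values rather than Micron figures:

```python
# Illustrative sketch (hypothetical values, not Micron data): dynamic CMOS
# switching power follows P = C * V^2 * f, so a modest supply-voltage drop
# yields a quadratic reduction in switching power at a fixed frequency.
def dynamic_power(c_eff: float, voltage: float, freq_hz: float) -> float:
    """Switching power in watts for an effective capacitance c_eff (farads)."""
    return c_eff * voltage**2 * freq_hz

# Same workload at two assumed supply points, 1.06 V vs. 0.98 V.
p_high = dynamic_power(1e-9, 1.06, 1e9)
p_low = dynamic_power(1e-9, 0.98, 1e9)
savings = 1 - p_low / p_high
print(f"Dynamic power reduction: {savings:.1%}")  # prints "Dynamic power reduction: 14.5%"
```

An 8% voltage reduction therefore cuts switching power by roughly 14.5%, which is why each LPDDR generation chases lower rails.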

LPDDR5X, the latest generation of mobile DRAM with higher bandwidth and better power efficiency, exemplifies this progress by rearchitecting the once-unified VDD2 rail into two distinct domains: VDD2H (the high-voltage domain of VDD2) and VDD2L (the low-voltage domain of VDD2). This separation allows for more precise voltage regulation tailored to different performance needs.

To fully leverage this architecture, technologies like DVFSC (Dynamic Voltage and Frequency Scaling Control) and eDVFSC (enhanced DVFSC) are essential. They dynamically adjust voltage and frequency based on workload demands, allowing VDD2H to operate at lower voltages during low-speed tasks — which helps reduce power consumption and extend battery life.
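As a rough illustration of the idea (a sketch, not the JEDEC DVFSC/eDVFSC mechanism itself), such a policy maps the requested data rate to a rail voltage. The 1.05 V and 0.99 V points below are the nominal settings cited later in this article's footnotes; the 7.5 Gbps threshold and the mapping itself are assumptions for illustration:

```python
# Sketch of a DVFS-style policy (hypothetical thresholds, not the JEDEC
# mechanism): choose a VDD2H supply point based on the requested data rate,
# dropping the rail for slower operating modes to save power.
def select_vdd2h(data_rate_gbps: float) -> float:
    """Return an assumed VDD2H supply (volts) for a given LPDDR5X data rate."""
    if data_rate_gbps > 7.5:   # high-speed operation keeps the full rail
        return 1.05
    return 0.99                # low-speed tasks tolerate a reduced rail

# Example: a 9.6 Gbps burst runs at 1.05 V; a 3.2 Gbps idle mode at 0.99 V.
```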


High-performance components like memory banks and core logic continue to draw from higher-voltage rails such as VDD2H and VDD1 to maintain speed and responsiveness. In contrast, peripheral circuits and I/O functions operate on lower-voltage rails like VDD2L and VDDQ to reduce energy use during less demanding tasks.
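Summarizing the rail-to-function mapping above as a simple lookup (a simplified illustration; actual rail assignments vary by design and operating mode):

```python
# Simplified mapping of LPDDR5X supply rails to the functions they feed,
# per the description above; real assignments depend on design and mode.
rail_domains = {
    "VDD1": "core logic (higher-voltage rail)",
    "VDD2H": "memory banks and high-speed core paths",
    "VDD2L": "peripheral circuits during low-speed operation",
    "VDDQ": "I/O signaling",
}
```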

Micron’s innovation on lower VDD2H 

Separating VDD2H and VDD2L marked a pivotal advancement by introducing a new level of flexibility and efficiency in power delivery. By identifying components that didn’t require the full voltage of the original VDD2 rail, engineers enabled systems to operate at VDD2L during low-frequency activity, cutting power consumption without compromising responsiveness.

But the innovation didn’t stop there. Micron’s engineers discovered that even components powered by VDD2H could tolerate a lower voltage threshold. This led to the development of low VDD2H (LVDD2H) — a finely tuned, lower-voltage version of VDD2H. By pushing VDD2H closer to its minimum viable level, operating in LVDD2H mode delivers additional power savings on top of those already achieved through the VDD2L split.

Reducing the voltage of VDD2H, especially in high-speed operating modes, offers several key benefits:

  • Lower dynamic and static power consumption, reducing overall energy draw.
  • Improved thermal performance, as less power translates to less heat.
  • Extended battery life in mobile and embedded systems, where efficiency is critical.


LVDD2H operating modes

Through extensive testing and characterization, Micron engineers have defined two primary LVDD2H operating modes: nominal and minimum.

In nominal mode, to align with ecosystem capabilities, voltage levels are kept at 1.05V for 8.533–10.7 Gbps and reduced for data rates below 7.5 Gbps. In minimum mode, the LVDD2H can be reduced throughout the full data rate range. 

[Figure: Edge AI and DoU use-case power-savings comparison chart.]

Power savings enabled with LVDD2H 

Micron’s internal testing demonstrates the significance of LVDD2H across two main use cases: artificial intelligence and machine learning (AI/ML) and days of usage (DoU) [3]. In AI/ML workloads, reducing VDD2H from 1.060V to 0.98V delivered an average of up to 8% power savings [4]. The voltage reduction was tested on 12 different AI models, all of which demonstrated notable power savings. Among LLMs, Llama 2-13B showed the highest power benefit of up to 12%. These power savings can directly enhance end-user experiences across AI-driven features like voice assistants, photo processing, auto-correction and chatbots. For DoU scenarios, voltage reduction produced an average 5% power benefit across eight different use cases. DoU use cases encompass typical user activities that occur throughout the day on a mobile device, such as chatting on Facebook, listening to music, browsing the web, or watching videos.
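As a back-of-envelope sanity check (the assumptions here are mine, not Micron's measurements): if switching power on the VDD2H rail scales with the square of voltage, the drop from 1.060V to 0.98V trims that rail's dynamic power by roughly 14.5%; weighting that by an assumed VDD2H share of total device power lands near the reported figure:

```python
# Back-of-envelope check with assumed numbers: a V^2 scaling of switching
# power on the VDD2H rail, weighted by that rail's assumed share of total
# device power, approximates the reported device-level saving.
rail_saving = 1 - (0.98 / 1.060) ** 2   # ~14.5% saving on the rail itself
vdd2h_share = 0.55                      # assumed share of total device power
device_saving = vdd2h_share * rail_saving
print(f"Estimated device-level saving: {device_saving:.1%}")  # prints "Estimated device-level saving: 8.0%"
```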

What’s next

Lower voltage directly reduces power consumption, extending battery life in ways users can feel every day. As technology advances at an unprecedented pace, advanced memory solutions are essential to keep up. Micron is constantly working to maximize power efficiency and performance from its designs to create meaningful impact for end users, setting the benchmark for energy-efficient DRAM in the industry. By collaborating closely with ecosystem partners to support aligned innovation and industry-wide progress, we’re shaping the future of data. Our relentless pursuit of innovation and excellence keeps us ahead of the curve, delivering solutions that drive the next generation of experiences.

 

1. Counterpoint Research. The Ecosystem Driving AI’s Democratization in Smartphones. Published January 10, 2025. https://www.counterpointresearch.com/insight/post-insight-research-notes-blogs-the-ecosystem-driving-ais-democratization-in-smartphones/

2. IDC. Worldwide Generative AI Smartphone Forecast, 2024–2028. Published July 2024.

3. Test configuration was based on Qualcomm platform with 9.6 Gbps on 2R-1β LPDDR5X and eDVFSC enabled. Voltage scaling relative to LPDRAM frequency was fixed at 1.06V/0.98V due to a hardware limitation for the purpose of this report.

4. The voltage levels used in this report — 1.060V/0.98V — differ slightly from the nominal settings of 1.05V/0.99V due to the constraints of the test environment.

 

Sr. Manager Product Marketing, Mobile Business Unit

Rui Zhou

Rui is the Senior Manager of Product Marketing for the Mobile Business Unit at Micron Technology. With over 15 years of combined marketing and engineering experience in the high-tech industry, Rui is an expert in product positioning, value propositions and growth strategy. Rui holds an MBA from Portland State University, as well as an M.Sc. in IC design and a BEng in Electrical and Electronic Engineering from Nanyang Technological University in Singapore.