Micron strategically reorganizes business units to address AI data center demand.


Micron Technology has reorganized its business segments to address surging demand from artificial intelligence data centers. Central to this move is a new division, the Cloud Memory Business Unit, which focuses on high-bandwidth memory (HBM) chips that accelerate AI workloads in large-scale data centers. HBM chips work alongside AI graphics processors, such as Nvidia's, to handle data-intensive tasks. The new division will be led by Raj Narasimhan, who previously headed Micron's Compute and Networking Business Unit.
Alongside the Cloud Memory Business Unit, Micron has created three other business units: the Core Data Center Business Unit, the Mobile and Client Business Unit, and the Automotive and Embedded Business Unit. The Core Data Center unit will focus on memory and storage products for data center and enterprise applications. The Mobile and Client Business Unit will serve the mobile device market, while the Automotive and Embedded Business Unit will serve the automotive, industrial, and consumer technology segments. All of these units will be run by current Micron executives, preserving institutional knowledge while pursuing existing and new growth opportunities in those areas.

Creation of the Cloud Memory Business Unit

Micron has created the Cloud Memory Business Unit (CMBU), which focuses on the HBM market that is critical to AI workloads. The unit will design memory solutions for hyperscale cloud providers to accelerate AI tasks. HBM chips work in tandem with AI graphics processors, such as Nvidia's, to handle data-intensive operations.
The new division is intended to establish Micron more firmly in the markets it serves. Raj Narasimhan, who previously headed the Compute and Networking Business Unit, will lead it; the consolidation is expected to reduce costs, while his experience should help meet the growing demand for AI memory. The reorganization reflects Micron's strategy of preparing for the expansion of the AI market and asserting its leadership in memory solutions.

Introduction of Specialized Business Units

Micron's restructuring also includes three other units: the Core Data Center Business Unit, the Mobile and Client Business Unit, and the Automotive and Embedded Business Unit. The Core Data Center unit will focus on memory and storage products for data center and enterprise applications.
The Mobile and Client unit will serve the mobile device market, while the Automotive and Embedded unit will target the automotive, industrial, and consumer technology segments. All of these units will be headed by current Micron executives, allowing the company to build on their experience in each field.

Advancements in AI Memory Solutions

Micron continues to advance AI memory with its HBM3E 12H 36GB product, which offers 50% more capacity than the previous 8-high 24GB model. The part is designed to meet the performance demands of AI workloads while improving energy efficiency.
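As a quick sanity check on that figure, here is a minimal sketch of the capacity arithmetic, assuming (consistent with Micron's published HBM3E specifications) that both stacks use the same 24Gb (3 GB) DRAM dies and differ only in stack height:

# Capacity arithmetic behind the "50% more capacity" figure (illustrative only).
# Assumption: both stacks use 24Gb (3 GB) dies; only the stack height changes.
DIE_CAPACITY_GB = 3                     # one 24Gb DRAM die = 3 GB

hbm3e_8h_gb = 8 * DIE_CAPACITY_GB       # previous 8-high stack -> 24 GB
hbm3e_12h_gb = 12 * DIE_CAPACITY_GB     # new 12-high stack -> 36 GB

increase = (hbm3e_12h_gb - hbm3e_8h_gb) / hbm3e_8h_gb
print(f"{hbm3e_8h_gb} GB -> {hbm3e_12h_gb} GB ({increase:.0%} more capacity)")
# Prints: 24 GB -> 36 GB (50% more capacity)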
Micron also has SOCAMM (small outline compression attached memory module) in volume production. Compared with conventional RDIMMs, SOCAMM delivers roughly 2.5 times the bandwidth, making it well suited to AI servers and data-intensive applications. Its standard low-power DRAM interface keeps the module compact and reduces power consumption, making it a notable innovation in memory technology.

Strategic Focus on AI-Driven Growth

The restructuring is a strategic response to artificial intelligence as a key growth area. Micron has reported better-than-expected revenue, driven largely by growing adoption of HBM chips in AI systems. Its HBM chips, which pair with Nvidia processors, are sold out for 2024 and 2025, underscoring robust market demand.
By reorganizing its business segments and concentrating on memory solutions for the AI and data center markets, Micron aims to grow sales. The strategy is expected to strengthen the company's ability to deliver efficient memory solutions that keep pace with the evolving requirements of AI-based systems.

Did You Know?

Micron's high-bandwidth memory (HBM) products are expected to remain sold out through the end of 2025 because of surging AI demand. The new Cloud Memory Business Unit targets HBM for data centers, and Micron's SOCAMM modules deliver roughly 2.5 times the bandwidth of conventional RDIMMs. The restructuring positions Micron as a pivotal supplier of AI, mobile, and data center memory solutions.
