
‘Marrying’ Enhanced Memory with Intel Xeon Processor Cores

In a world with no shortage of challenges to solve – curing cancer, slowing global warming, securing a nuclear stockpile – it’s critical to have technology that can keep up with and make use of growing heaps of data. It’s about more than the speed of data processing; it’s also about the sheer amount of data that can be processed and the rate at which memory can supply the processor.

Ugonna Echeruo, principal engineer in Intel’s Design Engineering Group and chief architect of the Intel® Xeon® CPU Max Series (code-named Sapphire Rapids with HBM), describes the challenge this way: At its most basic level, a CPU fetches information from memory, processes it and updates it. Eventually, the amount of information the CPU can work through is constrained by the size of the “pipe” bringing data to it. The bigger the pipe, the more information can flow to the CPU and the more tasks it can accomplish.
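To make the “pipe” description concrete, consider a minimal sketch of a memory-bandwidth-bound loop (an illustrative example, not code from Intel). The STREAM-style triad below performs only two floating-point operations for every 24 bytes it moves, so on arrays much larger than the caches it runs only as fast as memory can feed the cores; the array size and timing approach are assumptions made for the example.

```c
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* STREAM-style "triad": a[i] = b[i] + s * c[i].
 * Each iteration moves 24 bytes (two reads and one write of 8-byte doubles)
 * but does only 2 floating-point operations, so on arrays far larger than
 * the caches the loop is limited by memory bandwidth (the width of the
 * "pipe" feeding the cores), not by compute. */
int main(void) {
    const size_t n = 1u << 26;   /* ~67M doubles per array (~512 MB each), illustrative */
    const double s = 3.0;
    double *a = malloc(n * sizeof *a);
    double *b = malloc(n * sizeof *b);
    double *c = malloc(n * sizeof *c);
    if (!a || !b || !c) return 1;

    for (size_t i = 0; i < n; i++) { b[i] = 1.0; c[i] = 2.0; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < n; i++)
        a[i] = b[i] + s * c[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double gb   = 3.0 * n * sizeof(double) / 1e9;   /* total bytes moved, in GB */
    printf("a[n/2] = %.1f, effective bandwidth: %.2f GB/s\n", a[n / 2], gb / secs);

    free(a); free(b); free(c);
    return 0;
}
```

On a kernel like this, a wider memory pipe shows up directly as a higher measured GB/s figure, while adding more or faster cores changes little.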

Echeruo explains that Intel focused on enabling a new solution category that targets workloads bound by memory bandwidth. Right now, the answer to that constraint is high bandwidth memory, or HBM, although he says the solution could change in the future. It allows the CPU to gobble up more data, leaving even the hungriest customers satisfied. And while 4th Gen Intel® Xeon® Scalable processors can handle hefty workloads on their own, Echeruo explains that the HBM-equipped part specifically targets workloads whose performance is constrained by memory bandwidth, or by both memory bandwidth and computational limits.
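A common back-of-the-envelope way to tell which category a workload falls into is a roofline estimate: attainable performance is the lesser of the machine’s peak compute rate and its memory bandwidth multiplied by the code’s arithmetic intensity (floating-point operations per byte moved). The sketch below only illustrates the idea; the peak-compute and bandwidth figures are placeholder assumptions, not published Xeon Max specifications.

```c
#include <stdio.h>

/* Roofline-style estimate: a kernel is memory-bandwidth bound when
 * peak_bandwidth * arithmetic_intensity is lower than peak compute.
 * The hardware numbers in main() are placeholders, not real specs. */
static double attainable_gflops(double peak_gflops, double peak_gbps,
                                double flops_per_byte) {
    double memory_roof = peak_gbps * flops_per_byte;
    return memory_roof < peak_gflops ? memory_roof : peak_gflops;
}

int main(void) {
    const double peak_gflops = 3000.0;  /* assumed per-socket compute peak */
    const double ddr_gbps    = 300.0;   /* assumed DDR-fed bandwidth per socket */
    const double hbm_gbps    = 1000.0;  /* assumed HBM-fed bandwidth per socket */

    /* Triad-like kernel: 2 FLOPs per 24 bytes of traffic. */
    const double ai = 2.0 / 24.0;

    printf("DDR-fed attainable: %.0f GFLOP/s\n",
           attainable_gflops(peak_gflops, ddr_gbps, ai));
    printf("HBM-fed attainable: %.0f GFLOP/s\n",
           attainable_gflops(peak_gflops, hbm_gbps, ai));
    return 0;
}
```

For a low-intensity kernel like the triad above, the memory roof is the binding one, so attainable performance scales almost linearly with bandwidth, which is exactly the case HBM is meant to address; a compute-dense kernel with many FLOPs per byte would hit the compute roof instead.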

Echeruo calls the Max Series CPU launch the most notable moment in his 20-plus-year career at Intel. And for good reason. The Intel Xeon CPU Max Series processor is the first and only x86-based processor with HBM on the chip.

Customers Ask, Intel Delivers

Echeruo says customers – like government research labs, federal agencies and universities – are at the forefront of why the Max Series CPU exists today.

“Customers are running applications that require a lot of memory bandwidth and they are limited by the bandwidth in existing products,” he says. And those same customers have been asking Intel to boost that bandwidth to meet their demands.

Consider, for example, a lab that does high performance computing with mammoth amounts of data. With a typical setup (sans HBM), each researcher would need to use many compute nodes to produce a solution. Thanks to HBM, Max Series CPUs deliver higher performance and memory bandwidth – without requiring code changes – helping researchers accomplish the same task with fewer resources, which increases the lab’s overall productivity and energy efficiency.
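As a rough, hypothetical sizing exercise in the same spirit (the bandwidth figures below are assumptions chosen for easy arithmetic, not benchmark results): for a bandwidth-bound job, the number of nodes needed scales inversely with the bandwidth each node can supply, so tripling per-node bandwidth cuts the node count to roughly a third.

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical sizing exercise: how many nodes does a bandwidth-bound
 * job need to sustain a given aggregate memory bandwidth?
 * All figures are illustrative assumptions, not measurements. */
int main(void) {
    const double job_gbps      = 12000.0;  /* aggregate bandwidth the job needs */
    const double ddr_node_gbps = 600.0;    /* assumed per-node bandwidth without HBM */
    const double hbm_node_gbps = 2000.0;   /* assumed per-node bandwidth with HBM */

    int ddr_nodes = (int)ceil(job_gbps / ddr_node_gbps);
    int hbm_nodes = (int)ceil(job_gbps / hbm_node_gbps);

    printf("Nodes needed without HBM: %d\n", ddr_nodes);  /* 20 */
    printf("Nodes needed with HBM:    %d\n", hbm_nodes);  /* 6  */
    return 0;
}
```

Needing fewer nodes for the same job is where the productivity and energy-efficiency gains described above would come from, under these assumptions.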

HBM 101: Back to Basics

Imagine the CPU as the internal combustion engine in your car. The car’s performance is capped because only so much air can be pumped into the combustion chamber and mixed with gas. Then comes the addition of turbos and superchargers – or, in the CPU’s case, high bandwidth memory. Now you can force even more air in, and off you go, faster than ever!

Over time, CPU “pipes” (remember Echeruo’s description from the beginning?) have moved more air to the figurative combustion chamber: the pipes have grown wider, and the throughput of the interface between the memory and the CPU has increased. With those wider pipes, more capable CPUs can handle more data and customers are happier. With HBM, CPU capability takes that next turbocharged step.

Challenges Worth Solving

Echeruo explains that the “real estate” – the proximity of HBM to the CPU cores – is key to its success. The HBM sits in the CPU package, right next to the processor, making for a quick, convenient hop to fetch the necessary information. And a bonus: That proximity saves power.

But make no mistake, he says, it is not as easy as just sticking HBM to the CPU package.

“There were a lot of challenges for all of the different teams involved,” says Echeruo. He explains that the team was working with 4th Gen Intel Xeon processors after the design was set, so lots of testing and validation was needed to make sure their HBM plan would be successful. “We wanted to take this enhanced memory system and attach it to the best compute cores at Intel, our Intel Xeon cores, and marry the two,” says Echeruo.

“We had to look at individual IP within the product to make sure there wasn’t anything that would conflict with HBM, and we needed to make sure we could fully utilize as much bandwidth as possible.” He adds, “We had to figure out how to make changes necessary for HBM to be successful and functional while not impinging on the schedule and delivery of the standard product.”

Stacking up to the Competition

Speaking of power efficiency, Echeruo says that’s another advantage of the Max Series CPU. Not only does the proximity of the HBM to the CPU save power, but so does getting by with fewer systems and less memory capacity than users would generally need without HBM. So, you can likely say goodbye to many of those RAM sticks and say hello to HBM.

Intel is the first to add HBM to an x86 processor, which Echeruo says is a “major advantage for Intel.” Looking to the future, Echeruo believes the key is “to leverage our software stack and make HBM more attainable and user-friendly for customers.”

Speaking of user-friendliness, unlike GPUs with HBM, the Max Series CPU doesn’t require labor-intensive code changes, saving time and effort. “The less work customers have to do, the happier they are in the end,” says Echeruo, “and ultimately that’s great for Intel, too.”

And the innovation doesn’t stop there. A few weeks ago at the ISC High Performance ’23 conference, Intel showcased an upcoming high memory bandwidth product: a future Intel Xeon processor, code-named Granite Rapids, with Multiplexer Combined Ranks (MCR) memory.

A Better World, Thanks to HBM

Xeon Max is going to be used to extend the basic scientific understanding of the world. Echeruo says that this is, in part, what drives him forward.

“These products will be put into servers that drive companies and national labs to work on fundamental science, medicine or cloud infrastructure now and in the future,” Echeruo says.
