News

JEDEC announced the publication of the HBM4 DRAM standard, delivering higher bandwidth, efficiency, and capacity for AI and ...
HBM is fundamental to the AI revolution as it ... “This shows Marvell has some big hyperscale XPU wins since this type of customization in the memory space does not happen for small orders.” ...
Selling Micron Technology in the current environment isn't sensible due to its strong US manufacturing footprint and ...
HBM enables lower power consumption per I/O and higher bandwidth ... “For example, if there are two different types of memories — A and B, for discussion — and they are run at the same clock frequency ...
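The comparison above turns on interface width: at the same clock, a memory with a much wider bus delivers far more bandwidth per device. A minimal sketch of that arithmetic, using representative (not article-sourced) figures for a 1024-bit HBM-style stack versus a 32-bit GDDR-style device:

```python
# Illustrative sketch: peak DRAM bandwidth from interface width and per-pin data rate.
# All numbers below are assumed, representative values, not figures from the article.

def peak_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gb/s."""
    return bus_width_bits / 8 * pin_rate_gbps

# Memory "A": a wide, HBM-style 1024-bit stack interface at a modest per-pin rate
hbm_style = peak_bandwidth_gbps(1024, 9.6)

# Memory "B": a narrow, GDDR-style 32-bit device even at a faster per-pin rate
gddr_style = peak_bandwidth_gbps(32, 16.0)

print(f"wide interface:   {hbm_style:.1f} GB/s per stack")
print(f"narrow interface: {gddr_style:.1f} GB/s per device")
```

Even with the narrower device clocked faster per pin, the wide stacked interface wins on aggregate bandwidth, which is also why HBM can run each I/O slower and cheaper in power terms.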
High-bandwidth memory (HBM) is again in the limelight. At GTC 2025, held in San Jose, California, from 17 to 21 March, SK hynix displayed its 12-high HBM3E devices for artificial intelligence (AI) ...
Rambus recently announced the availability of its new High Bandwidth Memory (HBM) Gen2 PHY. Designed for systems that require low latency and high bandwidth memory, the Rambus HBM PHY, built on the ...
Samsung Electronics anticipates a 21% decrease in Q1 profit, attributed to sluggish artificial intelligence chip sales and ...
Micron has become a third, and now significant, supplier of the stacked HBM memory used with GPUs and other kinds of AI accelerators. Significantly, Micron’s eight-high HBM3E stacks are used ...