A recently published IDC white paper calls attention to the growing significance of chip connectivity. Entitled “Hidden Signals: The Memories and Interfaces Enabling IoT, 5G, and AI,” the paper states that data created by electronic systems will grow 10x, to 103 zettabytes, between 2014 and 2023. Moreover, it says the average server will support over 670 gigabytes of DRAM in 2023.
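To put that 10x growth figure in perspective, here is a quick back-of-the-envelope calculation of the compound annual growth rate it implies. The 2014 baseline of 10.3 zettabytes is an assumption derived from the cited 10x multiple, not a number from the paper itself.

```python
# Implied compound annual growth rate (CAGR) for a 10x increase
# over the 2014-2023 span cited by IDC.
# Assumption: the 2014 baseline is 103 ZB / 10 = 10.3 ZB.

start_zb = 10.3   # assumed 2014 baseline in zettabytes
end_zb = 103.0    # 2023 figure from the white paper

years = 2023 - 2014  # 9 compounding periods
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 29% per year
```

In other words, a 10x rise over nine years works out to data volumes growing by nearly a third every year.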
According to IDC, this market growth rests not only on interfaces, chip connectivity, and where data is stored in real time, but also on the connectivity between systems and the processors computing on that data. Technologies critical in this scenario include GDDR6, HBM, and memory buffers: the components that move, accelerate, and store data and that enable IoT, 5G, and AI. These are the trends driving new markets for the next 10 years, according to the white paper.
The 10-page paper provides the reader with a comprehensive market analysis of various product classifications, a historical perspective with market growth per category, trends, and detailed definitions.
As for memory and interface technology and associated trends, IDC explains that interfaces between high-performance data processors have been evolving because the market has recognized that monolithic, homogeneous microprocessors are insufficient to process and secure the volume and variety of data flooding into high-performance systems.
The paper adds that CPUs, GPUs, FPGAs, and ASICs all need external memories that hold data close to the processor, such as main memory for CPUs and frame buffers for GPUs. These needs put specific pressure on the interfaces between the processors and their memories. Interfaces and memories are thus inextricably intertwined, and both are critical to making the often-hidden data signals flow efficiently.
IDC also explains that systems like servers will need increasing amounts of DRAM. However, such increasing capacity also increases the burden on the system microprocessor, which must manage that memory across the memory bus. Memory buffer chips reduce that burden by bringing local intelligence to the memory on the memory modules.
Buffer chips on Load-Reduced DIMMs (LR-DIMMs), for example, reduce the load on the microprocessor’s memory controller and across the memory bus by handling all data, command, and control signals sent to memory, as well as all reads and writes to the DRAM chips.
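The load-reduction idea above can be sketched with a toy model. This is a simplified illustration, not a real LR-DIMM specification: the rank count and the one-load-per-rank assumption are hypothetical values chosen to show why placing a buffer between the controller and the DRAMs shrinks the electrical load the controller sees.

```python
# Toy model (assumed, not from the IDC paper or any DDR spec):
# on an unbuffered module, each rank presents roughly one electrical
# load per data line to the memory controller; on an LR-DIMM, the
# controller sees only the data buffer, which fans out to the DRAMs.

RANKS_PER_DIMM = 4  # assumed quad-rank module

def data_line_loads(load_reduced: bool) -> int:
    """Electrical loads per data line as seen by the memory controller."""
    if load_reduced:
        return 1  # only the buffer chip is visible to the controller
    return RANKS_PER_DIMM  # every rank loads the bus directly

print(data_line_loads(load_reduced=False))  # 4 loads, unbuffered quad-rank
print(data_line_loads(load_reduced=True))   # 1 load, behind the buffer
```

The fewer loads the controller has to drive, the cleaner the signaling, which is what lets systems populate more and larger DIMMs per channel.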
All in all, the IDC paper paints a favorable landscape and market outlook for IoT, 5G, and AI based on today’s Rambus memory and interface technologies. GDDR6 PHYs, 32 gigabit per second (Gbps) SerDes for 5G, 112 Gbps ADC/DSP-based SerDes for next-generation servers, and DDR5 buffer chips are leading the charge to further advance those product categories.
It’s definitely a good read for anyone who follows, or wants to learn more about, these industry advances and the technologies driving them. Grab a copy of the IDC white paper today.