In this webinar, Nidish Kamath, Director of Product Management for Rambus Memory Controller IP, dives into the HBM, GDDR, and LPDDR solutions that address the requirements of AI training and inference workloads.
Navigating the Dynamics of IP Licensing for Data Center & AI
Raj Uppala, Senior Director of Marketing and Partnerships at Rambus, explores the complexities of IP licensing for emerging data center and AI applications.
Why PCIe and CXL are Critical Interconnects for the AI Era
In this on-demand webinar, Lou Ternullo, Senior Director of IP Product Management at Rambus, discusses the critical role of PCIe and CXL interconnects in enabling the AI-driven data centers of tomorrow.
The Road Ahead for Main Memory in the Data Center
In this webinar, Carlos Weissenberg, Product Marketing Manager for Memory Interface Chips at Rambus, discusses the increasing demands for memory driven by AI and high-performance computing.
Why Memory Matters for AI
In this roundtable discussion, memory experts Steven Woo, John Eble, and Nidish Kamath explore the critical role of memory in AI applications. They discuss how AI’s rapid evolution, especially with the growth of large language models, is driving the need for higher memory capacity, bandwidth, and power efficiency.
The Impact of AI 2.0 on Memory & Interconnect Technology
Steven Woo, Fellow and Distinguished Inventor at Rambus Labs, explores the transformative impact of AI 2.0 on memory and interconnect technology. He highlights how the rapid growth of AI models, now exceeding trillions of parameters, and the shift to multimodal systems have dramatically increased the demand for memory capacity and bandwidth.