Creating a Secure Infrastructure for Advanced AI Workloads

https://go.rambus.com/creating-a-secure-infrastructure-for-advanced-ai-workloads#new_tab

Bart Stevens, Senior Director of Product Marketing for Security IP, explores the security challenges of AI inference processing in both cloud-based data centers and edge devices in this webinar.

Security Challenges in a World of AI Everywhere

https://go.rambus.com/security-challenges-in-a-world-of-ai-everywhere#new_tab

In this webinar, Scott Best, Senior Director of Anti-Tamper Technology at Rambus, explores the security challenges associated with AI chips and their growing deployment at the edge.

From Training to Inference: HBM, GDDR & LPDDR Memory

https://go.rambus.com/from-training-to-inference-hbm-gddr-and-lpddr-memory#new_tab

Join Nidish Kamath, Director of Product Management for Rambus Memory Controller IP, as he dives into the HBM, GDDR, and LPDDR solutions that address AI training and inference workload requirements in this webinar.

Navigating the Dynamics of IP Licensing for Data Center & AI

https://go.rambus.com/navigating-the-dynamics-of-ip-licensing-for-data-center-and-ai#new_tab

Raj Uppala, Senior Director of Marketing and Partnerships at Rambus, explores the complexities of IP licensing for emerging data center and AI applications.

Why PCIe and CXL are Critical Interconnects for the AI Era

https://go.rambus.com/why-pcie-and-cxl-are-critical-interconnects-for-the-ai-era#new_tab

Join Lou Ternullo, Senior Director of IP Product Management at Rambus, as he discusses the critical role of PCIe and CXL interconnects in enabling the AI-driven data centers of tomorrow in this on-demand webinar.

The Road Ahead for Main Memory in the Data Center

https://go.rambus.com/the-road-ahead-for-main-memory-in-the-data-center#new_tab

In this webinar, Carlos Weissenberg, Product Marketing Manager for Memory Interface Chips at Rambus, discusses the increasing demands for memory driven by AI and high-performance computing.

Why Memory Matters for AI

https://go.rambus.com/why-memory-matters-for-ai#new_tab

In this roundtable discussion, memory experts Steve Woo, John Eble, and Nidish Kamath explore the critical role of memory in AI applications. They discuss how AI’s rapid evolution, especially with the growth of large language models, is driving the need for higher memory capacity, bandwidth, and power efficiency.

The Impact of AI 2.0 on Memory & Interconnect Technology

https://go.rambus.com/the-impact-of-ai-2-0-on-memory-and-interconnect-technology#new_tab

Steven Woo, Fellow and Distinguished Inventor at Rambus Labs, explores the transformative impact of AI 2.0 on memory and interconnect technology. He highlights how the rapid growth of AI models, now exceeding trillions of parameters, and the shift to multimodal systems have dramatically increased the demand for higher memory capacity and bandwidth.

Addressing AI’s Insatiable Demand For Power

https://www.forbes.com/councils/forbestechcouncil/2024/10/03/addressing-ais-insatiable-demand-for-power/#new_tab

The growth of AI has been staggering, and applications are emerging across the industry that offer new generative AI capabilities powered by large language models. The impact of these AI 2.0 applications is broad and fundamentally alters the way we interact with computers. However, this improvement in capabilities and performance has also been accompanied by […]

HBM4 Controller Product Brief

https://go.rambus.com/hbm4-controller-product-brief#new_tab

The Rambus HBM4 Controller is designed to help customers deploy a new generation of HBM memory for cutting-edge AI accelerators, graphics, and high-performance computing (HPC) applications. The Rambus HBM4 Controller supports a data rate of 10 Gigabits per second (Gbps) and delivers a total memory bandwidth of 2,560 Gigabytes per second (GB/s) or 2.56 […]
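The headline bandwidth figure follows directly from the per-pin data rate and the memory interface width. A minimal sketch of that arithmetic, assuming the 2048-bit interface width associated with HBM4 (an assumption; the width is not stated in the brief excerpt above):

```python
# Sketch of the bandwidth arithmetic behind the figures above.
# Assumes a 2048-bit HBM4 device interface; the controller itself
# is not modeled here.

DATA_RATE_GBPS = 10      # per-pin data rate, Gigabits per second
INTERFACE_BITS = 2048    # assumed HBM4 interface width, in bits

total_gbps = DATA_RATE_GBPS * INTERFACE_BITS   # aggregate Gigabits/s
bandwidth_gbs = total_gbps / 8                 # convert bits to bytes

print(f"{bandwidth_gbs:.0f} GB/s")  # 2560 GB/s, i.e. 2.56 TB/s
```

Under these assumptions, 10 Gbps across 2048 bits yields 20,480 Gbps, or 2,560 GB/s, matching the figure quoted in the product brief.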