Ask the Experts: The State of AI
Hear from Rambus Expert Steven Woo about the latest developments in AI and the implications for hardware and computing architecture. Learn about chain-of-thought and the innovations needed to support its
Steven Woo, Fellow and Distinguished Inventor at Rambus Labs, explores the transformative impact of AI 2.0 on memory and interconnect technology. He highlights how the rapid growth of AI models,
AI is a rapidly evolving space. With the meteoric rise of generative AI applications in the past year, we are now firmly in the era of AI 2.0. This is
Supercomputing 2023 brought together some of the brightest minds in the field of high-performance computing, showcasing the latest in exascale computing and the challenges faced in the pursuit of next-generation
Last week, I had the pleasure of hosting a panel at the AI Hardware & Edge AI Summit on the topic of “Memory Challenges for Next-Generation AI/ML Computing.” I was
Developments in generative AI and Large Language Models are moving at a lightning pace. Incredible amounts of data must be processed and moved to train models which at their largest

AI has transformed the semiconductor industry, impacting design, manufacturing, and the global economy at large. In 2025, AI adoption “in at least one business function”

Exciting developments in AI inference—like chain-of-thought prompting, which breaks down large, complex questions and prompts into smaller steps mimicking human reasoning—are opening the gates to

Rapid advancements in AI are becoming commonplace, driven by large language models (LLMs) that now exceed 1 trillion parameters. While these AI models are revolutionizing

The term “memory wall” was first coined in the 1990s to describe memory bandwidth bottlenecks that were holding back CPU performance. The semiconductor industry helped

The growth of AI has been staggering, and applications are emerging across the industry that offer new generative AI capabilities powered by large language models.

Businesses across industries are going all in on artificial intelligence (AI). Major tech companies have spent billions and plan to maintain, if not increase, their
