The Atari 2600 (or VCS) – which hit the nascent video game market back in 1977 – packed 128 bytes of RAM and an 8-bit MOS 6507 CPU clocked at a mere 1.19 MHz. According to Wikipedia, those 128 bytes held all run-time data, including the call stack and the state of the game world. There was no frame buffer.
Image Credit: Wikipedia (via Evan-Amos)
Fast-forward to 1996 and the launch of Nintendo’s N64. Powered by a 64-bit NEC VR4300 CPU clocked at 93.75 MHz, the fifth-generation console was one of the first to implement a unified memory subsystem, packing 4 MB of Rambus RDRAM (expandable to 8 MB via the Expansion Pak).
Image Credit: Wikipedia (via Yaca2671)
Driven by Moore’s Law and the demands of gaming enthusiasts, consoles rapidly grew more sophisticated as they pushed – and sometimes even shattered – the limits of bandwidth, capacity and graphics. By the time 2006 rolled around, Sony’s PlayStation® 3 boasted a 3.2 GHz Cell Broadband Engine with one PPE and seven SPEs, 256 MB of XDR DRAM and 256 MB of GDDR3 DRAM.
Image Credit: Wikipedia (via Evan-Amos)
A decade later, all eyes in the gaming world are fixed on virtual reality (VR) headsets such as Facebook’s Oculus Rift, Samsung’s Gear VR (powered by Oculus), HTC’s Vive and Sony’s PlayStation VR. Perhaps not surprisingly, implementations and requirements vary widely across devices. For example, Samsung’s Gear VR is designed to work with a compatible Galaxy device, which acts as the headset’s display and processor. The Gear VR unit itself serves as the controller, providing a wide field of view along with an inertial measurement unit (IMU) for rotational tracking.
Image Credit: Oculus Rift
In contrast, the system requirements for the Rift (which is tethered to a PC) stipulate an NVIDIA GTX 970 or AMD Radeon R9 290 GPU, an Intel Core i5-4590 CPU, 8 GB of RAM and an HDMI 1.3 video output supporting a 297 MHz clock.
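That 297 MHz figure refers to the HDMI pixel clock, which has to carry blanking intervals in addition to visible pixels. As a rough sanity check – a back-of-the-envelope sketch, not an official Oculus calculation – here is how it compares against the headset’s active pixel rate:

```python
# Back-of-the-envelope check on the Rift's 297 MHz HDMI pixel clock.
# Resolution and refresh rate come from the published Rift specs;
# the implied blanking overhead is derived here for illustration only.

H_ACTIVE, V_ACTIVE = 2160, 1200   # combined resolution across both displays
REFRESH_HZ = 90
PIXEL_CLOCK_HZ = 297e6            # HDMI 1.3 clock from the spec sheet

active_rate = H_ACTIVE * V_ACTIVE * REFRESH_HZ       # visible pixels/second
print(f"Active pixel rate: {active_rate / 1e6:.0f} MHz")        # ~233 MHz

# The gap between the pixel clock and the active rate is headroom
# for horizontal and vertical blanking intervals:
overhead = PIXEL_CLOCK_HZ / active_rate - 1
print(f"Implied blanking overhead: {overhead:.0%}")             # ~27%
```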
“A traditional 1080p game at 60Hz requires 124 million shaded pixels per second. In contrast, the Rift runs at 2160×1200 at 90Hz split over dual displays, consuming 233 million pixels per second,” Oculus’ Atman Binstock explained in a recent blog post.
[youtube https://www.youtube.com/watch?v=amtBUkmHS0w]
“At the default eye-target scale, the Rift’s rendering requirements go much higher: around 400 million shaded pixels per second. This means that by raw rendering costs alone, a VR game will require approximately 3x the GPU power of 1080p rendering.”
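Binstock’s arithmetic is easy to reproduce. The short sketch below (in Python, purely for illustration) recomputes the 1080p and Rift figures from resolution and refresh rate, and takes his roughly 400 million pixels-per-second eye-target number at face value, since the exact supersampling scale isn’t quoted here:

```python
# Reproducing the shaded-pixel throughput figures from Binstock's post.

def shaded_pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Raw shading throughput for a given render target and refresh rate."""
    return width * height * refresh_hz

p_1080p60 = shaded_pixels_per_second(1920, 1080, 60)  # ~124 million
p_rift    = shaded_pixels_per_second(2160, 1200, 90)  # ~233 million

# Binstock's ~400M figure reflects rendering above native panel
# resolution (the default eye-target scale); we simply compare it
# against the 1080p baseline rather than deriving the scale factor.
p_eye_target = 400e6

print(f"1080p @ 60 Hz : {p_1080p60 / 1e6:.0f} Mpix/s")
print(f"Rift native   : {p_rift / 1e6:.0f} Mpix/s")
print(f"Eye target    : ~{p_eye_target / p_1080p60:.1f}x the 1080p workload")
```

That last ratio – roughly 3.2x – is where the “approximately 3x the GPU power of 1080p rendering” rule of thumb comes from.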
In the future, says Binstock, successful consumer VR will likely drive changes in GPUs, OSs, drivers, 3D engines, and apps, ultimately enabling much more efficient low-latency VR performance.
“It’s an exciting time for VR graphics, and I’m looking forward to seeing this evolution,” he added.
Indeed, VR has certainly come a long way since 1991, when the $60,000 Virtuality 1000CS made its way into the arcade scene. The unit featured a head-mounted display (HMD) for video and audio, and players used a 3D joystick to move through and interact with the virtual world. According to Tom’s Hardware, the system relied upon an Amiga 3000 to handle most of the game processing.
“Gaming may have significantly evolved over the years, but there is one constant that remains unchanged. Players are always seeking a more immersive experience, enabled by improvements in AI and more realistic and responsive graphics,” Steven Woo, Vice President of Systems and Solutions at Rambus, explained. “As such, gaming continues to be at the forefront, pushing the very limits of numerous technologies, including memory, processing and graphic capabilities.”
Woo, an engineer who participated in the development of memory technologies adopted in Sony’s PlayStation 2 and PlayStation 3 game consoles, also pointed out that VR, although quickly evolving, is still in a relatively nascent stage.
“As Oculus’ Atman Binstock noted, successful consumer VR is likely to drive changes in GPUs, OSs, drivers, 3D engines, and apps, ultimately enabling much more efficient low-latency VR performance,” Woo added. “I’m looking forward to seeing how VR will ultimately take advantage of new memory technology as it evolves over the next few years.”