In many cases, the cost of a DDR4 memory kit is twice what it was a year ago, but if it’s any consolation, at least the market isn’t standing pat. Rambus, a company known equally well for developing memory technologies and for suing other firms over the use of its IP, announced that it has a functional DDR5 DIMM (dual in-line memory module) prototype.
Memory PHYs
Rambus announces industry’s first fully functional DDR5 DIMM
Specialist memory company Rambus has announced a fully functional DDR5 DIMM (dual in-line memory module) prototype. It claims to have achieved an industry first with its DIMM “capable of achieving the speeds required for the upcoming DDR5 standard”.
DDR5 Runs in Rambus’ Labs
Rambus has working silicon in its labs for DDR5, the next major interface for DRAM dual in-line memory modules (DIMMs). The register clock drivers and data buffers could help double the throughput of main memory in servers, probably starting in 2019 — and they are already sparking a debate about the future of computing.
SiFive’s Chief Executive on Opening a Chip Design Factory
Before he agreed to anything, Naveed Sherwani needed to make 40 phone calls. He had questions about the new RISC-V computer architecture and the company founded by its inventors, SiFive. He had been asked to run it.
Rambus Aims DDR4 PHY at Data Centers
TORONTO – Rambus Inc. said it has developed the first production-ready 3200 Mbps DDR4 PHY available on Globalfoundries Inc.’s FX-14 ASIC platform using its power-performance optimized 14nm LPP process.
The Rambus R+ DDR4 PHY intellectual property uses Rambus’ proprietary R+ architecture, based on the DDR industry standard. The PHY is part of Rambus’ suite of memory and SerDes interface offerings for networking and data center applications, said Frank Ferro, senior director of product management at Rambus. Meeting the performance and capacity demands of those segments is a heavy focus for the company, he said in a telephone interview with EE Times.
Cloud, Consumer and Things Drive Insatiable Demand for Memory
When a caller uses speech recognition on a mobile phone, the application runs mostly in the cloud. Functions such as data input via speech recognition require large amounts of memory to make the capability seamless and effective. Providing this functionality affects the architecture of both the mobile device and the cloud. Compute servers in the massive data centers run by Amazon, Google, and Facebook are vastly different from those in data centers a decade ago. These new centers rely on many powerful processors working in parallel, each computing a different element of the same task. To operate efficiently, these processors require large amounts of data stored in DRAM and flash in front of large hard-drive farms.