As machine learning and artificial intelligence (AI) become more sophisticated and advanced, practical uses for the technology continue to grow. Machine learning can advance scientific achievement, especially where medicine and space exploration are concerned. Since machine learning can handle vast quantities of data efficiently and is able to automate repetitive actions based on […]
It is increasingly hard to speak of the future of technology without mentioning the Internet of Things (IoT) and blockchain. The same could be said of machine learning. A Gartner research report, which forecasts that worldwide IT spending will reach $3.7 trillion in 2018, identifies Artificial Intelligence (AI) as a key driver […]
Consumer IoT components increasingly targeted by attackers
Earlier this month, the National Institute of Standards and Technology (NIST) published a document titled “Draft Report on International IoT Cybersecurity Standardization.” The report – which examines various aspects of the rapidly evolving Internet of Things – also takes a closer look at the potential risks of […]
Earlier this month, Semiconductor Engineering’s Ann Steffora Mutschler penned an article that takes a closer look at how buffering is gaining ground as a way to speed up the processing of increasingly large quantities of data. In simple terms, says Mutschler, a data buffer is an area of physical memory storage that temporarily stores data while it is […]
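The buffering concept Mutschler describes can be loosely illustrated in software (a hypothetical sketch of a fixed-capacity FIFO buffer, not tied to any hardware implementation; the `DataBuffer` class and its methods are invented for illustration):

```python
from collections import deque

class DataBuffer:
    """A fixed-capacity FIFO buffer: a producer appends data in bursts
    while a slower consumer drains it, smoothing out the flow."""

    def __init__(self, capacity):
        self._queue = deque()
        self._capacity = capacity

    def write(self, item):
        """Store an item; return False if the buffer is already full."""
        if len(self._queue) >= self._capacity:
            return False  # caller must wait, retry, or drop the data
        self._queue.append(item)
        return True

    def read(self):
        """Remove and return the oldest item, or None if the buffer is empty."""
        return self._queue.popleft() if self._queue else None

# A producer bursts four writes into a buffer that holds three items.
buf = DataBuffer(capacity=3)
results = [buf.write(x) for x in ("a", "b", "c", "d")]
print(results)                 # the fourth write is rejected: buffer full
print(buf.read(), buf.read())  # the consumer drains items in arrival order
```

The key property, as in the hardware case, is that the buffer decouples the rates of producer and consumer: the producer is only stalled (or data dropped) once the temporary storage fills up.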
There is a common notion that the arrival of Artificial Intelligence (AI) will spell the end of many entry-level jobs. Companies such as McDonald’s are replacing human servers with machines in response to a rise in the minimum wage, with a former CEO saying that “robots are going to replace people in the service industry […]
Steven Woo, the vice president of systems and solutions and distinguished inventor in Rambus’ Office of the CTO, recently authored an article for Semiconductor Engineering that explores the data center in 2018 and beyond. As Woo observes, there are a number of trends that continue to challenge the design of conventional von Neumann architecture, including […]
Comprehensive solution including memory, PHY, controller and verification IP for ASIC and FPGA to enable GDDR6 adoption beyond graphics
Boise, Idaho, January 23, 2018 (GLOBE NEWSWIRE) — Micron Technology, Inc. (NASDAQ:MU), a leading memory and storage provider, today announced, together with Rambus Inc., Northwest Logic and Avery Design, their efforts to deliver a comprehensive solution for […]
The origins of GDDR
The origins of modern graphics double data rate (GDDR) memory can be traced back to GDDR3 SDRAM. Designed by ATI Technologies, GDDR3 made its first appearance in nVidia’s GeForce FX 5700 Ultra card, which debuted in 2004. Offering reduced latency and high bandwidth for GPUs, GDDR3 was followed by GDDR4, GDDR5, […]
BOISE, Idaho, Jan. 23, 2018 (GLOBE NEWSWIRE) — Micron Technology, Inc. (NASDAQ:MU), a leading memory and storage provider, today announced, together with Rambus Inc., Northwest Logic and Avery Design, their efforts to deliver a comprehensive solution for GDDR6, the world’s fastest discrete memory. This first-of-its-kind solution would enable GDDR6 use in advanced applications such as high-performance […]
Niraj Mathur, VP of high speed interface products at Rambus, recently penned an article for Semiconductor Engineering that explores the importance of PCI Express 4.0 in the data center. “Modern CPUs rely on the following primary interconnect types: memory interconnects, primarily supported by DDR4 today; high speed […]