The International Conference on Field-Programmable Logic and Applications (FPL) recently convened in Lausanne, Switzerland. As Christoforos Kachris, a senior researcher at the National Technical University of Athens, notes, the role of FPGAs in the data center took center stage at the conference.
“Christoph Hagleitner presented IBM’s view of the major applications where FPGAs can provide differentiation, such as cognitive computing, high-performance computing and the Internet of Things (IoT),” Kachris wrote in a recent EE Times article.
“[Meanwhile], P. K. Gupta, the general manager of Xeon+FPGA products in Intel’s data center group, said FPGAs can increase the performance of applications such as machine learning, cloud radio-access networks, edge computing and content delivery. Accelerators can [also] increase performance at lower total cost of ownership for targeted workloads.”
In addition, Microsoft’s Doug Burger detailed the evolution of the Project Catapult cloud FPGA architecture, while a number of papers presented by various engineers focused on the acceleration of machine learning and data analytics, as well as how to effectively deploy accelerators in the data center.
Commenting on the above, Steven Woo, VP of Systems and Solutions at Rambus, told us that data centers frequently aggregate numerous individual servers into a pool of processing units, with large, data-intensive tasks distributed across multiple racks of servers.
“However, this one-size-fits-all approach, typically characterized by a relatively fixed amount of compute, memory, storage and I/O resources in each server, frequently leads to an acute under-utilization of resources,” he explained. “This is because specific tasks may require a tailored amount of each compute resource in real time. For certain workloads, the legacy server architecture contributes to low CPU utilization rates, high latencies to access data, reduced power efficiency and increased TCO.”
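Woo's point about fixed resource ratios can be illustrated with a toy model. The server shape and workload numbers below are purely illustrative assumptions, not figures from the article: if every server ships with the same CPU-to-memory ratio, a memory-heavy job must be spread across enough servers to satisfy its memory demand, stranding most of the CPUs it allocates along the way.

```python
import math

# Toy model of resource stranding in fixed-configuration servers.
# The server shape and workload are illustrative, not real data.
SERVER = {"cpu": 32, "mem_gb": 256}  # every server has this fixed shape

def servers_needed(workload):
    """Servers required to cover the workload's peak demand in each dimension."""
    return max(math.ceil(workload["cpu"] / SERVER["cpu"]),
               math.ceil(workload["mem_gb"] / SERVER["mem_gb"]))

def utilization(workload):
    """Fraction of each allocated resource the workload actually uses."""
    n = servers_needed(workload)
    return {"cpu": workload["cpu"] / (n * SERVER["cpu"]),
            "mem": workload["mem_gb"] / (n * SERVER["mem_gb"])}

# A memory-heavy analytics job: modest CPU needs, 1 TB of memory.
job = {"cpu": 16, "mem_gb": 1024}
print(servers_needed(job))  # 4 servers, driven entirely by memory
print(utilization(job))     # CPU utilization: 16/128 = 12.5%
```

The memory requirement forces four servers, so 128 cores are reserved to use 16, which is exactly the under-utilization Woo describes.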
As Woo points out, this is precisely why a number of companies are using FPGAs to evolve a more modular and effective approach to data centers, acceleration, HPC and beyond.
“At Hot Chips, for example, Baidu engineers confirmed the company is using FPGAs to accelerate SQL queries,” said Woo. “Meanwhile, DeePhi is looking toward reconfigurable devices such as FPGAs for deep learning.” Indeed, according to DeePhi CEO and co-founder Song Yao, FPGA-based deep learning accelerators already meet ‘most’ requirements, with high on-chip memory bandwidth, acceptable power and performance, as well as support for customized architecture.
From a broader perspective, says Woo, the days of relying on Moore’s Law and Dennard Scaling to optimize performance and power efficiency have long since passed. Moving forward, the industry must focus on advances in system architecture to drive effective improvements.
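The Dennard-scaling argument Woo alludes to can be made concrete with the textbook dynamic-power relation P ≈ α·C·V²·f: in the classic scaling era, capacitance and supply voltage shrank with feature size, so chips got faster at constant power density; once voltage stopped scaling, shrinking alone began to raise power density instead. The sketch below uses idealized per-generation scaling factors (an assumption for illustration, not process data):

```python
# Idealized Dennard scaling sketch. Each generation shrinks linear
# dimensions by k = 0.7 (so area roughly halves). Classic scaling also
# reduced capacitance C and voltage V by k, letting frequency f rise by
# 1/k while power density stayed constant. Factors are textbook
# idealizations, not measurements from any real process.

def dynamic_power(C, V, f, alpha=1.0):
    """P_dyn ≈ alpha * C * V^2 * f (CMOS switching power)."""
    return alpha * C * V * V * f

k = 0.7  # per-generation linear shrink factor
area_ratio = k * k  # transistor area shrinks by k^2

p0 = dynamic_power(C=1.0, V=1.0, f=1.0)      # baseline generation

# Classic Dennard era: C and V scale by k, f scales by 1/k.
p1 = dynamic_power(C=k, V=k, f=1 / k)
print(p1 / p0)                 # power per transistor falls to k^2 = 0.49
print((p1 / p0) / area_ratio)  # power density stays constant (1.0)

# Post-Dennard: V no longer scales; same shrink and frequency gain.
p2 = dynamic_power(C=k, V=1.0, f=1 / k)
print((p2 / p0) / area_ratio)  # power density grows by 1/k^2 ≈ 2x per node
```

With voltage fixed, each shrink roughly doubles power density in this idealized model, which is why the industry has turned to architectural specialization, FPGAs included, rather than relying on process scaling alone.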
“FPGAs will exist alongside other silicon, while playing a critical role in helping to evolve computing platforms by enabling flexible acceleration and near data processing,” he concluded.