Kevin Fogarty of Semiconductor Engineering recently wrote an article about the future of machine learning. As Fogarty observes, machine learning involves executing complex calculations on huge volumes of data with increasing efficiency.
“Machine learning can apply to every corporate function and it can have an impact on companies in every part of the economy. So, it’s no surprise that funding is pouring into this sector,” writes Fogarty.
“A survey by McKinsey & Co showed that total investments in AI development tripled between 2013 and 2016. Most of that — $20 billion to $30 billion — came from tech giants. Those companies expect that machine learning and other AI models that descend from it will be as critical to their customers in the future as mobility and networking are now.”
What makes this technology so attractive, says Fogarty, is that machine learning and other forms of artificial intelligence (AI) can be broadly applied and still produce dramatic benefits. In fact, Gartner predicts that by 2020, AI technologies will be pervasive in new business software and will be a top-five investment priority for 30% of CIOs. Meanwhile, Linley Gwennap, principal analyst at the Linley Group, tells Semiconductor Engineering that the market for data center-oriented AI accelerators will reach $12 billion by 2022.
“During the next year or two we’ll start seeing a lot more choices out there for data centers and other devices,” Gwennap states. “So, the question facing the Googles and Facebooks of the world is, ‘Do I keep designing my own chips? Or, if I can get something just as good on the open market, should I do that?’”
According to Steven Woo, distinguished inventor at Rambus, the industry is currently fielding a number of AI and neural networking chips and cores.
“What’s happening at a higher level is they’re fusing information together. There is a lot of exploration going on. What you’re seeing now is a lot of companies looking for major markets to build infrastructure around. You see that with cell phones, where there are billions of units,” Woo tells Semiconductor Engineering. “Those are driving new packaging infrastructure. You also see it in automotive, which has a lot of money behind it. And with IoT, the potential is there, but the challenge is finding commonality. And with neural networking and machine learning, there seems to be new algorithms every week, which makes it hard to develop a single architecture. This is why you’re seeing so much interest in FPGAs and DSPs.”
Interested in learning more? The full text of “The Next Phase of Machine Learning” is available on Semiconductor Engineering.