Jul. 16, 2018 –
The embedded field-programmable gate array (eFPGA) is beginning to find a market, with communications leading the way but machine learning likely to drive broader adoption.
The eFPGA has had a checkered history since the concept first appeared in the 1990s, as concerns grew over the cost of mask sets as processes approached the 180nm node. But several attempts to kickstart the technology ran into issues of cost and density. The need to maintain programmability, particularly in the interconnect, meant many SoC designers opted for other ways to incorporate flexibility into their products.
The market seems to be changing as users look beyond die cost to the need to support fast-moving standards and novel algorithms that, in some cases, mutate from month to month. Moore's Law may have slowed considerably for silicon, but a similar exponential has emerged in machine learning: the number of papers on deep learning and associated algorithms appearing on the academic site arXiv is more than doubling every two years, according to a presentation by Jeff Dean of Google at the ScaledML conference earlier this year.