Intel, partners make new strides in Loihi neuromorphic computing chip development

Intel has released new performance benchmarks for the Loihi neuromorphic computing processor, revealing improvements in power consumption and efficiency. 

During Intel’s virtual Lab Day on Thursday, the tech giant revealed Loihi improvements across artificial intelligence (AI) applications including voice command recognition, gesture recognition, image retrieval, search functions, and robotics.

Neuromorphic computing aims to move AI beyond rule-based, classical logic systems toward more flexible architectures that emulate human cognition, including contextual interpretation, sensory processing, and autonomous adaptation.

Intel says that neuromorphic computing focuses on emulating the human brain and implementing stable probabilistic computing, which creates “algorithmic approaches to dealing with the uncertainty, ambiguity, and contradiction in the natural world” — just like humans are capable of.

However, speaking to attendees of the virtual event, Rich Uhlig, VP and Director of Intel Labs, added a caveat: progress in AI has “come at the cost of ever-increasing power consumption, […] posing challenges for AI and the democratization of AI,” a bottleneck the company wants to overcome.

The Loihi research chip, containing 128 cores and fabricated on 14nm process technology, is Intel’s contribution to the field. Making its debut in 2017, the processor contains 130,000 ‘neurons’ for use in spiking neural networks (SNNs), in which neurons communicate through pulsed signals, or spikes, that carry encoded information and stimulate other neurons in the network.
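
As a rough illustration only (not Loihi’s actual hardware model), a single spiking neuron can be sketched as a leaky integrate-and-fire unit: it accumulates weighted input pulses, lets its potential decay over time, and emits a spike of its own once a threshold is crossed.

```python
import numpy as np

def leaky_integrate_and_fire(input_spikes, weight=0.6, leak=0.9, threshold=1.0):
    """Toy leaky integrate-and-fire neuron (illustrative only, not Loihi's model)."""
    potential = 0.0
    output = np.zeros_like(input_spikes, dtype=int)
    for t, spike in enumerate(input_spikes):
        potential = leak * potential + weight * spike  # decay, then integrate the input pulse
        if potential >= threshold:                     # fire once the threshold is crossed
            output[t] = 1
            potential = 0.0                            # reset after spiking
    return output

# A burst of closely spaced input spikes eventually drives the neuron over threshold.
print(leaky_integrate_and_fire(np.array([1, 1, 0, 1, 1, 1, 0, 0])))
```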


Intel’s latest Loihi performance benchmarks relate to voice command recognition, gesture recognition, image retrieval, search functionality, and robotics, areas of interest to partner companies experimenting with the next-generation AI technology.


Accenture, one of Intel’s Neuromorphic Research Community (INRC) members, compared Loihi to “standard” GPUs available in the market today. 

In particular, the company has been examining automotive AI, where the power consumption of standard GPUs is currently too high for advanced AI applications to be practical in vehicles.

Neuromorphic computing could pave the way for more intelligent applications to enter the space without draining a car’s battery. Working with automotive partners, Accenture is currently testing proof-of-concept automotive AI models that use Intel’s chip for voice commands. The models are being trained on simple instructions, including “lights on,” “lights off,” and “start engine,” using open source voice recording samples running on the Loihi processor.

Tests have achieved “acceptable” accuracy so far, but the real highlight is a reported 1,000-fold improvement in energy efficiency compared with a standard GPU. In addition, Loihi responded up to 200 milliseconds faster.
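
Accenture has not published its implementation, but the keyword-spotting task itself can be sketched with a conventional, non-spiking classifier over audio feature vectors. The snippet below is a minimal stand-in: the synthetic arrays take the place of real MFCC-style features extracted from open source voice clips, and a k-nearest-neighbors model plays the part of the recognizer.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for MFCC-style feature vectors extracted from voice clips;
# in a real pipeline each row would come from an audio recording of the command.
commands = ["lights on", "lights off", "start engine"]
X_train = np.vstack([rng.normal(loc=i, scale=0.3, size=(20, 13)) for i in range(len(commands))])
y_train = np.repeat(commands, 20)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# Classify a new (synthetic) utterance whose features sit near the "start engine" cluster.
new_clip = rng.normal(loc=2, scale=0.3, size=(1, 13))
print(clf.predict(new_clip))  # expected: ['start engine']
```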

The company and other INRC partners, including the University of California, Irvine, have also examined how Loihi deals with gesture recognition. Intel says that Loihi’s self-learning capabilities have resulted in “tangible” progress in this field, in which new gestures can now be learned in “just a few exposures” without the need to store vast amounts of data on AI hardware. 

“This could be applied to a variety of use cases, such as interacting with smart products in the home or touchless displays in public spaces,” the company added, with the overall aim of allowing us to interact more “naturally” with our devices.
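
As a highly simplified stand-in for this kind of few-shot learning (not Loihi’s on-chip learning rule), each gesture can be represented by a prototype, the average of a handful of example feature vectors, and new samples assigned to whichever prototype is closest. All names and data below are synthetic.

```python
import numpy as np

def build_prototypes(examples_by_gesture):
    """Average a few example feature vectors per gesture into one prototype each."""
    return {name: np.mean(examples, axis=0) for name, examples in examples_by_gesture.items()}

def classify(sample, prototypes):
    """Assign the sample to the gesture whose prototype is nearest (Euclidean distance)."""
    return min(prototypes, key=lambda name: np.linalg.norm(sample - prototypes[name]))

rng = np.random.default_rng(1)

# Three "exposures" per gesture, each a synthetic 8-dimensional feature vector.
examples = {
    "swipe_left":  rng.normal(loc=-1.0, scale=0.2, size=(3, 8)),
    "swipe_right": rng.normal(loc=1.0, scale=0.2, size=(3, 8)),
    "wave":        rng.normal(loc=0.0, scale=0.2, size=(3, 8)),
}

prototypes = build_prototypes(examples)
print(classify(rng.normal(loc=1.0, scale=0.2, size=8), prototypes))  # expected: swipe_right
```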

In image retrieval and product search applications, partners in the retail sector tested out Loihi’s processing capabilities against standard CPU and GPU solutions. According to the researchers, the neuromorphic computing chip could find and generate image feature vectors using less time and less energy than standard processors while “maintaining the same level of accuracy.”
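
The researchers have not detailed their pipeline, but the underlying retrieval task, turning each image into a feature vector and returning the closest stored vectors for a query, can be sketched as a plain nearest-neighbor search. The embeddings below are random stand-ins for whatever features the retail partners actually used.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "feature vectors" for a small catalogue of product images (128-dim each).
catalogue = rng.normal(size=(1000, 128))
catalogue /= np.linalg.norm(catalogue, axis=1, keepdims=True)  # normalize for cosine similarity

def retrieve(query_vector, features, top_k=5):
    """Return indices of the top_k catalogue images most similar to the query."""
    query = query_vector / np.linalg.norm(query_vector)
    scores = features @ query                 # cosine similarity against every image
    return np.argsort(scores)[::-1][:top_k]   # highest-scoring images first

# Query with a noisy copy of catalogue item 42; it should rank at or near the top.
query = catalogue[42] + 0.05 * rng.normal(size=128)
print(retrieve(query, catalogue))
```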

Image: Example approximation of gesture-based learning, which involves analyzing relationships between different data points.

Optimization and search workloads have also been explored. Intel says that during tests, the chip was able to solve problems such as constraint satisfaction (finding solutions that satisfy a set of imposed restrictions) over 1,000 times more efficiently and 100 times faster than standard CPUs.

For logistics and travel scheduling applications, for example, the new chip could reduce power requirements and bolster computing efficiency.
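
Loihi tackles these problems with spiking dynamics rather than conventional search, but the shape of a constraint satisfaction problem is easy to show with a tiny backtracking solver. The delivery-scheduling scenario below is invented purely for illustration.

```python
# Tiny backtracking solver for a toy scheduling constraint satisfaction problem:
# assign each delivery a time slot so that deliveries sharing a truck get different slots.
# (Purely illustrative; Loihi does not use backtracking search.)

SLOTS = ["morning", "midday", "evening"]
DELIVERIES = ["A", "B", "C", "D"]
SHARED_TRUCK = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")}  # pairs that must differ

def conflicts(delivery, slot, assignment):
    """True if giving this delivery this slot violates a shared-truck constraint."""
    return any(
        assignment.get(other) == slot
        for pair in SHARED_TRUCK if delivery in pair
        for other in pair if other != delivery
    )

def solve(assignment=None):
    """Depth-first backtracking: assign deliveries one by one, undoing dead ends."""
    assignment = assignment or {}
    if len(assignment) == len(DELIVERIES):
        return assignment
    delivery = next(d for d in DELIVERIES if d not in assignment)
    for slot in SLOTS:
        if not conflicts(delivery, slot, assignment):
            assignment[delivery] = slot
            result = solve(assignment)
            if result:
                return result
            del assignment[delivery]  # backtrack
    return None

print(solve())  # e.g. {'A': 'morning', 'B': 'midday', 'C': 'evening', 'D': 'morning'}
```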

When it comes to robotics, Rutgers and TU Delft have tested Intel’s chip to improve drone performance. The organizations found, once more, that the processor required far less power than standard mobile GPUs, leading to an overall 75% power efficiency boost. 


The expense of neuromorphic chip design and production will likely keep neuromorphic computing applications at the edge or confined to systems that are not cost-sensitive, Intel says, but the firm expects innovation in the field to eventually drive down the price of adopting neuromorphic technologies and push them firmly into the commercial sector.

In the end, Intel wants to create an architecture focused on power efficiency and speed, with little to no detriment to performance, delivering what the company calls “order of magnitude gains.”


Alongside the new neuromorphic computing results, Intel has also disclosed new INRC members. 

The INRC is a community spanning academic, government, and industry research groups interested in neuroscience research, neuromorphic computing applications, programming models, and sensor and control technologies suitable for commercial settings.

Since its launch in 2018, the INRC has signed up over 100 members, including Accenture, Airbus, GE, Berkeley, and Washington State University. Now, Lenovo, Logitech, Mercedes-Benz, and Prophesee have joined the roster.

“In two short years, we’ve formed a vibrant community comprising hundreds of researchers around the world inspired by the promise of neuromorphic computing to deliver orders of magnitude gains in computing efficiency, speed, and intelligent functionality,” commented Mike Davies, Intel Neuromorphic Computing Lab director. “For the first time, we are seeing a quantitative picture emerge that validates this promise. Together with our INRC partners, we plan to build on these insights to enable wide-ranging disruptive commercial applications for this nascent technology.”

Intel says the work conducted by INRC members will contribute to the firm’s next-generation neuromorphic research chip, which will be “coming soon.”
