Leti’s Chief Scientist Presents Optimistic Vision for Neuromorphic Hardware and Ultra-Low-Power Microdevices for Edge Computing at ISSCC

Abstract:
CEA-Leti’s chief scientist today issued a forward-looking call to action for the microelectronics industry to create a radically new, digital-communication architecture for the Internet of Things in which “a great deal of analytics processing occurs at the edge and at the end devices instead of in the Cloud”.

San Francisco, CA | Posted on February 13th, 2018

Delivering a keynote presentation at the kickoff of ISSCC 2018, Barbara De Salvo said this architecture will include human-brain-inspired hardware coupled to new computing paradigms and algorithms that “will allow for distributed intelligence over the whole IoT network, all the way down to ultralow-power end devices.”

“We are entering a new era where artificial-intelligence systems are … shaping the future world,” said De Salvo, who also is Leti’s scientific director. “With the end of Moore’s Law in sight, transformative approaches are needed to address the enduring power-efficiency issues of traditional computing architectures.”

The potential efficiencies of processing data at the edge of networks – e.g., by small computers located near IoT-connected devices – rather than in distant data centers or the Cloud are increasingly cited as long-term goals for the Internet of Things. But the challenges to realizing this vision are formidable: battery-powered IoT devices, for example, lack both the processing power to analyze the data they collect and a power source that could sustain such analysis.

To break through these barriers, De Salvo called for a “holistic research approach to the development of low-power architectures inspired by the human brain, where process development and integration, circuit design, system architecture and learning algorithms are simultaneously optimized.” She envisions optimized neuromorphic hardware as a highly promising solution for future ultralow-power cognitive systems that extend well beyond the IoT.


“Emerging technologies such as advanced CMOS, 3D technologies, emerging resistive memories, and silicon photonics, coupled with novel brain-inspired paradigms, such as spike-coding and spike-time-dependent-plasticity, have extraordinary potential to provide intelligent features in hardware, approaching the way knowledge is created and processed in the human brain,” she said.

De Salvo’s presentation, “Brain-Inspired Technologies: Towards Chips that Think”, included summaries of key research findings in a variety of fields that will play a role in developing brain-inspired technologies for computing and data-handling requirements of a “hyperconnected” world.


Human Brain Research
Tracing major discoveries about how the brain works, De Salvo cited the emergence of connectionism, novel neuroimaging techniques and the functioning of neural networks, which may provide models for brain-inspired technologies.

“The large-scale neuronal networks of the brain are arranged globally as hierarchical modular networks, with dense modules at the local level (cellular circuits, laminar compartments) that are encapsulated in increasingly larger modules (cortical columns, areas and whole lobes), but with very sparse overall connectivity,” she said. “Such a topology fundamentally enhances the brain’s dynamic stability and information-processing abilities.

“An important research target will be to understand how the three-dimensional organization of brain cells (neurons and glial cells), connected in networks within the layers of the brain cortex, is responsible for the emergence of genetically determined elementary operations,” De Salvo said.
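
To make the “dense local modules, sparse overall connectivity” topology De Salvo described concrete, the following toy construction (our illustration, not drawn from the presentation) builds a network with dense connections inside small modules and only sparse connections between them, then measures how low the overall connection density remains. All sizes and probabilities are arbitrary illustrative choices.

```python
# Toy "modular" network: dense wiring inside modules, sparse wiring between them.
# Module sizes and connection probabilities are arbitrary illustrative values.
import numpy as np

rng = np.random.default_rng(0)
n_modules, module_size = 8, 50          # 8 modules of 50 nodes each
p_intra, p_inter = 0.30, 0.005          # dense within modules, sparse between
n = n_modules * module_size

adj = np.zeros((n, n), dtype=bool)
for m in range(n_modules):
    lo, hi = m * module_size, (m + 1) * module_size
    adj[lo:hi, lo:hi] = rng.random((module_size, module_size)) < p_intra
adj |= rng.random((n, n)) < p_inter     # sparse inter-module links
np.fill_diagonal(adj, False)            # no self-connections

density = adj.sum() / (n * (n - 1))
print(f"overall connection density: {density:.3f}")  # stays low despite dense modules
```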

Hyperconnectivity and Deep Learning
She also noted that the convergence of miniaturization, wireless connectivity, increased data-storage capacity and data analytics has positioned the Internet of Things at the epicenter of profound social, business and political changes.

“With billions of easy-access and low-cost connected devices, the world has entered the era of hyperconnectivity, enabling people and machines to interact in a symbiotic way (anytime, anywhere) with both the physical and cyber worlds,” De Salvo said. “AI has been at the center of this revolution.”

She cited significant gains in the performance and applications of machine learning, driven by the vast stores of image, video, audio and text data available across the Internet. Equally essential have been dramatic improvements in learning/training approaches and algorithms, as well as the increased computational power of computers, including parallel computing for neural-network processing, which has compensated for the slowing of Moore’s Law below the 10nm node. Deep learning is currently the most prominent of these machine-learning fields.

“Today, for tasks such as image or speech recognition, machine-learning applications are equaling or even surpassing expert human performance,” De Salvo said. “Other tasks considered as extremely difficult in the past, such as natural language comprehension or complex games, have been successfully tackled.” Future applications will require even more analysis, understanding of the environment and intelligence, and machine-learning algorithms will require more computing power to become pervasive.

Approaching the Edge
“Bringing intelligence to the edge or to end-devices means doing useful processing of the data as close to the collection point as possible, and allowing systems to make some operational decisions locally, possibly semi-autonomously,” De Salvo explained.

Local, real-time decision-making is essential for many applications, from landing drones to navigating driverless cars. “The delay caused by the round-trip to the Cloud could lead to disastrous or even fatal results,” she said. “Privacy will require that key data not leave the user’s device, while transmission of high-level information, generated by local neural-network algorithms, will be authorized.”
Raw video generated by millions of cameras will have to be analyzed locally to limit bandwidth demands and communication costs. For all these reasons, new concepts and technologies that can bring artificial intelligence closer to the edge and end-devices are in high demand.
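
As a minimal illustration of this edge-processing principle (a hypothetical sketch, not Leti’s design or any specific product), the pseudo-device below analyzes each raw frame locally and transmits only compact, high-level events to the Cloud. The function and event names are invented for illustration.

```python
# Sketch of local processing at an edge node: heavy per-frame analysis stays on
# the device; only small, high-level events cross the network.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional, Tuple, List

@dataclass
class Event:
    timestamp: float
    label: str
    confidence: float

def detect_event(timestamp: float, frame: List[float]) -> Optional[Event]:
    """Stand-in for local neural-network inference on one raw frame.
    A real device would run a small, quantized model here."""
    activity = sum(frame) / len(frame)       # toy "motion" score
    if activity > 0.8:                       # illustrative threshold
        return Event(timestamp, "motion", activity)
    return None

def edge_node(frames: Iterable[Tuple[float, List[float]]],
              send_to_cloud: Callable[[Event], None]) -> None:
    """Process every raw frame locally; only compact events leave the device."""
    for timestamp, frame in frames:
        event = detect_event(timestamp, frame)   # heavy work stays at the edge
        if event is not None:
            send_to_cloud(event)                 # a few bytes instead of a raw frame

# Usage: two toy frames, one quiet and one active; only the second is transmitted.
edge_node([(0.0, [0.1, 0.2, 0.1]), (1.0, [0.9, 0.95, 0.92])], send_to_cloud=print)
```

The design point is that the frequent, compute-heavy work (per-frame analysis) never leaves the device, while only rare, small messages cross the network.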

“The primary design goal in distributed applications covering several levels of hierarchy (similar to what happens in the brain) is to find a global optimum between performance and energy consumption,” De Salvo said. “This imperative requires a holistic research approach, where the technology stack (from device to applications) is redesigned.”

This process is underway. Companies are addressing embedded applications by developing specialized edge platforms that can execute machine-learning algorithms on embedded hardware. Impressive power improvements (down to a few watts) have been achieved by exploiting Moore’s Law, pushing FinFET technology down to the 7nm node, and by hardware-software co-optimization. To optimize energy efficiency in mobile devices, several research groups have focused on hardware designs of Convolutional Neural Network (CNN) accelerators. De Salvo noted that off-chip storage devices, such as DRAMs, significantly increase power consumption, but that mobile-oriented applications (keyword spotting and face detection) have been demonstrated with a low-power programmable deep-learning accelerator that consumes less than 300µW.
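
For readers unfamiliar with the workloads such accelerators run, the sketch below shows a tiny keyword-spotting CNN in PyTorch. It is a generic software illustration, not the architecture of the chips De Salvo cited; the layer sizes, the 40x49 MFCC input shape and the depthwise-separable structure are assumptions chosen only to show how parameter and multiply-accumulate counts can be kept small enough for on-chip memory.

```python
# Illustrative tiny keyword-spotting CNN (not the accelerator hardware itself).
import torch
import torch.nn as nn

class TinyKWS(nn.Module):
    def __init__(self, n_keywords: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),  # 1xHxW MFCC "image"
            nn.ReLU(),
            # depthwise + pointwise pair keeps multiply-accumulate counts low
            nn.Conv2d(16, 16, kernel_size=3, padding=1, groups=16),
            nn.Conv2d(16, 32, kernel_size=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_keywords)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = TinyKWS()
mfcc = torch.randn(1, 1, 40, 49)            # batch of one 40x49 MFCC spectrogram
print(model(mfcc).shape)                    # torch.Size([1, 10])
print(sum(p.numel() for p in model.parameters()), "parameters")  # small enough for on-chip SRAM
```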

‘Extremely Critical’ Power Requirements
Solutions to some challenges are still at the discussion stage. For example, De Salvo noted that bringing intelligence into low-power IoT-connected end-devices that support applications such as habitat monitoring and medical surveillance is significantly more difficult than bringing it to traditional networked mobile devices at the edge. “Most connected end devices are wireless sensor nodes containing microcontrollers, wireless transceivers, sensors, and actuators,” she said. “The power requirement for these systems is extremely critical – less than 100μW for normal workloads – as these devices often operate using energy-harvesting sources or a single battery for several years.”
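
A back-of-the-envelope calculation shows why the sub-100µW figure is so critical. Assuming a common 3V coin cell with roughly 225mAh of capacity (our assumption, not a number from the talk), the stored energy supports well under a year of operation at 100µW average power, so multi-year lifetimes demand much lower average draw, aggressive duty cycling or energy harvesting:

```python
# Rough battery-lifetime estimate; capacity and voltage are assumed values for a
# typical 3 V coin cell, and leakage/conversion losses are ignored.
capacity_ah = 0.225                          # assumed coin-cell capacity, Ah
voltage = 3.0                                # nominal cell voltage, V
energy_j = capacity_ah * voltage * 3600      # ~2430 J of stored energy

for power_w in (100e-6, 10e-6):
    lifetime_years = energy_j / power_w / (3600 * 24 * 365)
    print(f"{power_w * 1e6:.0f} uW average -> {lifetime_years:.1f} years")
# ~0.8 years at 100 uW, ~7.7 years at 10 uW of average power draw.
```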

De Salvo said scientists inspired by the human brain, whose computing performance and efficiency remain unmatched, are pursuing a radically different approach to neuromorphic systems. “It consists in implementing bio-inspired architectures in optimized neuromorphic hardware to provide direct one-to-one mapping between the hardware and the learning algorithm running on it,” she said. These architectures include spike coding, which encodes neuron values as pulses or spikes rather than analog or digital values, and spike-timing-dependent plasticity (STDP), a bio-inspired algorithm that enables unsupervised learning.
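
The sketch below is a minimal numerical illustration of those two ideas: activity represented as spike times rather than values, and a pair-based STDP rule that strengthens a synapse when the presynaptic spike precedes the postsynaptic one and weakens it otherwise. The time constants, amplitudes and spike trains are illustrative choices, not parameters from any Leti design.

```python
# Pair-based STDP on spike-coded activity: pre-before-post potentiates the
# synapse, post-before-pre depresses it. All parameters are illustrative.
import numpy as np

tau_plus, tau_minus = 20.0, 20.0     # ms, STDP time constants
a_plus, a_minus = 0.01, 0.012        # potentiation / depression amplitudes

def stdp_dw(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post -> potentiation
        return a_plus * np.exp(-dt / tau_plus)
    return -a_minus * np.exp(dt / tau_minus)   # post before pre -> depression

w = 0.5                               # initial synaptic weight
pre_spikes = [10.0, 50.0, 90.0]       # spike-coded activity: times, not values
post_spikes = [12.0, 48.0, 95.0]
for t_pre, t_post in zip(pre_spikes, post_spikes):
    w = float(np.clip(w + stdp_dw(t_pre, t_post), 0.0, 1.0))
    print(f"pre={t_pre:5.1f} ms, post={t_post:5.1f} ms, w={w:.3f}")
```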

3D Technologies: Key Enablers of Neuromorphic Hardware
De Salvo said the human brain’s intelligence and efficiency are strongly linked to its extremely dense 3D interconnectivity. For example, there are approximately 10,000 synapses per neuron, and billions of neurons in the human brain cortex. “The hierarchical structure in the cortex follows specific patterns, through vertical arrangements, or µcolumns, where local data flow on specialized subcortical structures, and laminar interconnections, which foster inter-area communication and build the hierarchy.

“Based on these considerations, it is clear that emerging 3D technologies, such as through-silicon vias and 3D monolithic integration, also called CoolCube™, will be a key enabler of efficient neuromorphic hardware,” she said.

Outlining silicon technologies that will be vital to creating brain-inspired hardware, De Salvo also cited resistive memories (ReRAM), fully depleted silicon-on-insulator (FDSOI) and silicon photonics.
“Thanks to its suitability for low-power design, FDSOI technology is a great candidate for neuromorphic hardware,” she said. In deep-learning architectures, high-performance reconfigurable digital processors based on 28nm FDSOI have shown power consumption in the range of 50mW, an efficiency achieved by introducing an optimized data-movement strategy and exploiting FDSOI back-biasing. De Salvo also noted that a large-scale multi-core neuromorphic processor called Dynap-SEL, also based on 28nm FDSOI, was recently demonstrated.

The Road Ahead
De Salvo concluded her optimistic presentation about the future and potential of brain-inspired hardware, AI and edge computing by listing the technological challenges on the road to that goal. But she left the audience with one final prediction.

“New materials to interface devices with living cells and tissues, new design architectures for lowering power consumption, data extraction and management at the system level, and secured communications are the next domains that will experience intense development,” she said. “Brain-inspired implantable microdevices, acting as intelligent neuroprostheses, and bio-hybrid systems represent the new era of cross-disciplinary brain-repair strategies, where biological and engineered solutions will complement each other, probably mediated by artificial intelligence.”

####

About Leti
Leti, a technology research institute at CEA Tech, is a global leader in miniaturization technologies enabling smart, energy-efficient and secure solutions for industry. Founded in 1967, Leti pioneers micro- & nanotechnologies, tailoring differentiating applicative solutions for global companies, SMEs and startups. Leti tackles critical challenges in healthcare, energy and digital migration. From sensors to data processing and computing solutions, Leti’s multidisciplinary teams deliver solid expertise, leveraging world-class pre-industrialization facilities. With a staff of more than 1,900, a portfolio of 2,700 patents, 91,500 sq. ft. of cleanroom space and a clear IP policy, the institute is based in Grenoble, France, and has offices in Silicon Valley and Tokyo. Leti has launched 60 startups and is a member of the Carnot Institutes network. Follow us on www.leti-cea.com and @CEA_Leti.

CEA Tech is the technology research branch of the French Alternative Energies and Atomic Energy Commission (CEA), a key player in innovative R&D, defence & security, nuclear energy, technological research for industry and fundamental science, identified by Thomson Reuters as the second most innovative research organization in the world. CEA Tech leverages a unique innovation-driven culture and unrivalled expertise to develop and disseminate new technologies for industry, helping to create high-end products and provide a competitive edge.

Contacts:
Press Contact
Agency
+33 6 74 93 23 47

Copyright © Leti


