A team of researchers from UC Santa Barbara and Intel Labs has proposed a groundbreaking platform for neuromorphic computing, significantly enhancing energy efficiency and potentially transforming the future of AI and IoT technologies.
Computers have made remarkable strides in processing power, prediction capabilities and data communication, often surpassing human capabilities. Yet, in one critical area, they lag significantly behind the human brain — energy efficiency.
“The most efficient computers are still approximately four orders of magnitude — that’s 10,000 times — higher in energy requirements compared to the human brain for specific tasks such as image processing and recognition,” Kaustav Banerjee, a professor of electrical and computer engineering at UC Santa Barbara and an authority in nanoelectronics, said in a news release.
Banerjee emphasizes the importance of developing more energy-efficient computing technologies, especially given global energy consumption trends.
“Making computers more energy efficient is crucial because the worldwide energy consumption by on-chip electronics stands at #4 in the global rankings of nation-wise energy consumption, and it is increasing exponentially each year, fueled by applications such as artificial intelligence,” he added.
The urgency of this challenge is magnified by contemporary concerns such as global warming.
Enter neuromorphic (NM) computing — a promising avenue for addressing this energy efficiency gap. By mimicking the brain’s structure and functionality, where processing happens in parallel across numerous low-power neurons, NM computing can potentially match the brain’s energy efficiency.
Recent advancements were showcased in a paper published in Nature Communications. Banerjee and his colleagues Arnab Pal, Zichun Chai, Junkai Jiang and Wei Cao, along with Intel Labs’ researchers Vivek De and Mike Davies, propose a revolutionary ultra-energy-efficient platform. This platform utilizes 2D transition metal dichalcogenide (TMD)-based tunnel-field-effect transistors (TFETs), bringing energy requirements to within two orders of magnitude (about 100 times) of the human brain’s.
Neuromorphic Computing in Focus
Neuromorphic computing has long been a theoretical concept but is now gaining traction thanks to recent progress in circuitry, which has enabled smaller and more efficient transistor arrays. This development is crucial for applications like artificial intelligence and the Internet of Things, driving the need for advanced hardware platforms.
Banerjee’s team is at the forefront of this evolution with its 2D tunnel transistors. These atomically thin, nanoscale devices, the product of extensive research efforts, deliver high performance while consuming minimal power and respond at low voltages, making them the foundation of the researchers’ NM platform and enabling it to emulate the brain’s energy-efficient operation.
The research indicates that these 2D TFETs suppress off-state currents and exhibit a low subthreshold swing (SS), a measure of how sharply a transistor switches between its off and on states. Banerjee highlights that a lower SS translates to a lower operating voltage and more efficient switching.
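To make the SS metric concrete, the sketch below computes subthreshold swing as the gate-voltage change needed per decade of drain-current change. The current curves are synthetic round numbers chosen for illustration, not measured data from the paper; the ~60 mV/decade figure for conventional transistors at room temperature is the well-known Boltzmann limit that steep-slope devices like TFETs aim to beat.

```python
import numpy as np

# Subthreshold swing SS = dVg / d(log10 Id), in mV per decade of current.
# Lower SS means the transistor turns on more sharply at lower voltage.
def subthreshold_swing_mv_per_dec(vg_volts, id_amps):
    """Average SS over the given gate-voltage sweep."""
    decades = np.log10(id_amps[-1]) - np.log10(id_amps[0])
    return 1e3 * (vg_volts[-1] - vg_volts[0]) / decades

vg = np.linspace(0.0, 0.3, 61)             # gate-voltage sweep (V)
# Hypothetical exponential subthreshold curves (illustrative only):
id_mosfet = 1e-12 * 10 ** (vg / 0.065)     # ~65 mV/dec, near the ~60 mV/dec Boltzmann limit
id_tfet = 1e-12 * 10 ** (vg / 0.030)       # steep-slope TFET, ~30 mV/dec

print(subthreshold_swing_mv_per_dec(vg, id_mosfet))  # → 65.0
print(subthreshold_swing_mv_per_dec(vg, id_tfet))    # → 30.0
```

The steeper (lower-SS) curve reaches the same on-current with a smaller voltage swing, which is why low SS allows a lower supply voltage.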
“Neuromorphic computing architectures are designed to operate with very sparse firing circuits, meaning they mimic how neurons in the brain fire only when necessary,” lead author Arnab Pal said in the news release.
This method contrasts with conventional computers, which continuously draw power and process data sequentially. Neuromorphic systems activate only when there is data to process, distributing memory and processing across transistors and significantly improving energy efficiency.
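The event-driven behavior described above can be sketched with a minimal leaky integrate-and-fire neuron, a standard abstraction in neuromorphic computing (not the paper's circuit). The neuron does work only when an input event arrives and fires only when its accumulated potential crosses a threshold; all parameter values here are illustrative assumptions.

```python
# Minimal event-driven leaky integrate-and-fire (LIF) neuron sketch.
# Between input events the neuron is idle, mirroring sparse firing.
class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential needed to fire (assumed value)
        self.leak = leak            # per-event decay factor (assumed value)
        self.potential = 0.0

    def receive(self, weight):
        """Process one input event; return True if the neuron fires."""
        self.potential = self.potential * self.leak + weight
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing, like a biological neuron
            return True
        return False

neuron = LIFNeuron()
spikes = [neuron.receive(w) for w in [0.4, 0.4, 0.4, 0.0, 0.9]]
print(spikes)  # → [False, False, True, False, False]
```

Note the sparsity: five input events produce a single output spike, and downstream circuitry would be activated only for that one event rather than on every clock cycle.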
While companies like Intel and IBM have developed brain-inspired platforms, these still lose significant energy through leakage currents in the transistors’ off-state. Traditional metal-oxide-semiconductor field-effect transistors (MOSFETs) used in current NM chips have high off-state leakage.
“Since the power efficiency of these chips is constrained by the off-state leakage, our approach — using tunneling transistors with much lower off-state current — can greatly improve power efficiency,” added Banerjee.
When integrated into neuromorphic circuits mimicking neuron firing and reset, TFETs outperformed state-of-the-art MOSFETs, particularly FinFETs, in energy efficiency. Although TFETs are still experimental, their demonstrated performance in NM circuits identifies them as promising candidates for the next generation of brain-inspired computing.