The long-term trend in computing has been to develop smaller, more powerful devices, culminating in today's supercomputing and parallel computing architectures. But with limitations imposed by energy consumption and scaling, scientists are being challenged to explore alternative architectures for future generations of computing. Penn State researchers believe they may have an answer in a brain-inspired device based on two-dimensional materials.
The device can provide more than yes-or-no answers, as it relies on a range of probabilistic responses that are compared with a learned database.
"Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending 'Dark Silicon' era that presents a severe threat to multi-core processor technology," the researchers note in an online issue of Nature Communications.
Because of thermal concerns, modern computer chips cannot have most of their individual transistors powered up at the same time. Most modern computers rely on the von Neumann architecture, which is based on a digital approach of "yes" or "no" answers. Program instructions and data are stored in the same memory and share the same communications channel.
"Because of this, data operations and instruction acquisition cannot be done at the same time," said Penn State researcher Saptarshi Das, assistant professor of engineering science and mechanics. "For complex decision-making using neural networks, you might need a cluster of supercomputers trying to use parallel processors at the same time—a million laptops in parallel—that would take up a football field. Portable healthcare devices, for example, can't work that way."
The researchers created a brain-inspired, analog, statistical neural network that provides a range of probabilistic responses compared against the learned database in the machine. To do this, the researchers developed a Gaussian field-effect transistor made of 2D materials—molybdenum disulfide and black phosphorus. These devices are more energy efficient and produce less heat, which makes them ideal for scaling up systems.
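The article does not spell out the network's exact mathematics, but the idea of comparing an input against a learned database through Gaussian responses matches the classic probabilistic neural network. As a hedged sketch (the class labels, example data, and `sigma` width are illustrative assumptions, not the researchers' actual parameters), it can be written in a few lines of Python:

```python
import math

def gaussian(x, center, sigma):
    """Bell-shaped response, echoing the Gaussian transfer curve of the device."""
    dist_sq = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-dist_sq / (2 * sigma ** 2))

def pnn_classify(sample, training_data, sigma=1.0):
    """Probabilistic neural network: each class's score is the averaged Gaussian
    response of the sample against every stored example of that class."""
    scores = {}
    for label, examples in training_data.items():
        scores[label] = sum(gaussian(sample, ex, sigma) for ex in examples) / len(examples)
    total = sum(scores.values())
    # Normalize to probabilities -- graded answers rather than a hard yes/no.
    return {label: s / total for label, s in scores.items()}

# Hypothetical two-feature examples standing in for a learned database.
training = {
    "awake":  [(0.9, 0.8), (1.0, 0.7), (0.8, 0.9)],
    "asleep": [(0.1, 0.2), (0.2, 0.1), (0.0, 0.3)],
}
probs = pnn_classify((0.15, 0.2), training, sigma=0.5)
```

Note that every stored example contributes directly at classification time, which is why such a network needs no lengthy iterative training: "learning" is just memorizing the database.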
"The human brain operates seamlessly on 20 watts of power," said Das. "It is more energy efficient, containing 100 billion neurons, and it doesn't use von Neumann architecture."
Scaling complex computing operations onto a small computer chip is the other key challenge.
"Size scaling has stopped," said Das. "We can only fit approximately 1 billion transistors on a chip. We need more complexity like the brain."
The researchers tested their neural network on human electroencephalograms (EEGs), graphical representations of brain waves. After being fed many example EEGs, the network could analyze a new EEG signal and determine whether the subject was sleeping.
"We don't need as extensive a training period or base of information for a probabilistic neural network as we need for an artificial neural network," said Das.
One potential application for statistical neural network computing is medicine, as diagnostic decisions are not always yes or no. The computing architecture could also suit medical diagnostic devices that need to be small, portable and use minimal energy.