The Link Between Energy and Information for the Design of Neuromorphic Systems

Sensors Insights by Narayan Srinivasa

Introduction

Recent understanding of our nervous system reveals that sensory organs are optimized for efficient uptake of information from the environment to the brain. While the bulk of the energy consumed in this process goes to generating asynchronous action potentials for signal communication, evolutionary pressure has further optimized this mode of communication within the brain for efficient representation and processing of information.

We will explore this link between energy and information to reveal some of the mechanisms, circuits, and energy-efficient codes in the brain. The implications of this link will be important to consider for the design of neuromorphic systems of the future. Applications of these features are currently being tested and seem to hold promise for developing devices capable of exhibiting intelligence at the edge.

The term neuromorphic electronics was originally coined [1] to describe electronic analog circuits that mimic neurobiological circuits and architectures in the nervous system. This field has since evolved into a broader community, now called neuromorphic engineering/computing, that covers analog, digital, and mixed-mode analog/digital VLSI circuits as well as software systems. In this article, we will look at some of the constraints and features of the brain that have evolved over millions of years and that we believe are very important to understand in order to build future neuromorphic circuits and systems.

The mammalian brain is unique in that it serves as an interface between morphology, physiology, and behavior. The evolution of the brain was shaped by two key selective pressures. The first was the need to generate adaptive behavior via information processing under changing conditions so the animal could survive. The second was the cost of that benefit: the energy consumed to generate these adaptive behaviors. Essentially, the brain evolved to balance energy consumption with information processing in order to survive and thrive in a dynamically changing environment. The brain architecture that emerged from evolution reflects this exquisite balancing act. We will explore the link between energy and information to reveal some of the mechanisms, circuits, and energy-efficient features in the brain. These features offer cues and constraints from a design perspective for future neuromorphic systems.

The mammalian nervous system is believed to consume 20% of the body's total available energy despite being just 2% of total body mass [2]. The bulk of this energy is consumed in performing three functions (Figure 1). The first is action potential generation, or spike generation. Spikes are pulses generated at the neuron based on incoming synaptic currents to communicate the flow of information to other neurons in the brain. Higher rates of spike generation require more energy and are related to the signal bandwidth that can be supported.

Fig. 1: The three main energy-consuming factors during action potential generation affect signal speed, quality, and transmission.

The second function is maintaining the signal quality of these spikes along the axon of the source neuron after the spike is generated at the neural cell body. This can be viewed as a noise-filtering process, or enhancing the signal-to-noise ratio of the spike signals. The third function is the synaptic transmission of the spike across the synaptic cleft between the axonal arbor of the source neuron and the dendrite of the recipient neuron. The nature of this transmission has a significant impact on both the energy consumed and the information transmitted.

The spikes that are transmitted between any pair of neurons are typically asynchronous in nature; that is, the timing between spikes is not synced to any clock. The nature of the neural code that carries information using spikes is still under debate. Experiments, however, confirm that asynchronous spike timing is the mode of communication that maximizes information transmission rates between neurons [3]. Neuronal morphology has evolved to deliver the information carried by these spikes at the lowest allowed energy. For example, the majority of neurons in the brain are small and carry lower information rates but operate at much higher energy efficiency due to the reduced cost of spike maintenance and propagation. The distribution of axonal tracts in the brain is skewed towards thin axons, which have lower firing rates than thicker ones, further conserving the energy consumed for transmitting information [4]. The various channels on the neurons control the kinetics, sensitivity, and neuronal threshold (Figure 2). Synaptic current scaling is another homeostatic mechanism found in neurons to control their firing rates, thus balancing information rates with energy consumed [5].

Fig. 2: The gated channels on the neuron control the kinetics (diffusion), sensitivity (chemical), and threshold (voltage), which in turn affect the information processing rates and energy consumed by the neuron.
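To make the energy-information argument above more concrete, the short Python sketch below estimates the information carried by an asynchronous, Poisson-like spike train observed at a fixed timing resolution, using the standard approximation of log2(e/(rΔt)) bits per spike when r·Δt is small. The firing rates, timing resolution, and energy cost per spike are illustrative assumptions rather than measured values; the point is simply that lower-rate channels (like thin axons) deliver fewer bits per second but more bits per spike, and hence more bits per joule.

```python
import numpy as np

# Entropy of an asynchronous (Poisson-like) spike train observed with timing
# resolution dt: roughly log2(e / (r * dt)) bits per spike when r * dt << 1.
def bits_per_spike(rate_hz, dt_s=1e-3):
    return np.log2(np.e / (rate_hz * dt_s))

ENERGY_PER_SPIKE_J = 1e-9   # hypothetical cost per spike, for illustration only

for rate in [1.0, 10.0, 100.0]:
    bps = bits_per_spike(rate)
    print(f"{rate:6.1f} Hz: {bps:5.2f} bits/spike, "
          f"{rate * bps:7.1f} bits/s, {bps / ENERGY_PER_SPIKE_J:.2e} bits/J")
```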

The biophysics at a synapse is also optimized for energy-efficient information processing. Neurotransmitter release in the synaptic cleft between the axonal terminal of a pre-synaptic neuron and a dendritic spine of a post-synaptic neuron is probabilistic in nature [6]. If the probability of transmitter release is p and the probability of a presynaptic spike is s, then the information transmitted at a biological synapse grows linearly with increasing p for lower s but less-than-linearly for higher s. Since lower s implies lower energy consumed, the information transmitted per joule of energy is higher for lower s, irrespective of p. In addition to synaptic transmission, there is evidence for various forms of plasticity at the synapse and neuron (see Table 1) that further enable optimal filtering of information for efficient information transfer. This enables continuous learning, which in turn enables adaptive behaviors.

Table 1: Various forms of plasticity enable optimal filtering of information for efficient information transfer.
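As a rough illustration of the release-probability tradeoff described above, the sketch below models the synapse as a simple binary channel: a presynaptic spike occurs with probability s, and transmitter release follows a spike with probability p (a Z-channel). It computes the mutual information between spike and release and divides it by an energy cost assumed proportional to the release rate s·p. Both the cost model and the sample values of s and p are assumptions for illustration; the qualitative trend, more bits per unit energy at lower s, matches the argument above.

```python
import numpy as np

def h2(q):
    """Binary entropy in bits (0*log 0 treated as 0)."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def synapse_info_bits(s, p):
    """Mutual information between a presynaptic spike (probability s)
    and transmitter release (probability p given a spike): a Z-channel."""
    return h2(s * p) - s * h2(p)

# Energy is assumed proportional to the release rate s*p (illustrative model).
for s in [0.05, 0.2, 0.8]:
    for p in [0.25, 0.5, 1.0]:
        info = synapse_info_bits(s, p)
        print(f"s={s:.2f} p={p:.2f}: {info:.3f} bits, "
              f"{info / (s * p):.2f} bits per unit energy")
```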

At the architectural level, there are several features that reflect the link between energy and information for efficient behavior. Neurons are densely connected within short spatial scales using thin axonal tracts. These distributions of dense, short connections are "tuned" to extract information from each modality. For example, dense short connections in area V1 of the visual cortex are selective for local image gradients [11]. However, adaptive behavior requires many of these local computations to be integrated rapidly. To enable this, the brain has also evolved thicker and longer axons to integrate information spread over several low-information-rate thin axonal tracts.
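A quick way to see the benefit of mixing dense local wiring with a few long-range projections, as described above and in the next paragraph, is to compare a purely local ring lattice with a small-world graph in which a small fraction of local edges is rewired into long-range connections. The sketch below uses NetworkX's Watts-Strogatz generator; the network size and rewiring probabilities are arbitrary choices for illustration, not anatomical numbers.

```python
import networkx as nx

# Compare a purely local ring lattice (p_rewire = 0) with small-world graphs in
# which a small fraction of edges is rewired into long-range "projections".
n, k = 1000, 10   # nodes ("neurons") and local connections per node
for p_rewire in [0.0, 0.01, 0.1]:
    g = nx.connected_watts_strogatz_graph(n, k, p_rewire, seed=0)
    L = nx.average_shortest_path_length(g)   # hops needed to integrate information
    C = nx.average_clustering(g)             # how locally dense the wiring remains
    print(f"rewire p={p_rewire:5.2f}: path length {L:6.2f}, clustering {C:.3f}")
```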

This mix of dense short-range and sparse long-range connections forms a small-world network [12], ensuring the rapid information exchange and integration necessary for adaptive behaviors while conserving energy. Another important structural feature that promotes information coding while conserving energy is balanced synaptic currents [13]. The brain has fundamentally two types of neurons: roughly 80% are excitatory, meaning they excite any neuron they connect to, and the remaining 20% are inhibitory, meaning they inhibit any neuron they connect to (Figure 3).

Fig. 3: Brain architecture is composed of microcircuits with both excitatory and inhibitory neurons that produce balanced currents at each neuron.

The interesting aspect is that these populations of neurons exhibit an exquisite balance between excitatory and inhibitory currents. This ensures that neurons do not spike often, which in turn promotes efficiency in both information coding and energy consumption.
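The toy simulation below illustrates this with a single current-based leaky integrate-and-fire neuron driven by Poisson input from a hypothetical pool of 100 excitatory and 25 inhibitory afferents (the 80/20 split mentioned above). All weights, rates, and thresholds are illustrative assumptions; the qualitative outcome is that excitation alone drives frequent, regular spiking, whereas adding balancing inhibition cancels most of the mean drive and leaves only sparse, irregular, fluctuation-driven spikes.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lif(n_exc, n_inh, w_exc, w_inh, rate_hz=10.0,
                 t_sim=2.0, dt=1e-4, tau=0.02, v_th=15.0, v_reset=0.0):
    """Current-based LIF neuron with Poisson excitatory/inhibitory inputs.
    Voltage is in mV relative to rest; returns the number of output spikes."""
    v, spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        exc = rng.poisson(n_exc * rate_hz * dt)   # excitatory input spikes this step
        inh = rng.poisson(n_inh * rate_hz * dt)   # inhibitory input spikes this step
        v += (-v / tau) * dt + exc * w_exc + inh * w_inh
        if v >= v_th:
            spikes += 1
            v = v_reset
    return spikes

# Excitation only: mean drive sits well above threshold -> regular, frequent firing.
print("excitation only:", simulate_lif(100, 0, 2.0, 0.0), "spikes in 2 s")
# Balanced E/I: inhibition cancels the mean drive -> sparse, fluctuation-driven firing.
print("balanced E/I   :", simulate_lif(100, 25, 2.0, -8.0), "spikes in 2 s")
```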

The ultimate economy of spikes is realized when there is minimal redundancy in signal representation. However, the cost of recovering from an action potential, relative to the cost of inactivity, should also be factored into the economy of impulses. In fact, networks with the largest representational capacity are not, in general, optimal when energy expenditures are considered (see Figure 4(a)). Increased energy expenditure per neuron implies a decrease in average firing rate if energy-efficient information transmission is to be maintained [14].

Fig. 4: (a) Three networks with different sizes can still have the same representational capacity depending on the ratio of active to inactive neurons. However, the energy efficiency of the network depends on the cost of maintaining active versus inactive neurons. (b) The memory capacity of networks can be traded for readout efficiency (which is easier for networks with sparse activity).
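In the spirit of the energy-efficient coding argument of [14], the sketch below treats each neuron as a binary unit that is active with probability p and asks which p maximizes the information (entropy) per unit of energy, assuming one unit of resting cost plus an assumed spike-to-rest cost ratio when active. The cost ratios are made-up values; the trend is that as spikes become relatively more expensive, the optimal activity level drops, and so does the fraction of the full representational capacity that is actually used.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p = np.linspace(1e-4, 0.5, 5000)     # probability that a neuron is active
for spike_to_rest in [2, 10, 100]:   # assumed ratio of spike cost to resting cost
    efficiency = h2(p) / (p * spike_to_rest + 1.0)   # bits per unit energy
    p_opt = p[np.argmax(efficiency)]
    print(f"spike/rest cost = {spike_to_rest:3d}: optimal activity p = {p_opt:.3f}, "
          f"{h2(p_opt):.2f} of a possible 1.00 bit per neuron")
```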

At the same time, sparse firing of neurons in a given population offers the best tradeoff between memory capacity and readout efficiency, that is, the ability to decode information (see Figure 4(b) above) [15]. Finally, an important feature of the brain is its ability to process information in an asynchronous fashion. This "just in time" mode of operation allows the brain to conserve energy by spiking very sparsely. Actions and information coding happen as and when the prerequisite actions or coding steps are completed.

The computational elements such as neurons and synapses, combined with architectural aspects that strongly link energy and information processing, offer a blueprint for the design of neuromorphic systems. There are two important aspects that need to be the focus of such efforts (Figure 5).

Fig. 5: The two aspects of information coding for the design of neuromorphic systems are controlling the biophysics of the anatomical elements and the encoding strategies at multiple spatial and temporal scales.

The first is to make the signals that carry information energetically cheap by controlling the biophysics of both neuronal and synaptic circuits. The brain teaches us that a mix of low-voltage, mixed-mode (i.e., analog and digital) circuits is most efficient, while signals for encoding and communication should take the form of spikes in an asynchronous, event-driven mode. A related and very important aspect is to ensure that these hybrid circuits also cover various forms of plasticity in a programmable manner. This is important to enable systems to learn to exhibit adaptive behaviors.
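A minimal software sketch of what asynchronous, event-driven operation with programmable plasticity might look like is given below. Spikes are represented as address-event tuples, work is performed only when an event arrives, and a simple pair-based potentiation rule updates a programmable weight table. The SpikeEvent structure, the weight-table layout, and the plasticity constants are hypothetical choices for illustration, not a description of any particular chip.

```python
import math
from dataclasses import dataclass

@dataclass
class SpikeEvent:
    t_us: int        # microsecond timestamp
    neuron_id: int   # address of the spiking neuron (AER-style)

weights = {(0, 2): 0.4, (1, 2): 0.3}   # programmable synapse table: (pre, post) -> weight
last_spike_us = {}                     # most recent spike time seen per neuron
A_PLUS, TAU_US = 0.01, 20_000          # illustrative pair-based potentiation parameters

def on_event(ev: SpikeEvent):
    """Event-driven handler: computation happens only when a spike arrives."""
    last_spike_us[ev.neuron_id] = ev.t_us
    # Potentiate synapses whose presynaptic neuron fired shortly before this one.
    for (pre, post), w in weights.items():
        if post == ev.neuron_id and pre in last_spike_us:
            dt = ev.t_us - last_spike_us[pre]
            if 0 < dt < 5 * TAU_US:
                weights[(pre, post)] = w + A_PLUS * math.exp(-dt / TAU_US)

for ev in [SpikeEvent(1_000, 0), SpikeEvent(1_500, 1), SpikeEvent(2_000, 2)]:
    on_event(ev)
print(weights)
```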

The second focus is on system-level architectural design to enable efficient coding. The architecture should use a "compute in a sea of memory" approach, where memory is locally accessible to all computing elements. The plastic nature of the synapses requires many read and write operations to and from these memory elements, which must therefore be locally accessible. The highly dense connectivity of the neurons will require a huge amount of memory, and future neuromorphic systems will have to address this challenge of dense memory structures.
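The toy class below sketches one way to think about "compute in a sea of memory": each core keeps its neurons' membrane state and the weights of the synapses that target them in the same local structure, so the frequent weight reads and writes required by plasticity and inference never leave the core. The class name, sizes, and update rule are assumptions for illustration, not a hardware design.

```python
import numpy as np

class NeuroCore:
    """Toy neuromorphic core: neuron state and incoming synaptic weights
    are stored together, so memory access during computation stays local."""
    def __init__(self, n_neurons, n_inputs, seed=0):
        rng = np.random.default_rng(seed)
        self.v = np.zeros(n_neurons)                          # membrane state (local)
        self.w = rng.uniform(0, 0.1, (n_neurons, n_inputs))   # incoming weights (local)

    def step(self, input_spikes, threshold=0.3, leak=0.9):
        self.v = leak * self.v + self.w @ input_spikes        # local weight reads
        fired = self.v >= threshold
        self.v[fired] = 0.0                                   # reset neurons that spiked
        return fired

core = NeuroCore(n_neurons=8, n_inputs=32)
input_spikes = (np.random.default_rng(1).random(32) < 0.2).astype(float)
print("fired neurons:", np.flatnonzero(core.step(input_spikes)))
```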

Recent nanodevice-based memories seem to hold promise in addressing this challenge [16]. Another important aspect is to enable programmable connectivity so that large-scale networks of neurons can be programmed to connect in specific ways (e.g., the cortical layered architecture of the brain) to address various application needs. The other challenge is to ensure that the neuronal and synaptic elements are connected while maintaining their asynchronous mode of communication, to ensure both energy-efficient and adaptive information processing capabilities.

At Eta Compute, we are exploring a new approach to the design of neuromorphic systems based on these principles. We are focused on signal encoding and representation by incorporating time in the form of action potentials, or spikes. The second and equally important aspect is to operate the chip in a fully asynchronous fashion, based on our patented Delay Insensitive Asynchronous Logic (DIAL) technology.

By combining the asynchronous mode of chip operation with the asynchronous mode of learning using spikes, we have developed several applications for energy- and memory-constrained devices that can interchangeably learn and make inferences on data [17]. We are exploring some of the other ideas described above to further improve our neuromorphic solutions, with the goal of enabling a whole new breed of intelligent devices at the edge in the future.

 

References

  1. C. A. Mead, Neuromorphic Electronic Systems, Proc. of IEEE, vol. 78, no. 10, pp. 1629-1636, 1990.
  2. S. B. Laughlin, Energy as a constraint on the coding and processing of sensory information, Current Opinion in Neurobiology, vol. 11, pp. 475–480, 2001.
  3. S. P. Strong, R. Koberle, R. R. de Ruyter van Steveninck & W. Bialek, Entropy and information in neural spike trains, Physical Review Letters, vol.  80, pp. 197-200, 1998.
  4. D. Paré, E. Shink, H. Gaudreau, A. Destexhe, E. J. Lang, Impact of spontaneous synaptic activity on the resting properties of cat neocortical pyramidal neurons in vivo, Journal of Neurophysiology, vol. 79, pp. 1450–1460, 1998.
  5. G. G. Turrigiano, Homeostatic synaptic plasticity: local and global mechanisms for stabilizing neuronal function, Cold Spring Harbor Perspective in Biology, 4, a005736, 2012.
  6. J. J. Harris, R. Jolivet, D. Attwell, Synaptic energy use and supply, Neuron, vol. 75, pp. 762–777, 2012.
  7. G. Q. Bi and M. M. Poo, “Synaptic modifications in cultured hippocampal neurons: dependence on spike timing, synaptic strength and postsynaptic cell type,” Journal of Neuroscience, vol. 18, pp.10464-10472, 1998.
  8.  H. Markram, Y. Wang, and M. Tsodyks, Differential signaling via the same axon of neocortical pyramidal neurons, Proceedings of National Academy of Sciences USA, vol. 95, pp. 5323-5328, 1998.
  9. U. Frey and R. G. M. Morris, Synaptic tagging and long term potentiation, Nature vol. 385, pp. 533-536, 1997.
  10.  A. Stepanyants, P. R. Hof, D. B. Chklovskii, Geometry and structural plasticity of synaptic connectivity, Neuron, vol. 34, pp. 275-288, 2002.
  11. D. H. Hubel and T. N. Wiesel, Receptive fields, binocular interaction and functional architecture in the cat's visual cortex, Journal of Physiology, vol. 160, pp. 106–154, 1962.
  12.  D. S. Bassett, E. Bullmore, Small-world brain networks, Neuroscientist, vol. 12, pp. 512–523, 2006.
  13. B. Sengupta, S. B. Laughlin, and J. E. Niven, Balanced excitatory and inhibitory synaptic currents promote efficient coding and metabolic efficiency, PLoS Computational Biology, vol. 9, e1003263, doi:10.1371/journal.pcbi.1003263, 2013.
  14.  W. B. Levy and R. A. Baxter, Energy efficient neural codes, Neural Computation, vol. 8, pp. 531–543, 1996. 
  15.  B. A. Olshausen, D. J. Field, Sparse coding of sensory inputs. Current Opinion in Neurobiology, vol. 14, pp. 481–487, 2004.
  16. M.L. Schneider, C.A. Donnelly, S.E. Russek, B. Baek, M.R. Pufall, P.F. Hopkins, P.D. Dresselhaus, S. P. Benz and W.H. Rippard, Ultra-low power artificial synapses using nano-textured magnetic Josephson junctions, Science Advances, January 2018.
  17.  N. Srinivasa, and G. Raghavan, Micropower Intelligence for Edge Devices, Industrial IoT and Machine Learning, September, 2018, http://embedded-computing.com/e-letter/09-10-2018/.

 

About the author

Narayan Srinivasa, Ph.D., CTO of Eta Compute, is an expert in machine learning, neuromorphic computing, and their applications to solve real-world problems. Prior to joining Eta Compute, Dr. Srinivasa was the Chief Scientist and Senior Principal Engineer at Intel, leading the neuromorphic computing group. Before Intel, he was the Director of the Center for Neural and Emergent Systems at HRL Labs, where he led many programs in machine intelligence including DARPA SyNAPSE, UPSIDE, and Physical Intelligence. He holds 54 US patents and has published over 94 articles. Dr. Srinivasa earned his bachelor's degree in technology from the Indian Institute of Technology, Varanasi, and his doctorate from the University of Florida in Gainesville, and was a Beckman Postdoctoral Fellow at the University of Illinois at Urbana-Champaign. Email Narayan at [email protected].