Strategies for power-efficient machine learning at the edge

Voice activation has come a long way, now powering voice-controlled TV remotes and smart earbuds, but keeping power usage in check for always-on devices requires a new approach, such as waking a system only when a specific event has been detected. (Getty Images)


The push for greater intelligence has paved the way for machine learning at the edge in battery-powered, always-on devices. While enabling greater privacy and performance, traditionally architected edge devices waste a significant amount of power by operating within the standard digital processing paradigm where all sensor data are digitized at the start of the signal chain, regardless of their relevance to the task at hand.

Fortunately, we’re seeing new options for extending battery life in these always-on devices, from enabling a lower-power always-on sensing mode to minimizing the length of time that the system spends in full-power mode. While their specific implementations differ, they have one commonality: Designers are turning to analog processing to save power.

Ranging from analog-inclusive to analog-centric, these new approaches include:

  • Using lower-power digital signal processors (DSPs) and microcontrollers (MCUs)
  • Detecting ambient activity in the analog domain and waking the system when the analog sensor data exceed a certain energy threshold  
  • Using analog machine learning to detect specific discrete events from raw analog sensor data, waking the system only when a specific event has been detected

Given all these choices, how can you decide what the best solution is for your application—whether that’s a voice-first device, like smart earbuds or a voice-controlled TV remote, an acoustic event-detection product that detects glass breaks or alarm tones, or a vibration anomaly-detection module?

Option I: Lower-power digital components

The success of edge computing relies heavily on the rapid proliferation of DSPs and MCUs—some of which include an embedded neural network, i.e., a tiny machine learning (tinyML) chip. These mostly digital processors can handle the complex analysis of data, such as deciding whether a wake word has been spoken, right on the device. Because analog is more efficient than digital for some of the most important functions of a neural network, they may also integrate a limited amount of analog processing to reduce the chip’s overall power. In particular, some processors leverage analog circuitry to perform just the multiply-accumulate (MAC) functions, a scheme known as analog in-memory computing. However, these are still conventional clocked processors that operate within the traditional digital paradigm and require the digitization of all analog sensor data before analysis. So while using some analog processing within a chip can reduce the always-on power of the system, the entire system is still on all the time, and only an incremental battery-life improvement is possible.
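To see why the MAC is the function worth offloading to analog, note that it is the workhorse of every neural-network layer: each output is a weighted sum of inputs, one multiply-accumulate per weight. A minimal digital-domain sketch of the operation (illustrative only, not any vendor’s implementation):

```python
# Illustrative sketch: the multiply-accumulate (MAC) operation that
# analog in-memory computing offloads from the digital core.
# Each output neuron is a weighted sum of its inputs -- one MAC per weight.

def mac_layer(inputs, weights, biases):
    """Dense layer computed as explicit multiply-accumulates.

    inputs:  list of N floats
    weights: list of M rows, each a list of N floats
    biases:  list of M floats
    """
    outputs = []
    for row, bias in zip(weights, biases):
        acc = bias
        for x, w in zip(inputs, row):
            acc += x * w          # one multiply-accumulate
        outputs.append(acc)
    return outputs

# A 2-neuron layer over 3 inputs performs 6 MACs:
print(mac_layer([1.0, 2.0, 3.0],
                [[0.5, 0.5, 0.5], [1.0, 0.0, -1.0]],
                [0.0, 0.5]))      # -> [3.0, -1.5]
```

Even a small always-on network runs thousands of these per inference, which is why performing them in analog circuitry, rather than in a clocked digital datapath, meaningfully cuts chip power.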

Option II: Analog activity detection and system wake-up

Over the last few years, we’ve also seen a number of chips, either standalone or integrated with MEMS (micro-electromechanical systems) sensors, that use linear analog circuitry to trigger a system wake-up when a threshold ambient energy level has been reached. These chips, which include some MEMS microphones, can’t determine the type of event that’s occurred or whether it’s relevant to the task of the system. By providing a very-low-power always-on mode, this approach reduces power by keeping the digital system off for some of the time. Unfortunately, since these simple analog circuits can’t distinguish the type of event, they trigger too many false wake-ups, and the battery-life savings are again incremental.
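Conceptually, this class of wake-up circuit implements nothing more than an energy comparison against a fixed threshold. A hypothetical digital model of the behavior (frame size and threshold are illustrative), which also makes its blind spot obvious:

```python
# Hypothetical model of a simple analog activity detector: wake the
# system whenever short-term signal energy crosses a fixed threshold.
# Note the limitation described above -- any loud sound (a door slam,
# traffic) trips the detector, not just task-relevant events.

def activity_detector(samples, frame_size=64, threshold=0.25):
    """Yield one wake decision (True/False) per frame of samples."""
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energy = sum(s * s for s in frame) / frame_size  # mean-square energy
        yield energy > threshold

# A quiet room stays asleep; a loud transient of any kind wakes the system.
quiet = [0.01] * 64
slam  = [0.9] * 64          # not speech, but it wakes the system anyway
print(list(activity_detector(quiet + slam)))  # -> [False, True]
```

The detector has no notion of *what* exceeded the threshold, which is exactly why false wake-ups dominate in noisy environments.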

Option III: Analog event detection and system wake-up with analogML

System designers can now choose a third alternative: an analog machine learning (analogML) core that minimizes both the always-on power of the system AND the amount of time the system remains on. An analogML core operates fully within the analog domain, with no clock required, and uses raw analog sensor data for inferencing and classification. It enables a cascaded system architecture that determines the importance of the data before spending any power on data conversion. And unlike other analog wake-up solutions, an analogML core can collect and compress data prior to event detection (pre-roll) and deliver this data to the processor when it’s needed. This is a particularly important function for maintaining the accuracy of wake word engines that detect a keyword.
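Pre-roll is essentially a rolling buffer of the most recent sensor data, handed off at wake-up so the wake word engine also sees the audio from just before detection. A minimal sketch using a ring buffer (class name and buffer size are illustrative, not part of any product API):

```python
from collections import deque

# Minimal sketch of pre-roll capture: keep a rolling window of the most
# recent samples so that, on wake-up, the processor also receives the
# audio from just *before* the event was detected.

class PreRollBuffer:
    def __init__(self, max_samples=8000):   # e.g. ~500 ms at 16 kHz
        self.buf = deque(maxlen=max_samples)

    def push(self, sample):
        self.buf.append(sample)             # oldest samples fall off the back

    def flush(self):
        """On wake-up, deliver the buffered pre-roll and start fresh."""
        data = list(self.buf)
        self.buf.clear()
        return data

pre_roll = PreRollBuffer(max_samples=4)
for s in [1, 2, 3, 4, 5, 6]:
    pre_roll.push(s)
print(pre_roll.flush())  # -> [3, 4, 5, 6]
```

Without this buffered head start, the keyword engine would receive audio that begins mid-word, which is why pre-roll matters for wake word accuracy.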

How does this work in practice? In a voice-first system leveraging analogML, the always-on power of the system is ~35 µA for voice detection and pre-roll collection. Once voice is detected, which is just 10-20% of the time, the analogML core triggers the wake word engine to wake up the system and deliver the stored pre-roll, so the system can analyze it for a keyword. This intelligent architecture extends battery life by up to 10 times over traditional systems. In other applications, such as glass-break detection, where the glass-break event might happen once every ten years (or never), the analogML core keeps the high-power digital system off more than 99% of the time, extending battery life by years. See Figure 1:
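The battery-life gain follows from simple duty-cycle arithmetic. A rough model with illustrative numbers (the active-current figure is an assumption for the example; the ~35 µA always-on figure and the 10% voice duty cycle come from the text above):

```python
# Rough duty-cycle model of average current draw in an always-listening
# device. active_ua is an illustrative assumption, not a measured figure.

def avg_current_ua(always_on_ua, active_ua, active_fraction):
    """Average current = always-on draw plus active draw weighted by duty cycle."""
    return always_on_ua + active_ua * active_fraction

# Traditional system: the full digital chain digitizes everything, always on.
traditional = avg_current_ua(always_on_ua=0, active_ua=1000, active_fraction=1.0)

# analogML-gated system: ~35 uA always on, with the full system awake only
# while voice is present (10-20% of the time; 10% used here).
gated = avg_current_ua(always_on_ua=35, active_ua=1000, active_fraction=0.10)

print(traditional / gated)  # roughly a 7x battery-life extension in this model
```

Dropping the duty cycle further, as in the once-a-decade glass-break case, pushes the average draw toward the always-on floor alone, which is where the years-long battery-life claims come from.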

 


Fig. 1: Among options for reducing power in voice-enabled, battery-operated devices, only the analogML core keeps the MCU/DSP asleep unless voice, the only sound that can contain a keyword, is detected.

No matter how much analog processing is included in a chip to reduce its power consumption, unless that chip operates in the analog domain, on analog data, it’s not doing the one thing we know saves the most power in a system—digitally processing less data. 

Join me on September 22 when I’ll present Strategies for Power-efficient Machine Learning at the Edge at Sensors Converge (September 21-23, 2021, McEnery Convention Center, San Jose, Calif.) or attend this session virtually from your home office. Register today.

Tom Doyle is CEO and founder of Aspinity and formerly was at Cadence Design Systems and Paragon IC Solutions. He holds a B.S. in electrical engineering from West Virginia University and an MBA from Cal State Long Beach.