How smart will the edge get?

What goes around comes around: after two decades of moving data and applications to a central cloud to be accessed by “dumb terminals,” the tide has turned. The edge is getting smarter, but how smart can it get?

Intelligence at the edge could be as simple as running analytics on data without having to send it back to a central data center, or as ambitious as using artificial intelligence (AI) to do simple inference. Until recently, best practice was to do AI training and machine learning in the cloud – now, low-power hardware is making it feasible for some of that machine learning to be done on a local edge device.
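The simplest version of that intelligence fits in a few lines of code. Below is a minimal sketch – the sensor readings and the send_to_cloud() stub are invented for illustration – of summarizing data locally and transmitting only the aggregate rather than the raw stream:

```python
# Minimal sketch: summarize sensor readings locally and upload only the
# aggregate instead of streaming every raw sample to a data center.
# The readings and the send_to_cloud() stub are invented for illustration.
import statistics

def summarize(window):
    """Reduce a window of raw readings to a compact summary."""
    return {
        "mean": statistics.fmean(window),
        "stdev": statistics.pstdev(window),
        "max": max(window),
        "n": len(window),
    }

def send_to_cloud(payload):
    print("uploading summary:", payload)  # stand-in for a real uplink

readings = [20.1, 20.3, 20.2, 25.9, 20.4]  # e.g. temperature samples
send_to_cloud(summarize(readings))         # a few bytes, not the whole stream
```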

Being able to do more at the edge has some obvious advantages – you don’t need to consume energy and network capacity to send the data back and forth, nor do you have to worry about securing the data while in transit. And with AI functionality becoming increasingly ubiquitous and affordable for most businesses, it appears inevitable that the edge will keep getting smarter.

Automation drives smart edge adoption

Embedded systems and devices are slowly taking steps to replace people, ITTIA founder Sasan Montaseri said in an interview with Fierce Electronics. “We allow them to capture and digest as much data as possible and process that really rapidly.” This shift, he said, is exemplified by autonomous cars and robotics in factories – embedded systems and devices can handle and store more data to aid with tasks such as preventative maintenance. This is accomplished by looking at patterns in the data, which makes the edge more intelligent.

Montaseri said one factor driving intelligence at the edge is that connectivity is not a hundred percent available, which means delays in getting information back from the cloud. Another is that microprocessors and microcontrollers are becoming more powerful. This enables the necessary data management at the edge, he said, and allows devices to quickly analyze and make sense of data.
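One way to picture that kind of on-device pattern watching is a rolling-window check on a sensor stream. The following is a generic sketch, not ITTIA’s software; the window size and the 3-sigma threshold are illustrative assumptions:

```python
# Generic sketch of on-device pattern detection for preventative maintenance:
# flag a sample that deviates sharply from the recent rolling window.
# The window size and 3-sigma threshold are illustrative assumptions.
from collections import deque
import statistics

window = deque(maxlen=50)  # recent samples, e.g. motor vibration readings

def is_anomalous(sample):
    """Return True if the sample looks abnormal versus recent history."""
    if len(window) >= 10:  # wait for some history before judging
        mean = statistics.fmean(window)
        stdev = statistics.pstdev(window) or 1e-9  # avoid divide-by-zero
        if abs(sample - mean) > 3 * stdev:
            return True    # raise a maintenance event locally, no cloud needed
    window.append(sample)  # only normal samples update the baseline
    return False
```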

ITTIA is focused on providing the software necessary for edge data streaming, analysis and management for embedded systems and IoT devices – robust data management is foundational for doing AI and machine learning in embedded systems at the edge, Montaseri said.

ITTIA provides software for edge data streaming, analysis and management for embedded and IoT for uses such as transportation when it's not feasible to depend on a central cloud. (ITTIA)

Reliability is critical for smart edge devices, he added, whether it’s for industrial robotics, medical or transportation applications. “You want to make sure that they don't go down.”

What’s also becoming apparent is that not all edges are created equal – some will be smarter sooner than others depending on the use case and industry, with robotics and medical devices among the likely front-runners. Montaseri said today’s embedded systems that gain intelligence through IoT deployments will be doing the jobs needed for the next generation of computing. “The nature of everything is changing,” he said. “We are seeing more security, more safety, and more functionality, like the ingestion rate and the query rate. Our focus is safety, security, and reliability.”

Not all edges are created equal

What makes the smart edge murky is the definition of edge, which means different things to different people, Nandan Nayampally, CMO at BrainChip, said in an interview with Fierce Electronics. He previously spent more than 15 years at ARM, when the edge was viewed as primarily sensor-driven. “That's how IoT kind of cropped up,” he said. “IoT is a sensor plus connectivity plus processing.” While a Dell or an Intel might think of the smart edge as another giant box that’s now smaller, for him the right starting point is IoT with AI.

AI on the edge is a step forward from a sensor just doing one function, with devices now having more storage, memory, and processing power. Nayampally said the battle between cloud and edge has been going on for a while, going back to the days of terminals connected to mainframes before the move to a client/server model. “What you realize is however much we think that latency to cloud or connectivity to cloud is guaranteed, and the bandwidth assured, it's never going to be the case,” he said. “You need that intelligence and computational power at the edge.”
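A rough sketch of how a device can hedge against that uncertainty: give the cloud a strict latency budget and fall back to a smaller local model when the link fails. The query_cloud() and run_local_model() functions here are hypothetical stand-ins:

```python
# Sketch: prefer the cloud model when reachable, but never block on it.
# query_cloud() and run_local_model() are hypothetical stand-ins.
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=1)

def query_cloud(data):
    raise TimeoutError("link down")    # simulate lost connectivity

def run_local_model(data):
    return "local-inference-result"    # smaller on-device model

def classify(data, timeout_s=0.2):
    """Give the cloud a strict latency budget; fall back to local compute."""
    future = pool.submit(query_cloud, data)
    try:
        return future.result(timeout=timeout_s)
    except Exception:                  # timeout or connectivity failure
        return run_local_model(data)

print(classify(b"\x00\x01"))           # prints the local fallback result
```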

BrainChip's Akida processor can learn at the edge to address security and privacy while limiting network congestion. (BrainChip)

Having the smarts at the edge is beneficial for preventative maintenance in factories and patient monitoring, Nayampally said, both in terms of latency and privacy. “Anytime you send raw data or sensitive data out, you are obviously going to have challenges.” Privacy and security have become especially important to the general public, he added. BrainChip was started with the idea that edge computing was necessary and that any approach to AI at the edge had to be different from the cloud. “The cloud kind of assumes almost infinite resources and infinite compute.”

While compute resources at the edge are rather finite, more AI is possible thanks to advances in low-power hardware, including memory and systems on chip (SoCs), which means not all training and machine learning need be shipped back to the cloud. Nayampally said it’s a matter of scaling, with neuromorphic computing offering inspiration for how to deliver low-power intelligence at the edge. “Let's try to emulate the efficiency of it and start from there.”

Machine learning will increasingly happen at the edge, both because of growing capability there and out of necessity. Nayampally said some applications that require a real-time response can’t afford the delay of a round trip to the cloud, or the power. “Any time you use radio and connectivity, especially to cloud, that burns a lot of power,” he said. “Radios are the most expensive parts of devices.” Smaller, more cost-effective devices may not be able to afford connectivity and need to do more compute locally.
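Some back-of-envelope arithmetic shows why. The energy figures below are illustrative assumptions, not measurements, though they are in the ballpark often quoted for cellular radios (hundreds of nanojoules per bit) versus efficient inference accelerators (picojoules per multiply-accumulate):

```python
# Back-of-envelope comparison of radio versus local-compute energy cost.
# Both efficiency figures are rough assumptions for illustration only.
RADIO_NJ_PER_BIT = 100       # assumed cellular uplink energy per bit
INFER_PJ_PER_MAC = 1         # assumed accelerator energy per multiply-accumulate

image_bits = 224 * 224 * 3 * 8   # one small RGB camera frame
model_macs = 100e6               # a modest vision model, ~100M MACs

send_uj  = image_bits * RADIO_NJ_PER_BIT / 1e3   # nJ -> microjoules
local_uj = model_macs * INFER_PJ_PER_MAC / 1e6   # pJ -> microjoules

print(f"transmit one frame:  ~{send_uj:,.0f} uJ")   # ~120,000 uJ
print(f"classify it locally: ~{local_uj:,.0f} uJ")  # ~100 uJ
```

Under these assumptions, shipping one small camera frame to the cloud costs roughly a thousand times more energy than classifying it locally.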

Nayampally said the neuromorphic nature of BrainChip’s Akida platform allows it to learn at the edge, which also addresses security and privacy and reduces network congestion – today’s autonomous vehicles can generate a terabyte of data per day, he noted, so it makes sense to be selective about how much data needs to travel the network.

For the smart edge, simple is better, and BrainChip’s processor keeps things simple from a computational standpoint as well as from a development and deployment standpoint, Nayampally said. “It's almost like a self-contained processor.” It is neuromorphic and event driven, so it only processes data when needed and only communicates when needed, he said.
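In generic terms – this is not BrainChip’s Akida API, and the change threshold is an invented parameter – event-driven processing looks something like this: compute only when the input changes enough to matter, and communicate only when the result changes.

```python
# Generic sketch of event-driven processing (not BrainChip's Akida API):
# compute only when the input changes enough to matter, and communicate
# only when the result changes. The threshold is an invented parameter.
import numpy as np

CHANGE_THRESHOLD = 0.05
last_frame = None
last_label = None

def on_new_frame(frame, classify):
    global last_frame, last_label
    # Skip compute entirely if the scene is essentially unchanged.
    if last_frame is not None:
        if np.abs(frame - last_frame).mean() < CHANGE_THRESHOLD:
            return
    last_frame = frame
    label = classify(frame)
    if label != last_label:      # communicate only on a state change
        last_label = label
        print("event:", label)   # stand-in for a real notification
```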

Being event driven is an excellent example of how basic machine learning may express itself in a single device for the user or the environment – what Synaptics is calling the “edge of the edge,” said Elad Baram, director of product marketing for low-power edge AI. The company has a portfolio of low-power AI offerings operating at a milliwatt scale, enabling machine learning with minimal processing and memory – an initiative in line with the philosophy of the tinyML Foundation, he said. While an advanced driver assistance system (ADAS) uses gigabytes of memory, Synaptics is using megabytes.
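The memory gap is easy to see with some illustrative arithmetic; both parameter counts below are assumptions chosen only for the comparison:

```python
# Illustrative memory arithmetic: why tinyML-class models fit in kilobytes
# to megabytes. Both parameter counts are assumptions for the comparison.
def model_bytes(params, bits_per_weight):
    return params * bits_per_weight // 8

adas_scale = model_bytes(params=500_000_000, bits_per_weight=32)  # float32
tiny_scale = model_bytes(params=250_000, bits_per_weight=8)       # int8

print(f"large float32 model: {adas_scale / 1e9:.1f} GB")  # 2.0 GB
print(f"tiny int8 model:     {tiny_scale / 1e3:.0f} KB")  # 250 KB
```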

Baram’s expertise is in computer vision, and Synaptics sees a lot of potential in any kind of sensing where the compute happens right next to where the data is generated. Moving data requires power, increases latency, and creates privacy issues. Organizations like the tinyML Foundation are an indicator of how smart the edge could get. “We are at an inflection point within this year or next year,” he said. “This inflection point is where it's booming.”

Synaptics has a context aware Smart Home SoC with an AI accelerator for 'edge at the edge'. (Synaptics)

Context-aware edge remains application specific

Baram said that just as the GPU boom occurred in the cloud five years ago, the same evolution is happening with tinyML. Workloads at the edge that previously required an Nvidia processor, such as detection and recognition, can now be done on a microcontroller. “There is a natural progression.”

Sound event detection is already relatively mature, Baram said, starting with Alexa and Siri and progressing to detecting glass breaking or a baby crying. “We are seeing a lot of use cases in smart home and home security around the audio space.” In the vision space, he said, Synaptics is supporting “context awareness” for laptops so they can detect whether a user is present, and to ensure privacy, any imaging stays on the camera module – it never reaches the computer’s main processor.
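A sketch of what that privacy boundary looks like in code – capture_frame() and detect_face() are hypothetical stand-ins for the on-module sensor and detector – where raw pixels exist only inside the module-side function and the host only ever sees a single bit:

```python
# Sketch of the privacy boundary: raw pixels exist only inside the
# module-side function; the host receives a single presence bit.
# capture_frame() and detect_face() are hypothetical stand-ins.
def capture_frame():
    return bytes([i % 256 for i in range(160 * 120)])  # stand-in raw frame

def detect_face(frame):
    return sum(frame) > 0          # stand-in for a tiny on-module detector

def presence_event():
    frame = capture_frame()        # pixels never leave this scope
    present = detect_face(frame)
    del frame                      # the image is never stored or exported
    return present                 # the host sees one bit, not the frame

print("user present:", presence_event())
```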

Power, of course, is important for context-awareness applications, Baram said. “You don't want the battery to drain too fast.” But having this awareness actually extends battery life, he said, because the system understands whether the user is engaged with the device and its content and can respond accordingly. “You approach the laptop, it's turned on and you log in and it's like magic. The machine just knows what you want to do, what you are doing, and it can adapt itself.”

Similarly, an HVAC system could adapt based on the number of occupants in a room, or a dishwasher could let you know how full it is. Baram said a fridge could be smart enough to tell you whether you need to pick up milk on the way home. Aside from smart laptops and home appliances, there are many safety applications in construction, manufacturing and agriculture that could benefit from context awareness. “The amount of use cases out there in the world is pretty amazing.”

Baram said the hardware to enable the smart edge, including machine learning, is coming together, while algorithms and networking are also improving significantly. “The neural networks are way more efficient than they were a decade ago.” As compute capabilities advance, devices will become more general-purpose, but for now, processor and algorithm constraints mean smart edge devices will serve targeted applications.

In the long run, making the edge smarter is contingent on successfully pulling all of these elements together, which requires an AI framework, Baram said, such as TensorFlow, an open-source ML platform. “Those frameworks make it much easier to deploy a neural network into edge devices.”
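As a concrete example of that last step, TensorFlow Lite’s standard conversion flow – shown here with a toy two-layer model standing in for a real trained network – turns a network into a compact, quantized file an edge runtime can load:

```python
# Standard TensorFlow Lite flow: convert a trained Keras model into a
# compact, quantized file an edge runtime can execute. The two-layer
# model here is a toy stand-in for a real trained network.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantization
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)  # typically small enough for a microcontroller-class device
```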