Artificial intelligence and machine learning are no longer confined to high-performance computing; they have emerged in embedded systems as well.
A big question for engineers is whether they have weighed the power constraints for their embedded designs, particularly at the edge. That includes the availability of power, as well as its cost.
“All edge compute is power hungry and the closer to the edge and farther from the power plant, the higher the cost of energy delivery,” said Brian Zahnstecher, a principal and independent consultant at PowerRox, LLC.
He dubs that power delivery cost the Power Cost Factor. “Regardless of how much more energy-efficient your hardware becomes on an MIPS per watt basis, the total energy density will only go in one direction…waaaay up.”
Noting that data transfer consumes more energy than anything else in the digital world, he said engineers should be highly motivated to minimize data transmission. “As patronizing as it sounds, there is a universal truth to this work that gets lost in the hype over how much we can do,” Zahnstecher added. “Just because you can do something doesn’t necessarily mean you should.”
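That tradeoff can be sketched with a back-of-the-envelope calculation: compare the energy to radio raw sensor data off the device against running inference locally and transmitting only the result. All of the energy figures, data sizes, and operation counts below are illustrative assumptions, not measured values from the whitepaper.

```python
# Back-of-the-envelope comparison: transmit raw sensor data vs. infer at
# the edge and transmit only the result. Every constant here is an
# illustrative assumption, not a measured figure.

RADIO_NJ_PER_BIT = 100.0    # assumed radio energy cost, nanojoules/bit
COMPUTE_NJ_PER_OP = 0.01    # assumed MCU/accelerator cost, nanojoules/op

def transmit_energy_uj(n_bits: float) -> float:
    """Energy in microjoules to transmit n_bits over the radio."""
    return n_bits * RADIO_NJ_PER_BIT / 1000.0

def compute_energy_uj(n_ops: float) -> float:
    """Energy in microjoules to run n_ops of local computation."""
    return n_ops * COMPUTE_NJ_PER_OP / 1000.0

# Scenario: one second of 16 kHz, 16-bit audio vs. a small on-device
# classifier (assumed 1M ops) that emits a 32-bit result.
raw_bits = 16_000 * 16        # 256,000 bits of raw samples
result_bits = 32              # a single class label / score
inference_ops = 1_000_000     # assumed op count for a tiny ML model

send_raw = transmit_energy_uj(raw_bits)
infer_then_send = (compute_energy_uj(inference_ops)
                   + transmit_energy_uj(result_bits))

print(f"send raw samples:   {send_raw:,.1f} uJ")
print(f"infer, send result: {infer_then_send:,.1f} uJ")
```

Under these assumed numbers, shipping the raw samples costs roughly three orders of magnitude more energy than computing locally and sending only the answer, which is the heart of the “mitigate data transmission” argument.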
Energy efficiency is usually judged by its direct impact on the hardware and software of an embedded AI/ML application, but engineers should also consider the network-level impact through the same lens, he added.
Zahnstecher co-chaired the Energy Efficiency Working Group, a group of 14 contributors that wrote a recent IEEE whitepaper on energy-efficient design.
“The ultimate success of any new technology development is intrinsically tied to its energy requirements. Be it a battery-operated device or a data center, energy is the currency that determines its business viability,” the group said in a vision statement.
“Deceivingly, we often disregard small amounts of energy that are consumed at the edge of the network without realizing the farther a device or system is from the power plant and the closer it is to the edge, the higher the multiplication factor of its energy requirement,” the paper adds.
The paper includes a table with an alphabet soup of industry groups with an energy efficiency focus, from GreenTouch to the PSMA (Power Sources Manufacturers Association).
One focus of the paper is the impact of 5G wireless on power. The group introduced a hypothetical representation called the 5G Energy Gap to describe the disparity between available energy sources and the demand loads of the “micro-power” devices that make up most of the “things” in the highly scalable edge space of the network. A second hypothetical representation, the 5G Equality Gap, depicts the socioeconomic disparity between those who will be able to adapt their infrastructure and end-user cases and those left with unanticipated underperformance due to energy-limited or economically limited factors.
The paper raises several perplexing examples, including how self-driving cars that rely on AI will struggle to operate consistently unless network infrastructure is broadly available. “Enabling billions or trillions of low-power edge devices means lowering their transmission power, so that they can be operated by battery or energy harvesting,” the paper says. “New standards allow such lower power protocols, but that also implies a densification of the cells in order to reliably communicate.”
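The link between lower transmit power and cell densification follows from basic link-budget math: less transmit power means a smaller allowable path loss, hence shorter range, hence more cells to cover the same area. The sketch below uses the standard Friis free-space path-loss formula; the carrier frequency and receiver sensitivity are assumptions chosen for illustration, not values from the whitepaper.

```python
# Illustrative free-space link budget: lowering a device's transmit power
# shrinks its reliable range, which forces denser cells. The frequency and
# sensitivity values are assumptions for illustration only.
import math

def max_range_km(tx_power_dbm: float, rx_sensitivity_dbm: float,
                 freq_mhz: float) -> float:
    """Max free-space range from the Friis path-loss formula:
    FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    Solved for d_km given the allowable path loss (link budget)."""
    budget_db = tx_power_dbm - rx_sensitivity_dbm  # allowable path loss
    return 10 ** ((budget_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

FREQ_MHZ = 3500.0         # assumed mid-band 5G carrier
SENSITIVITY_DBM = -100.0  # assumed receiver sensitivity

for tx_dbm in (20, 10, 0):  # 100 mW, 10 mW, 1 mW
    d = max_range_km(tx_dbm, SENSITIVITY_DBM, FREQ_MHZ)
    print(f"{tx_dbm:>3} dBm TX -> ~{d:.2f} km free-space range")
```

Each 10 dB cut in transmit power shrinks free-space range by a factor of about 3.2 (the square root of 10), so covering the same area requires roughly ten times as many cells, which is the densification the paper describes.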
The paper’s other contributors were officials from various communications companies, including Nokia Bell Labs, Verizon, Huawei, GlobalFoundries and HPE.
Brian Zahnstecher will be a panelist on Monday, September 28, at 11:45 a.m. ET, as part of Embedded Innovation Week sponsored by Fierce Electronics and Embedded Technologies. The online event is free and registration is available.