Batteries vs. Power Harvesting

By Tom Kevan

As wireless sensor networks gain broader acceptance, one of the major questions to be answered is: Where will they get the power to operate? Are batteries up to the job, or will we have to rely on power harvesting techniques? The answer isn't cut-and-dried.

In 2004, petroleum producer BP began evaluating the effectiveness and reliability of wireless networking mote technology. The Loch Rannoch project was a multiphase effort that went on to develop a commercial wireless sensor networking system for use in BP's industrial production facilities around the world. One of the technologies BP tested in the project was energy harvesting, which converted vibration from machinery into electrical energy. Harry Cassar, Technology Director, Digital and Communications Technology, BP, concluded, "We understand that energy harvesting will play a big role in motes."

In a recent email, Sol Jacobs, Vice President and General Manager for Tadiran Batteries, disagreed. "I cannot understand why anyone would want to use an expensive energy harvesting device when there are batteries that can last the entire life of the sensor. Tadiran Batteries manufactures lithium batteries that have been proven to last over 20 years in the field, in the harshest of conditions, without the need for replacement or recharging. We have customers using single AA cells to power remote devices for 21 years with enough capacity left for another 4-7 years of operation. Our lithium cells power millions of wireless devices worldwide and are both less expensive and more reliable than any energy harvester could ever be."
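
Jacobs's claim is easy to sanity-check with some back-of-the-envelope arithmetic. The short sketch below estimates how long a single AA-size lithium cell could power a low-duty-cycle sensor node; the capacity, current-draw, and duty-cycle figures are illustrative assumptions for this sketch, not numbers from Tadiran or from anyone quoted here.

# Back-of-the-envelope battery-life estimate for a low-duty-cycle wireless sensor node.
# All capacity and current figures below are illustrative assumptions, not vendor data.

HOURS_PER_YEAR = 24 * 365

def battery_life_years(capacity_mah, avg_current_ma, derating=0.85):
    """Estimated service life in years, with a derating factor for
    self-discharge and temperature effects (assumed, not measured)."""
    return (capacity_mah * derating) / avg_current_ma / HOURS_PER_YEAR

# Assumed duty cycle: the node sleeps almost all the time and wakes briefly
# each hour to sample a sensor and transmit a reading.
sleep_ma = 0.005                # 5 uA sleep current (assumed)
active_ma = 20.0                # 20 mA while sampling/transmitting (assumed)
active_seconds_per_hour = 1.0   # one brief wake-up per hour (assumed)

duty = active_seconds_per_hour / 3600.0
avg_ma = active_ma * duty + sleep_ma * (1 - duty)

# A lithium thionyl chloride AA cell is commonly rated around 2,400 mAh.
print(f"Average draw: {avg_ma * 1000:.1f} uA")
print(f"Estimated life: {battery_life_years(2400, avg_ma):.1f} years")

Under those assumptions the average draw works out to roughly 10 µA, and a cell rated around 2,400 mAh pencils out to more than 20 years, which suggests the node's duty cycle, rather than the radio's peak current, is what dominates the battery-versus-harvesting question.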

Two years ago, the Palo Alto Research Center reached the same conclusion, abandoning its energy harvesting research after a study that projected battery evolution and power demand into the future.

Battery technology (e.g., thin film) and power-optimization techniques have advanced to keep pace with the needs of wireless sensor networks. Moore's law bolsters the case for inexpensive batteries: because the electronics they power become obsolete so quickly, a battery only has to outlast the device it supports. Why invest in a power source that will last 10 years when the technology it is supporting will be obsolete and replaced in 7? For many industries, current wisdom holds that any company unwilling to replace its technology every 7 years will soon be out of business.

There are exceptions, though, that muddy the waters and suggest there may be more than one answer to the question. Companies that cannot afford downtime caused by the failure of critical equipment may have to rely on energy harvesting, which never requires a maintenance visit to swap out cells. In addition, some companies either can't afford this relentless capital investment or believe they are not subject to the dynamics of Moore's law. These companies may opt to keep their legacy systems as long as possible and limit their investment to power harvesting.

There are many sides to this debate. I'm sure many readers have opinions on the subject, and I would be interested in hearing them. If you would like to express your opinion, send it to [email protected].