What will it take to bring wireless sensors to real customers for real applications? As we all know, a good idea doesn't always guarantee technical or economic success. The first article in this series detailed the technological reasons why the trend toward wireless microsensors seems inexorable. The technology to develop and deploy wireless sensors is becoming available, but successful companies will need more than a great technology. Success will also depend on an understanding of the economics involved.
Those of us brought up in the technical fields sometimes find it difficult to understand why good technologies don't achieve instant success in the marketplace. But all too often, the best technologies are supplanted in the marketplace by other forces. Every industry has its own Beta vs. VHS story.
Trends indicate that the cost of wired solutions continues to grow while the price of wireless devices declines. The only question is when the cost curves will cross for your organization and for the organizations with which you do business (see Figure 1).
Figure 1. The cost of wiring has steadily increased, and the cost of wireless solutions continues to decline. The scales of the graph are blank because the actual cost of wireless and wired solutions will vary by application and individual product.
The trick is to target wireless solutions where the economics are most attractive. Over time, the economics may shift to where wireless solutions are commonplace and new wired installations are the exception.
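To make the crossover in Figure 1 concrete, here is a minimal sketch that compares a wired installation whose per-point cost creeps upward with a wireless alternative whose price declines. The starting costs, growth rate, and decline rate are hypothetical placeholders, not figures from this article.

```python
# Hypothetical illustration of the wired/wireless cost crossover in Figure 1.
# All starting costs and annual rates are placeholders; substitute your own.

def crossover_year(wired_cost, wired_growth, wireless_cost, wireless_decline, horizon=20):
    """Return the first year the per-point wireless cost drops below wired."""
    for year in range(horizon + 1):
        if wireless_cost <= wired_cost:
            return year
        wired_cost *= (1.0 + wired_growth)        # wiring labor/materials creep up
        wireless_cost *= (1.0 - wireless_decline) # wireless hardware prices fall
    return None  # curves do not cross within the horizon

# Example: a wired point starts at $400 and grows 5%/yr; a wireless point starts
# at $900 and falls 15%/yr. The curves cross in roughly 4 years under these assumptions.
print(crossover_year(400.0, 0.05, 900.0, 0.15))
```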
Like most other technology penetrations, the early successes of wireless sensors will depend on vendors selecting and delivering high-value solutions to the right groups. Selecting and serving these early targets will require an integrated plan, starting with a sound technological path and ending with a thorough understanding of the economics of your own company and that of the companies you serve. Without such a plan, the life of sound technology can be cut short before true success is demonstrated. Once service of the early targets has proven the technology's worth, new applications will open up, and the economics will move increasingly toward the new technology. The Holy Grail of technology insertion is sometimes described as Moore's Law, where performance continues to improve as costs drop and profitability rises.
This article addresses economic and related issues that might affect the implementation of wireless sensors. Many of the ancillary costs associated with deploying sensors in a production environment can swamp the costs of the actual hardware and software. In some applications, the costs associated with deploying wired sensors make wireless sensors especially attractive; other applications may not show a significant cost advantage for wireless systems. The ultimate economic success of wireless sensors, however, won't occur until wireless solutions are convincingly more effective and undeniably less costly than wired sensors. The question is whether sensors will track the improved-performance, lower-cost curve characterized by Moore's Law.
Integrating Wireless Sensors into a Manufacturing Infrastructure
How about a market where more than $1 trillion is spent each year to replace perfectly good equipment because no reliable and cost-effective method is available to predict the equipment's remaining life? This seems like a perfect opportunity for wireless sensors. How does a supplier develop a business strategy to get into this market? Such a revolutionary change occurs only when plant safety and economic viability are not jeopardized by the new technology.
In today's manufacturing environment, systems and equipment must perform at levels thought impossible a decade ago. Manufacturers must push process operations; product quality; and equipment reliability, availability, and maintainability to unprecedented levels while staying within budgetary constraints. They must also reduce operational and support costs and eliminate or minimize new capital investments in plant equipment, because lengthy returns on investment hurt short-term capital recovery. In short, manufacturers are trying to invoke new measures to ensure plant performance while minimizing costs and extending the operational life of new and aging equipment.
Many of the companies in this condition-based maintenance market are aware of the competitive advantage of being able to supply the necessary sensors without the need for running wires. Some of these companies are in the process of developing wireless sensors for their product lines. Success, however, will depend on not just the economics associated with manufacturing but also the economics of many other aspects of their companies.
Asking a customer to embrace a new technology, such as wireless sensors, may be difficult. The key is to understand the economics involved in technology insertion for the customer's business. Emerging modeling techniques will allow these concepts to be presented in a way that makes decision-making easier. The techniques described here can be used to better understand the economics of a supplier or a customer. The rigor of the techniques described and the opportunity for individual experimentation provided by the output spreadsheet can also be used to help convince skeptics of the value of the predictions.
The decision to integrate a new wireless sensor can be modeled as an investment strategy for both the supplier and the end user. You'll have to use an economic approach that mitigates the research and development risks. This strategy must:
- guard against blind technology alleys (dead-end development paths)
- provide a holistic approach to solving the problem
- provide a robust, crosscutting solution
- establish early decision points
- establish short-term, near-term, and future requirements
- match sensor/system requirements with customer expectations
Figure 2. Shown above is an investment strategy for the insertion of new technology into a market.
To achieve this, you have to develop economic indices (operational and cost) that quantify and qualify the ability of a proposed technology to meet the functional and operational needs of a process. Your investment strategy therefore has to provide control points in the development cycle (see Figure 2), where you can assess the impact of the technology. Integral to this process is the economic model, which provides a break-even analysis and a sensor and system performance assessment based on the economic concentration of losses and the ability of the sensor to meet the system's needs. The model becomes the tool by which a company can justify continued research and development expenditures for new technologies.
A Justification and Strategy Tool
The economic model is a tool for determining the economically justifiable cost of research and development, weighed against the projected costs of capital investments based on a fixed rate of return. The model provides a mechanism by which valid comparisons can be made among proposed technologies to determine which will go forward and which will be suspended. It also provides a way of considering collaboration between the sensor and its subsystems to reduce economic risk (i.e., calculating the associated costs of deploying a complementary system and its projected return on investment).
The future value of the projected price point is used as a fixed cost constraint in the development cycle. In the model, capital investments include implementation costs, system downtime while deploying the sensor/system, impact on the infrastructure, education, and reduction of personnel due to the elimination of nonvalue added tasks (see Figure 3). The outcome of the model is a simple spreadsheet that can be used to evaluate what-if scenarios specific to the user's or supplier's situation.
Figure 3. When applied correctly, an economic model identifies hidden and support costs.
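As a rough sketch of how the future value of a price point can act as a fixed cost constraint, the snippet below compounds a target price forward to launch and backs out the unit cost that still meets a required margin. The escalation rate, time to market, and margin are assumptions chosen only for illustration.

```python
# Hypothetical sketch: treat the future value of a target price point as a
# fixed cost constraint on development. Rates, years, and margin are assumptions.

def future_value(present_value, rate, years):
    """Compound a present price forward at a fixed rate."""
    return present_value * (1.0 + rate) ** years

def allowable_unit_cost(target_price_today, rate, years_to_market, margin=0.30):
    """Back out the unit cost that still meets the margin at launch time."""
    launch_price = future_value(target_price_today, rate, years_to_market)
    return launch_price * (1.0 - margin)

# Example: a $200/sensor target today, 3 years to market, 4% escalation, and a
# 30% required margin leaves roughly $157 per unit for all development and
# production costs under these assumptions.
print(round(allowable_unit_cost(200.0, 0.04, 3), 2))
```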
The role of the economic model as a research and development investment tool is important when considering the differing views that may arise among collaborating development organizations. Differences occur when individual participants begin developing their own definitions of economic viability and cost; when each is compared with the group, contrasting views of extent and need develop. To resolve these points of contention, you need an economic model that provides a single organization with the ability to run strategic what-if scenarios using the operational bounds developed by the group. The results define a global control and decision point surface that can be used to suspend certain activities or continue others. Thus, decisions are based not solely on functional capability but also on economic content and value to the customer. This allows suppliers and prospective customers to discuss common goals without sharing confidential economic data.
Three Integrated Modules
The economic model consists of a break-even analysis module, a sensor performance analysis module, and a system performance analysis module. Each is designed to analyze one aspect of the benefits realized by implementing a new technology or approach.
The Break-Even Analysis Module. Economic decisions on capital investments are strongly influenced by the payback period, which is a critical selection criterion. The break-even constraint strongly influences the functionality of any new sensor, equipment, or system and obliges suppliers to consider the cost of the sensor/equipment and to estimate the profits/savings the solution generates for the customer.
The module evaluates costs and savings by considering equipment, installation, training, operating, and troubleshooting expenses. Cost savings accrued from initial investments resulting from deploying a technology include those from reduced operations, reduced off-quality, reduced work-in-process, and tax benefits from depreciation and capital investments.
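A minimal payback-period calculation in the spirit of this module might look like the sketch below. The cost and savings categories mirror the ones listed above, but every dollar figure is hypothetical.

```python
# Minimal sketch of the break-even (payback-period) calculation described above.
# Cost and savings categories mirror the module; all numbers are hypothetical.

def payback_period(initial_costs, annual_savings):
    """Years of cumulative savings needed to recover the initial investment."""
    invested = sum(initial_costs.values())
    yearly = sum(annual_savings.values())
    return invested / yearly if yearly > 0 else float("inf")

costs = {          # one-time deployment costs
    "equipment": 50_000,
    "installation": 12_000,
    "training": 8_000,
}
savings = {        # recurring annual savings
    "reduced_operations": 20_000,
    "reduced_off_quality": 10_000,
    "reduced_work_in_process": 5_000,
    "depreciation_tax_benefit": 4_000,
}
print(f"Payback: {payback_period(costs, savings):.1f} years")  # ~1.8 years
```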
The Sensor Performance Analysis Module. This module is a probabilistic model that assesses the performance of a proposed sensor in terms of its ability to meet the functional requirements of the process (i.e., its added value). The model is based on conditional probability and takes into account both false positive and false negative impacts (the total probability of false decisions per sensor). The module computes the total probability of true and false decisions for the system, and it accommodates combined sensors if several sensors/subsystems are integrated into the process. This probability is then factored into the system performance analysis module to determine whether the cost of implementing the system is more than offset by the cost of failing to meet the functional needs. The module accomplishes this by assigning operational and economic metrics to the probability function for each sensor/subsystem. It can be used to quantify (in a probabilistic sense) the added value that a sensor, or group of sensors, brings to a process.
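One plausible reading of the probabilistic core of this module is sketched below: each sensor's false-alarm and missed-detection rates are combined with the prior fault rate to give a total probability of a false decision, and several sensors are combined into a system figure. The independence assumption and all of the probabilities are illustrative simplifications, not the model's actual formulation.

```python
# Hedged sketch of the sensor performance idea: combine a sensor's false-alarm
# and missed-detection rates with the prior fault rate to get the total
# probability of a false decision. All probabilities are illustrative.

def false_decision_probability(p_fault, p_false_alarm, p_missed_detection):
    """Total probability of a wrong call by one sensor (law of total probability)."""
    return (1.0 - p_fault) * p_false_alarm + p_fault * p_missed_detection

def system_false_decision(sensor_probs):
    """Probability that at least one of several independent sensors decides wrongly."""
    p_all_correct = 1.0
    for p in sensor_probs:
        p_all_correct *= (1.0 - p)
    return 1.0 - p_all_correct

s1 = false_decision_probability(p_fault=0.02, p_false_alarm=0.05, p_missed_detection=0.10)
s2 = false_decision_probability(p_fault=0.02, p_false_alarm=0.01, p_missed_detection=0.20)
print(round(s1, 4), round(s2, 4), round(system_false_decision([s1, s2]), 4))
```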
Figure 4. By taking a priori information on product defects and process statistics, the economic model can be encoded into the control law to provide real-time estimates of production costs.
The System Performance Analysis Module. This component provides the model's predictive capabilities in four basic functions: determining the profit/loss of a process based on its operation and downtime history; tailoring the numbers to a per-product, per-customer basis; interacting on-line with the operator to allow for what-if scenarios; and delivering real-time economic data to enable real-time production decisions at the equipment, process, and plant level. The module can use statistical data, user inputs, material status, and production diagnostics. The output consists of predicted production costs and recommended process decisions. The first can go to the user in response to queries or as an alarm/alert signal; the second can be fed to a system to determine real-time control strategies. Figure 4 illustrates this flow.
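A highly simplified version of the predictive function might fold downtime history and the expected cost of false sensor decisions into a per-unit production cost, as in the sketch below. The inputs and rates are invented for illustration and are not taken from the model.

```python
# Illustrative-only sketch of the predictive function: estimate production cost
# per unit from operating history and fold in the expected cost of false
# sensor decisions. Field names and rates are assumptions, not the model's.

def predicted_unit_cost(units_per_hour, cost_per_hour, downtime_fraction,
                        p_false_decision, cost_per_false_decision, decisions_per_hour):
    """Expected cost per good unit, including downtime and wrong-call penalties."""
    effective_rate = units_per_hour * (1.0 - downtime_fraction)
    false_decision_cost = p_false_decision * cost_per_false_decision * decisions_per_hour
    return (cost_per_hour + false_decision_cost) / effective_rate

# Example: 120 units/hr nominal, $900/hr to run, 8% downtime, and a 6% chance of
# a wrong call that costs $50, at 10 decisions/hr -> roughly $8.42 per unit.
print(round(predicted_unit_cost(120, 900.0, 0.08, 0.06, 50.0, 10), 2))
```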
What Happens Next?
The market for wireless sensors does not depend on Moore's Law for success. Conventional market forces are sufficient: a company moving into the market can succeed without committing to the higher-performance, lower-cost path that Moore's Law describes.
A number of companies are moving into this market, some with more commitment and enthusiasm than others. Most of the big players (e.g., Honeywell and Allen-Bradley) have wireless programs and products, with the focus on communications networks external to the sensor. Some are packaging the sensor and telemetry systems together, but more work is necessary to bring the costs down to where the market will accept the new devices.
Companies are servicing the wireless sensor market in three different ways:
- buying radio devices from third-party suppliers and attaching the devices to their existing products
- repackaging existing products to include third-party RF components
- committing their organizations to developing truly integrated wireless sensors
Each of these options has both long-term and short-term advantages. The key is to make sure that there is a plan that defines how the current strategy maps to a long-term solution for the supplier and its customers. Most successes will undoubtedly come from suppliers developing new products to solve problems in their existing customer base. Newcomers will be able to penetrate markets with new products if the customer-perceived value is high enough.
Markets for wireless sensors currently depend on applications in which wiring is impossible or too expensive or where operating and support costs are prohibitively high. These include environments where sealed compartments are required (e.g., vacuum processing chambers or nuclear processing facilities). Others include applications in which obstacles make wired connections impossible or where the sheer number of sensors makes it impossible to access information on a timely basis.
For example, oil exploration crews must string geophones over large areas. These devices transmit data to large computers for integration and analysis. Technologies like those described in last month's article could be brought to bear in such an application, but the devices are not yet on the market.
The nuclear power industry (through the Electric Power Research Institute) currently estimates the cost of signal wiring in nuclear power plants at ~$2000/ft. These costs reflect estimates for cable trays, penetrations, quality assurance, configuration management, and many other costs normally hidden in other applications. Some power companies are experimenting with wireless connectivity in noncritical applications.
Many of the operating nuclear power plants are reaching the end of their design lives. Because 20% of the power used in the U.S. is generated by nuclear facilities, the Department of Energy and the power suppliers are quite concerned. Investigations show that two major components tend to show signs of age first: the pressure vessel and the signal wiring. The pressure vessel becomes brittle after years of exposure to the radioactive environment. The signal wiring experiences insulation faults and degradation in the connectors. The power suppliers estimate that they can replace the entire pressure vessel for less than what it would cost to address the wiring problems--a market opportunity if there ever was one. Who will step up and supply a solution? No one has yet.
The cost of wiring in a typical chemical plant is around $40/ft. The market for new sensors in these plants is being stifled by the need to run wires to connect the new sensors to the existing plant infrastructure. The price point for fully integrated wireless sensors, which can make sensor data available via existing plant backbones, appears to be about $200/sensor. No one has yet moved to meet the need for wireless chemical process sensors, such as temperature, pressure, humidity, and vibration devices; the demand remains unmet.
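A quick back-of-the-envelope check shows why that price point matters: at $40/ft, any sensor that would need more than about 5 ft of new wiring already costs more to connect than a $200 wireless unit.

```python
# Back-of-the-envelope break-even between wired and wireless connection costs,
# using the figures cited above ($40/ft wiring, $200/sensor wireless target).
wiring_cost_per_ft = 40.0
wireless_price_target = 200.0

break_even_run_ft = wireless_price_target / wiring_cost_per_ft
print(break_even_run_ft)  # 5.0 ft: any longer wired run favors the wireless sensor
```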
The key to success for a sensor supplier seems to be to identify an application in an existing market where a strategic partnership with the end user will provide an economically beneficial solution to a prescribed problem. Once that partnership is established, the supplier must make sure that technological partnerships are available to bring the required technology to the application. To integrate wireless technologies with their products, companies require training, consultants, and close partnerships.
What does the long-term future hold for wireless sensors? After early successes supply the momentum to move the technology forward, how will the market mature? If certain conditions apply, perhaps wireless sensors will move into the economic situation commonly called Moore's Law.
Moore's Law and the Future of Wireless Sensors
The market forces described in Moore's Law shape the curves in Figure 1. In 1965, Dr. Gordon Moore, then at Fairchild Semiconductor and later a cofounder of Intel, noticed that the performance of ICs was improving every year and that the average cost was decreasing. From this he suggested a rule of thumb that is still used today: every 18 months, the performance of computer technology will increase by a factor of two, and the cost will decrease by a factor of two. Why this applies to computers and other electronic devices and not to wired sensor systems can be understood with the help of fairly simple concepts.
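Restating the rule of thumb numerically, the sketch below projects relative performance and cost over a few years; the six-year horizon is arbitrary, and the formula simply compounds the quoted 18-month doubling.

```python
# Simple restatement of the rule of thumb quoted above: every 18 months,
# performance doubles and cost halves. The time span below is illustrative.

def moores_law_projection(years, perf0=1.0, cost0=1.0, doubling_months=18):
    """Project relative performance and cost after a number of years."""
    periods = (years * 12) / doubling_months
    return perf0 * 2 ** periods, cost0 * 0.5 ** periods

perf, cost = moores_law_projection(6)
print(f"After 6 years: {perf:.0f}x performance at {cost:.3f}x the cost")  # 16x, 0.062x
```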
The key cost driver for electronics relates to the market for the device and the level of integration achievable. The manufacturing process for semiconductor devices leans heavily on mass producing huge quantities of identical components. Devices require the integration of several of these components to provide the functions required by the application. The more functions that can be provided on the electronic device, the greater the chance for cost reduction in the mass production environment of semiconductor manufacturing. Labor-intensive processes (e.g., soldering, packaging, wiring, and assembling) significantly affect the cost of producing the finished product. The first microprocessor is a classic example of how integration and market volume interact to create success stories. The cellular phone industry is currently in a similar cost curve.
An interesting question is why sensors haven't followed Moore's Law when other electronic components have. Figure 5 and Figure 6 show the cost of wireless phones over time and the cost of a typical flow sensor over time. As the graphs show, sales volume alone is not enough to produce the cost reductions called for by Moore's Law. Other issues come into play as well.
Figure 5. In a classic case that validated Moore's Law, the number of wireless phones produced increased while the price decreased.
Figure 6. Unlike the wireless phone, flow sensors have not followed Moore's Law. Production volume has steadily increased, but prices have not declined--rather, they have increased over time.
For example, the level of integration seems to move systems toward Moore's Law. The underlying cause is the creation of a path to cost reductions realizable in high-volume production. Labor is a key cost in any manufacturing environment, and the assembly of the modules that make up the subsystems plays a big part in setting the cost of a fully deployed multicomponent system. To reduce the labor costs, a manufacturer must increase the level of integration of the modules and simplify their incorporation into the system. Auto manufacturers are moving in this direction in a big way. The Plymouth Neon won accolades in the auto manufacturing area for having the fewest (and most integrated) modules in the history of automotive mass production.
Another cost driver, known as the learning curve, also plays a role in systems that map well to Moore's Law. In this model, the yield and quality of the product increase as the manufacturing and assembly processes improve. The increase in yield drives down the cost of manufacturing because less waste and higher production efficiency improve profit margins.
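A standard way to express the learning curve (not taken from this article) is Wright's model, in which unit cost falls by a fixed percentage every time cumulative volume doubles; the sketch below assumes an 85% learning rate purely for illustration.

```python
# Hedged sketch of a standard learning-curve model (not from the article):
# unit cost falls by a fixed percentage each time cumulative volume doubles.
import math

def learning_curve_cost(first_unit_cost, cumulative_units, learning_rate=0.85):
    """Wright's model: cost of the Nth unit with an assumed 85% learning rate."""
    b = math.log(learning_rate) / math.log(2)
    return first_unit_cost * cumulative_units ** b

for n in (1, 10, 100, 1000):
    print(n, round(learning_curve_cost(100.0, n), 2))
# 1 -> 100.0, 10 -> ~58.3, 100 -> ~34.0, 1000 -> ~19.8 (assumed figures)
```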
Chevrolet announced that the 1999 Corvette would cost less than the 1998 model. What makes this even more amazing is that the 1999 Vette is a better car than the 1998 model.
A final attribute of systems following Moore's Law is the growth market potential. Not only must the market have a growth path that supports exponential projections, but it must also renew itself. This is an interesting concept because most of us assume that customers resent buying the same thing over and over. But Moore's Law requires it. The difference is that the customer likes it. How many of us have replaced the batteries in our cellular phones? Usually the cost of a new cell phone is low enough and the new features desirable enough that we opt for a new phone rather than repairing the one we've had for a year or so.
Can wireless sensors follow a model like this? Some people think it's inevitable. The key will be the first moves into the market by companies who actually welcome this new model for marketing. Can suppliers continue to improve performance while reducing the cost of sensors?
A company expecting to move into this model must have a business plan that shows three things:
- how production costs can be reduced over time through improved processes and skills
- how product performance will be improved while customer costs decline
- how the market will continue to grow and be renewed as the cost drops and performance improves
Even though clear trends indicate that sensors may move in this direction, the jury is still out. The Next Generation Internet project funded by the U.S. government is expected to bring sensory awareness to the Internet. Will this provide a new market for sensors that will boost the volume to a level where Moore's Law will apply? Can sensor and sensor system suppliers participate in an environment like this? No one knows.
Conclusion
Will the sensor marketplace ever see the cost patterns called for by Moore's Law? Does it matter? The successful deployment of wireless sensors will depend on whether the suppliers' solutions meet the customer's requirements in a cost-effective way.
This may eventually lead to integrated wireless sensors that can be produced at ever-decreasing costs and ever-increasing performance. In the meantime, companies will make money and solve problems with technologies and techniques that don't rely on such visions. The truly successful companies, though, will provide solutions that can also provide a path to the future for their own company and their customers. An economic model supports the business decision to embark on such a path.
The economic model we have discussed can help you determine the justifiable cost of new sensors and subsystems with respect to value and operation. It will balance the research and development costs against the expense of maintaining operations and provide a method of calculating economic indices of performance that can be used as control points in deciding whether to continue development or to suspend actions. As an aside, the model can also be used as an integral part of an overall control loop using real-time process data from the sensors to make production decisions (e.g., stop production and repair machine, continue and warn of anticipated problems, or queue for repairs).
Every company in the sensor business must decide whether or not to embrace wireless technology. One major supplier said that it would wait until a small company invents the technology and then buy the company--a great strategy if you have more money than talent. Another executive from a sensor company commented that any sensor company not working on wireless sensors will be out of business in five years. Both of these comments came from the floor of the 1998 ISA exposition.
We hope these articles have been helpful. Our organization works for the U.S. Department of Energy, so our interest is in applying wireless sensors to improve energy efficiencies, reduce waste, and improve production efficiencies. Any company wishing to collaborate with us or to use technologies developed at Oak Ridge can contact us at the address below.
Note: Oak Ridge National Laboratory is managed and operated by Lockheed Martin Energy Research Corp. for the U.S. Department of Energy under Contract DE-AC05-96OR22464.
CLARIFICATION: In "It's Time for Sensors to Go Wireless, Part 1: Technological Underpinnings" (Sensors, April 1999), the caption for Figure 1 on page 10 should have indicated that Oak Ridge National Laboratory's nose-on-a-chip is the first prototype of a micromachined arrayed sensor capable of detecting multiple components in a complex environment. Individual cantilevers on the chip are selectively coated to yield both redundancy and chemical specificity through the ensemble response of the array. Although, in principle, hundreds of species could be reported, incorporation of built-in intelligence would reduce the transmission rate and lessen the load on central stations, among other advantages. One licensee of the technology, Graviton, Inc., of San Diego, California, plans to manufacture intelligent sensors that couple to the Web so that information can be accessed.