Technological Darwinism

By Tom Kevan

What happens when a core technology that has been responsible for critical communications in plant environments and been embraced by users for 10 years is overshadowed by new, dynamic mechanisms entering the marketplace? Can the technology evolve and catch up, or is it destined to fall by the wayside?

The Old Order
For the past 10 years, OPC's suite of standards has dominated the industrial automation world by providing open connectivity. Throughout its existence, OPC responded to automation requirements as the need for new tools became apparent, validating the principle that if something can be done, technology providers will do it. But while the OPC Foundation evolved the focus of its standards, it held tightly to the underlying technology upon which they were based.

Until now, OPC's standards have relied on Microsoft's Component Object Model (COM) and Distributed Component Object Model (DCOM) technologies. Combined, these provided a framework for software development. Using these technologies, the specifications defined a standard set of objects, interfaces, and methods for process control and manufacturing automation applications to facilitate interoperability.

Natural Selection
In 2004, an ARC Advisory Group (Dedham, MA) survey found that manufacturing companies preferred OPC to other methods for connecting systems in plant environments. Two years later, recognition of the standards' weaknesses and the arrival of more agile technologies have called OPC's future into question.

Microsoft's COM and DCOM are now considered legacy (outdated) technologies. And the earlier OPC specifications do not provide a coherent data model. In addition, technologies such as Web services and service-oriented architectures offer greatly enhanced capabilities and significantly expand the limits of connectivity.

So what is to become of OPC? One thing is for certain: The foundation isn't taking the challenge lying down. In an attempt to rejuvenate the standards, the OPC Foundation is about to release (expected in June) its next-generation standard, the Unified Architecture. The new specification provides a single, coherent data model, incorporates Web services as its primary transport mechanism, and leverages XML communication, all of which opens the way for migration to a service-oriented architecture.
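To make the shift concrete, here is a minimal sketch of the XML-based messaging style that Web-services transports use, in contrast to binary COM/DCOM calls. This is purely illustrative: the element names (`Envelope`, `Body`, `ReadRequest`, `NodeId`) and the node identifier are hypothetical and are not taken from the actual OPC Unified Architecture schema.

```python
# Illustrative only: a simplified XML "read" request in the style of a
# Web-services call. Element names and the node identifier are hypothetical,
# not the real OPC UA message schema.
import xml.etree.ElementTree as ET

def build_read_request(node_id: str) -> bytes:
    """Build a minimal XML envelope asking a server for one node's value."""
    envelope = ET.Element("Envelope")
    body = ET.SubElement(envelope, "Body")
    read = ET.SubElement(body, "ReadRequest")
    ET.SubElement(read, "NodeId").text = node_id
    return ET.tostring(envelope, encoding="utf-8")

# A client would POST a request like this over HTTP; any platform that can
# parse XML can interoperate, with no dependence on COM/DCOM plumbing.
request = build_read_request("Boiler1.Temperature")
print(request.decode("utf-8"))
```

Because the payload is plain, self-describing XML rather than a proprietary binary protocol, firewalls, non-Windows systems, and enterprise applications can all participate, which is precisely the appeal of the Web-services approach.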

The Big Question
Will this be too little, too late? The jury is still out. The well-established standards have momentum on their side. Manufacturers, system integrators, and hardware and software vendors have a lot invested in OPC, and everyone is comfortable using products based on the standards. Users will certainly not discard the technology overnight. But the newer technologies are having a big impact on enterprise systems, which means that those who set company policy and control the purse strings will push for changes that they believe will improve the bottom line. As the new technologies prove themselves and build a track record, people will be looking hard at what OPC can deliver.

For a closer look at OPC and where it is headed, watch for the Extreme Data column coming in the July issue of Sensors.