OPC: A Question of Relevance

For ten years, OPC's suite of standards has provided the industrial automation world with open connectivity, but the technology on which its standards are based is no longer on the cutting edge of data sharing. The foundation that rescued manufacturers, systems integrators, and software providers from the chaos of proprietary communications interfaces now has to compete with fast movers such as service-oriented architectures and Web services. The question is: Can the standards evolve, embrace new communications mechanisms, and remain relevant?

In the Beginning

OPC's specifications were a response to the need to enable and streamline communications between automation hardware (e.g., PLCs and DCSs) and SCADA software applications. In 1996, when the first OPC standard was released, the predominance of proprietary interfaces (drivers) impeded communications between hardware and software. Manufacturers who wanted to connect products from different vendors were forced to purchase custom interfaces.


"Fifteen years ago, we would spend an inordinate amount of time with these drivers," says Rick Pierro, president of Superior Controls Inc., a systems integration firm. "I'd have to buy a variety of drivers to match my variety of PLCs. And for these devices to communicate with a SCADA package, you would have to check to see if the package had the drivers available or see if someone sold them. Inevitably there would be all types of problems and timing issues. The most common and frustrating problem we would encounter was that the driver did not work."

Solutions

The first OPC standard, which is now called the Data Access Specification, was based on Microsoft's Component Object Model (COM) and Distributed Component Object Model (DCOM) technologies. COM/DCOM provided a framework for software development. To facilitate interoperability, the OPC specification defined a standard set of objects, interfaces, and methods for process control and manufacturing automation applications.

The OPC architecture is based on the client/server model. The OPC server is a data source that receives requests for data from OPC clients, obtains the data from the proprietary system (hardware or software), and sends the data back to the client.
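The request-and-translate pattern described above can be sketched in miniature. This is an illustrative example only, not the actual OPC COM interfaces; the class names, item IDs, and method signatures are invented for the sketch:

```python
# A minimal sketch of the OPC client/server pattern: the client speaks only
# a standard interface, and the server translates requests into the
# proprietary protocol of the underlying device.

class ProprietaryPLC:
    """Stands in for a vendor-specific data source (e.g., a PLC)."""
    def __init__(self):
        self._registers = {"tank1.level": 72.5, "pump1.status": "RUNNING"}

    def read_register(self, address):
        # In a real system, this would use the vendor's proprietary driver.
        return self._registers[address]


class OPCServer:
    """Wraps the proprietary device behind a standard read interface."""
    def __init__(self, device):
        self._device = device

    def read(self, item_id):
        # Translate the standard request into a proprietary one.
        return self._device.read_register(item_id)


class OPCClient:
    """A SCADA-style client that knows only the standard interface."""
    def __init__(self, server):
        self._server = server

    def get_value(self, item_id):
        return self._server.read(item_id)


plc = ProprietaryPLC()
server = OPCServer(plc)
client = OPCClient(server)
print(client.get_value("tank1.level"))   # 72.5
print(client.get_value("pump1.status"))  # RUNNING
```

Because the client depends only on the server's standard interface, swapping in hardware from a different vendor means replacing `ProprietaryPLC` behind the server, with no change to the client. That decoupling is what freed integrators from buying a custom driver for every hardware/software pairing.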

Adoption of the specification eliminated the need for hardware and software suppliers to develop or purchase custom drivers to achieve connectivity. Manufacturers and systems integrators gained real flexibility because they could choose products based on their features rather than the availability of an appropriate driver. Because the OPC Data Access Specification codified the connection mechanism, users were also assured of better connectivity.

"With OPC, a lot of the driver issues have gone away," says Pierro. "With OPC, I know I can make it work as long as the two parts that I am trying to connect have the OPC option."

The OPC Foundation (www.opcfoundation.org) has continued to respond to new automation needs by releasing a number of other standards specifying the communications mechanisms for industrial process data; alarms and events; historical data; and batch process data between sensors, controllers, software systems, and notification devices.

Future Relevance

Today, OPC's suite of automation standards is the preferred method for implementing plant connectivity. An ARC Advisory Group (Dedham, MA) survey in 2004 found that manufacturing companies preferred OPC over other methods for connecting systems in plant environments.

However, several factors raise questions about OPC's relevance and call for a technological renovation. For example, Microsoft's COM and DCOM, the foundation of the standards, are now considered legacy technologies. In addition, the earlier OPC specifications did not provide a coherent data model, as shown by inconsistencies between the hierarchies of the Data Access and Alarms and Events specifications. Together, these factors signal the end of the OPC standards as they exist today.

"DCOM is being downplayed now as the technology to connect or communicate between computers," says Roy Kok, director of product marketing for iFIX and drivers at GE Fanuc. "If OPC doesn't change, then it will be going down, just as DCOM is going down."

To ensure its place on the factory floor, OPC is preparing to release its next-generation standard, the Unified Architecture (UA). The new specification provides a single, coherent data model, uses Web services as its primary transport mechanism, and leverages XML communication, all of which open the way for migration to a service-oriented architecture. The Foundation released the first draft of the specification in June.

Tom Kevan is a freelance writer/editor specializing in information technology and communications.
