OPCUA
OPC Unified Architecture (OPCUA) is a technology created by the OPC Foundation that provides a more secure, reliable, and vendor-neutral way to transmit raw data and pre-processed information from the manufacturing floor to production-planning or Enterprise Resource Planning (ERP) systems. Using OPCUA, all the necessary information can reach every authorized application and every authorized person, anytime and anywhere.
OPCUA is manufacturer-independent: applications can communicate with it, developers can build on it in different programming languages, and it is supported on different operating systems. OPCUA remedies the shortcomings of existing OPC applications by adding important features such as platform independence, scalability, high availability, and Internet service support.
OPCUA is no longer based on the Distributed Component Object Model (DCOM) but on a Service-Oriented Architecture (SOA), allowing it to connect a far wider range of devices.
Today, OPCUA has become a bridge connecting enterprise-level computers with embedded automation components—independent of Microsoft, UNIX, or other operating systems.
1. Termination of Component Object Model (COM)/Distributed Component Object Model (DCOM)
Traditional OPC applications exchange data based on Microsoft's Component Object Model (COM). Because Windows is widely used worldwide and drove the adoption of Windows computers in automation, COM created the conditions for the broad use of OPC technology. In early 2002, Microsoft released the new .NET Framework and announced that it would stop developing COM. While this does not mean that future Windows versions will not support COM, the underlying technology of traditional OPC is no longer being developed and will inevitably be phased out sooner or later, necessitating the search for a replacement.
2. Limitations of DCOM
In the 1990s, with the widespread adoption of Windows computers, a set of features introduced by Microsoft's COM/DCOM technology, including copy and paste, drag and drop, and linking and embedding, was highly appreciated by both home users and industrial automation users. DCOM also provided a complete communication infrastructure with the necessary security mechanisms, such as authorization, authentication, and encryption, which enabled remote access to data and programs on other computers. However, these security mechanisms posed challenges for installation engineers, system integrators, and developers managing projects that included OPC communication across PCs. Properly setting up DCOM security was very difficult and required considerable expertise, far more than configuring the OPC communication itself. As a result, installation engineers and system integrators routinely took a shortcut, applying lax access authorization to all networked OPC computers, which rendered most protections ineffective and allowed unauthorized remote access. This practice contradicted information technology (IT) security requirements and, in the long run, risked damage from careless or malicious individuals.
3. OPC communication through firewalls
In the automation industry, the need for OPC communication to cross computer boundaries was recognized early on, and this is another area where DCOM restricts traditional OPC. DCOM requires multiple ports to establish a connection for authentication, data transfer, and a range of services, so many ports must be opened in a firewall to let DCOM traffic through. Each open firewall port is a security vulnerability and a potential target for attack. The tunneling approach used in OPCUA is a widely accepted strategy that overcomes this DCOM limitation of traditional OPC products.
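The contrast with DCOM's many ports can be sketched in a few lines. The toy client/server below frames every message with a length prefix and sends everything over one TCP socket, loosely mirroring OPCUA's opc.tcp transport, where handshake, reads, and writes all share a single well-known port (4840 by default) so a firewall needs exactly one rule. The `roundtrip` helper and the framing are simplified illustrations, not the real protocol:

```python
import socket
import struct
import threading

def roundtrip(payload: bytes, host: str = "127.0.0.1") -> bytes:
    """Send one length-prefixed message over a single TCP port and return
    the reply -- a toy stand-in for OPCUA's single-port opc.tcp transport."""
    server = socket.socket()
    server.bind((host, 0))          # port 0: let the OS pick a free port
    port = server.getsockname()[1]  # a real OPCUA server would use 4840
    server.listen(1)

    def serve_once():
        conn, _ = server.accept()
        with conn:
            size = struct.unpack("<I", conn.recv(4))[0]   # length prefix
            reply = b"ACK:" + conn.recv(size)
            conn.sendall(struct.pack("<I", len(reply)) + reply)

    t = threading.Thread(target=serve_once)
    t.start()
    with socket.create_connection((host, port)) as c:
        c.sendall(struct.pack("<I", len(payload)) + payload)
        size = struct.unpack("<I", c.recv(4))[0]
        reply = c.recv(size)
    t.join()
    server.close()
    return reply

print(roundtrip(b"ReadRequest"))  # b'ACK:ReadRequest'
```

Because all traffic flows through that one port, the firewall configuration problem the paragraph describes disappears.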
4. Using OPC on non-Windows platforms
In industrial applications, the near-ubiquitous Microsoft platform, with DCOM as a component of the operating system, is a key factor in the rapid adoption of traditional OPC. However, the integration concept of OPC fails when using other operating systems because they do not support DCOM. This is the case, for example, in the IT industry where Unix or Linux systems are frequently used.
The same applies to automation, where some application areas explicitly reject the use of Windows operating systems. Embedded devices are another area where Windows struggles (apart from Windows CE or Windows XP Embedded). Here, complex applications are embedded directly in field devices, PLCs, operator panels, and similar hardware. These devices run VxWorks, QNX, embedded Linux, or other real-time and embedded operating systems, none of which provide DCOM. Using the OPC integration concept in these areas is doomed to fail because OPC requires DCOM as its technological foundation, and that is precisely what embedded systems lack.
5. Implementing cross-platform OPC communication via Web services
With the release of the OPCXML-DA specification in 2003, the OPC Foundation demonstrated for the first time a platform-independent approach and a method to overcome the limitations of DCOM. Today, many OPCXML-DA products demonstrate web services-based OPC technology. However, the data throughput of XML-DA communication still lags behind DCOM, with communication speeds 5 to 7 times slower. This speed is too slow for many automation requirements. Web services-based OPC communication is still useful because it enables cross-operating system capabilities, but further improvements in data transmission performance are needed.
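The throughput gap is largely an encoding issue: an XML message is self-describing text, while a binary record is a handful of fixed-width bytes. The toy comparison below encodes the same sample both ways; both layouts are hypothetical, chosen only to make the size difference visible:

```python
import struct
import xml.etree.ElementTree as ET

def xml_encode(name: str, value: float, ts: int) -> bytes:
    """Encode a sample the way a verbose SOAP/XML-DA style service would
    (hypothetical element and attribute names)."""
    item = ET.Element("Item", Name=name, Timestamp=str(ts))
    item.text = repr(value)
    return ET.tostring(item)

def binary_encode(value: float, ts: int) -> bytes:
    """Encode the same sample as a fixed binary record, similar in spirit
    to OPCUA's binary encoding (hypothetical layout: double + int64)."""
    return struct.pack("<dq", value, ts)

xml_msg = xml_encode("Temperature", 21.5, 1_700_000_000)
bin_msg = binary_encode(21.5, 1_700_000_000)
print(len(xml_msg), len(bin_msg))  # the XML form is several times larger
```

Shrinking every message by a similar factor is one reason OPCUA's binary protocol closes the performance gap that XML-DA left open.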
6. Unified data model
To date, traditional OPC technology has used three different OPC servers: a data access server, an alarm and event server, and a historical data access server. If a user needs to obtain the current value of a temperature sensor, an event where the temperature exceeds a limit, and a historical average temperature, they must send three requests to access three servers. Accessing process data, events, and historical data using different methods is time-consuming. Therefore, unifying these three object models would greatly simplify this process, benefiting not only OPC product vendors but also system integrators and users.
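A toy model of such a unified address space follows: one node object answers all three questions the paragraph mentions (current value, limit-exceeded event, historical average), where traditional OPC would need three servers. The class layout and names are illustrative, not the actual OPCUA information model:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node carrying current value, events, and history together,
    sketching OPCUA's unified model (illustrative, not the real API)."""
    name: str
    value: float = 0.0
    history: list = field(default_factory=list)
    events: list = field(default_factory=list)
    high_limit: float = 100.0

    def write(self, value: float) -> None:
        self.value = value
        self.history.append(value)        # every sample is also logged
        if value > self.high_limit:       # limit violations raise events
            self.events.append(f"{self.name} exceeded {self.high_limit}")

sensor = Node("Temperature", high_limit=80.0)
for v in (20.0, 85.0, 60.0):
    sensor.write(v)

# One object, three answers: no separate DA, A&E, and HDA servers needed.
print(sensor.value)                               # current value
print(sensor.events)                              # limit-exceeded event
print(sum(sensor.history) / len(sensor.history))  # historical average
```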
7. Supports complex data structures
A primary application of OPC is the operation and monitoring of devices networked via serial communication or a fieldbus. To configure a device, the OPC client must write structured data, which the OPC server transmits to the device along with a description of what each element of the structure means. The OPC Foundation created a methodology for describing such complex data structures, known as the Complex Data Specification. However, with very few exceptions, most traditional OPC products on the market today do not support it.
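The core idea, shipping a type description alongside the raw bytes so that a generic client can still interpret the structure, can be sketched as follows. The field names and wire layout here are hypothetical, not taken from the Complex Data Specification:

```python
import json
import struct

# Hypothetical structured value: a device configuration with three fields.
TYPE_DESCRIPTION = [          # metadata giving each element a meaning,
    ("range_low", "float"),   # which is what complex-data support requires
    ("range_high", "float"),
    ("unit_code", "int"),
]

def encode_config(range_low: float, range_high: float, unit_code: int) -> bytes:
    payload = struct.pack("<ffi", range_low, range_high, unit_code)
    # Ship the element names with the raw bytes so a generic client
    # can interpret the structure without prior knowledge of the device.
    names = json.dumps([n for n, _ in TYPE_DESCRIPTION]).encode()
    return names + b"\x00" + payload

def decode_config(blob: bytes) -> dict:
    names_raw, _, payload = blob.partition(b"\x00")
    values = struct.unpack("<ffi", payload)
    return dict(zip(json.loads(names_raw), values))

blob = encode_config(4.0, 20.0, 1)
print(decode_config(blob))  # field names recovered from the blob itself
```

Without the attached description, the receiver would see only twelve opaque bytes; with it, any client can reconstruct the structure.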
8. Ensuring no data loss during communication
OPC Data Access, the earliest of the specifications, lets client applications periodically obtain the current state of process data. If the physical connection between the OPC client and a remote OPC server fails, data communication is disrupted and data in transit to the OPC client may be altered or even lost. This loss is not critical in some data access applications, such as trend logging, process monitoring, or process display, but it is critical in others. For example, OPC has become fundamental in areas such as the chemical and petrochemical industries, where seamless data logging is required. To achieve this, vendors have had to implement specialized extensions: connection monitoring to detect communication interruptions quickly and reconnect automatically, plus data caching, redundancy, and store-and-forward capabilities in the data access server. These extensions are useful, but they are not defined in the traditional OPC specifications and vary from vendor to vendor.
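The store-and-forward idea described above can be sketched in a few lines: samples are queued while the link is down and flushed in order after reconnect, so none are lost. The class and method names are illustrative, not from any OPC specification:

```python
from collections import deque

class BufferedSender:
    """Sketch of a vendor-style store-and-forward extension: cache samples
    during an outage and forward them in order once the link returns."""
    def __init__(self):
        self.connected = True
        self.buffer = deque()     # samples cached during the outage
        self.delivered = []       # samples the receiver has seen

    def send(self, sample):
        if self.connected:
            self.delivered.append(sample)
        else:
            self.buffer.append(sample)      # store while disconnected

    def reconnect(self):
        self.connected = True
        while self.buffer:                  # forward cached samples in order
            self.delivered.append(self.buffer.popleft())

s = BufferedSender()
s.send(("10:00", 21.0))
s.connected = False                         # simulated link failure
s.send(("10:01", 21.5))
s.send(("10:02", 22.0))
s.reconnect()
print([t for t, _ in s.delivered])  # ['10:00', '10:01', '10:02']
```

The point of OPCUA standardizing this behavior is that every client can rely on it, instead of each vendor inventing an incompatible variant.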
9. Enhanced protection against unauthorized data access
As Ethernet-based communication continues to grow in the automation industry, automation and office networks have become intertwined. At the same time, the idea of vertical integration has created new demands, and this type of integration also introduces new security risks. OPC has likewise increased the use of remote maintenance and remote control concepts, so protection against unauthorized external access must satisfy more stringent information security requirements. With the rise of cybercrime, espionage, and sabotage, information technology security is becoming increasingly important, and with it the security requirements for using OPC. Traditional OPC provides no protective measures of its own, forcing vendors to develop proprietary ones, and therefore cannot meet these security requirements.
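As a generic illustration of the kind of protection involved, the sketch below signs each message with an HMAC so that tampering or an unauthorized sender is detected. OPCUA's actual security model uses X.509 certificates and negotiated security policies; this shows only the principle of message signing, not the protocol:

```python
import hashlib
import hmac

# Hypothetical shared key; in a real system it would be established
# during a secure session handshake, not hard-coded.
SHARED_KEY = b"session-key"

def sign(message: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can verify integrity
    and origin of the message."""
    return message + hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(signed: bytes) -> bytes:
    """Reject any message whose tag does not match: tampered in transit
    or produced without the shared key."""
    message, tag = signed[:-32], signed[-32:]
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message tampered with or sender unauthorized")
    return message

print(verify(sign(b"WriteRequest: setpoint=75")))
```

Building such checks into the protocol itself, rather than leaving them to each vendor, is exactly the gap OPCUA closes.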
10. Support for new command calls
In many applications, executing commands, such as starting or stopping a drive or downloading a file to a device, is just as essential as reading and writing values. The OPC command specification defines how to execute such commands, but it is only realized in OPCUA and cannot be used with traditional OPC.
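Modeling commands as callable methods on a device object, rather than as raw value writes, can be sketched like this; the class and method names are illustrative, not OPCUA's actual method-call service:

```python
class Drive:
    """Toy device exposing named commands, sketching the idea of
    invoking methods on a node instead of writing magic values."""
    def __init__(self):
        self.running = False
        # Registry mapping command names to executable actions.
        self.methods = {"Start": self.start, "Stop": self.stop}

    def start(self) -> str:
        self.running = True
        return "started"

    def stop(self) -> str:
        self.running = False
        return "stopped"

    def call(self, method_name: str) -> str:
        # A client invokes a named command with defined semantics,
        # rather than toggling an opaque register bit.
        return self.methods[method_name]()

drive = Drive()
print(drive.call("Start"), drive.running)  # started True
print(drive.call("Stop"), drive.running)   # stopped False
```

The benefit over plain writes is that each command has explicit semantics and a defined result, which is what the command specification standardizes.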