Autonomous IoT Lifecycle Management
With the advent of the Fourth Industrial Revolution, digitalization has thrown up complex challenges in lifecycle support. The future of lifecycle management will see autonomous engines underpinned by Artificial Intelligence.
A distinct feature across all industrial sectors is how product maintenance, service, and lifecycle management are handled. Given the many rules and regulations that govern the industry and the high cost of procurement, it is not unusual for customers to expect near one-hundred-percent uptime of their systems and lifecycle support of at least two decades.
Unlike consumer electronics, where software updates on mobile phones and computers need only a quick restart, industrial plants follow detailed maintenance schedules because unplanned downtime often has very costly consequences, both economic and competitive. In other words, reliability and availability take priority, and providing solutions that fulfill these expectations is at the core of the industrial business.
The Third Industrial Revolution, which unfolded during the 20th century, saw closed automation and process control systems in which the components of a piece of machinery, certain plant areas, and sometimes the entire plant came from the same vendor. This ensured all parts of the system functioned in harmony. But from the late 1980s onward, breakthrough open interfaces like OPC and industrial communication protocols such as PROFIBUS or FOUNDATION Fieldbus allowed plant operators to connect components from various suppliers.
Although these interfaces were based on well-defined standards, they resulted in more complicated lifecycle management, as the components that interacted with each other were no longer owned by one company. Moreover, version management of the standardized interfaces and communication protocols themselves forced companies to evolve their components, making interoperability more complex. So the key to ensuring product availability and reliability was not just connectivity but also backwards compatibility, as plants often run a mix of old and new components. While established standards largely guaranteed connectivity, the question of ensuring backwards compatibility remained unresolved, as no one controls the hundreds of combinations in which these systems are set up.
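A minimal sketch can make the backwards-compatibility problem concrete. The component names, version numbers, and the "same major version" rule below are all hypothetical illustrations (loosely modeled on semantic versioning), not any vendor's actual compatibility policy:

```python
# Hypothetical sketch: deciding whether a proposed device upgrade is safe
# against the protocol versions already installed in a plant.
# Rule assumed here (illustrative only): an upgrade is backwards
# compatible with a peer if the major version number is unchanged.

def is_backwards_compatible(new_version: str, installed_version: str) -> bool:
    """Compare only the major version, e.g. '2.7' vs '2.4' -> compatible."""
    return new_version.split(".")[0] == installed_version.split(".")[0]

def safe_to_deploy(upgrade: dict, plant_inventory: dict) -> bool:
    """Reject the upgrade if any peer device it communicates with
    would be left on an incompatible protocol version."""
    return all(
        is_backwards_compatible(upgrade["protocol_version"], plant_inventory[peer])
        for peer in upgrade["peers"]
    )

inventory = {"plc-01": "2.4", "valve-17": "2.1"}           # invented devices
upgrade = {"protocol_version": "2.7", "peers": ["plc-01", "valve-17"]}
print(safe_to_deploy(upgrade, inventory))  # True: same major version
```

In a real plant the rule is far messier than a major-version check, which is exactly the article's point: no single party can enumerate all the combinations.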
With scheduled maintenance planned at least a year ahead for complex industrial systems, the deployment of step-wise upgrades – routine for mobile phone software – is not feasible. To comply with changing standards, industrial suppliers offer only the latest versions of their components, so plant operators are often faced with devices that are not backwards compatible. In an environment where hundreds of software and hardware devices are supplied by scores of vendors, neither the industrial supplier nor the plant operator can guarantee that the latest version of a single device will not disturb the reliability of the system.
A bigger challenge is that it becomes difficult to trace the effects of incompatible devices which might disturb the entire system by slowing it down or changing operational parameters. As a result, the plant operator and suppliers have to spend a considerable amount of time and money to pinpoint the root of the problem.
Realizing these challenges, many organizations started building huge test labs where they test devices for connectivity and backwards compatibility. These companies created semi-open systems in which only devices that passed the interoperability tests could be used at customer sites. Such devices were put on an approved reference list. Soon, plant operators preferred to use devices and the corresponding software components provided by such device libraries. This was a win-win situation, as it guaranteed system reliability for operators and helped suppliers and engineering companies avoid hours of unproductive troubleshooting.
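The approved reference list described above amounts to a simple whitelist of tested combinations. The vendors, models, and firmware versions below are invented for illustration:

```python
# Illustrative sketch of an approved reference list: only device/firmware
# combinations that passed the lab's interoperability test may be
# deployed at customer sites. The entries are hypothetical.

APPROVED = {
    ("VendorA", "FlowMeter-100", "1.3"),
    ("VendorB", "Drive-X", "4.0"),
}

def may_deploy(vendor: str, model: str, firmware: str) -> bool:
    """A device is cleared for deployment only if this exact
    vendor/model/firmware combination passed the interoperability test."""
    return (vendor, model, firmware) in APPROVED

print(may_deploy("VendorA", "FlowMeter-100", "1.3"))  # True
print(may_deploy("VendorA", "FlowMeter-100", "2.0"))  # False: never tested
```

Note that the lookup is deliberately exact: even a newer firmware of an approved model fails the check until it has been through the lab, which is what made the referral system trustworthy.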
While the referral system worked for many years, the advent of the Fourth Industrial Revolution took complexity to a whole new level. In this era of digitally connected devices, it is possible to gain new insights and build disruptive solutions by collecting and analyzing data from various sources in the Internet of Things. Where earlier there were a handful of established industrial vendors, now there are countless suppliers and startups building solutions for enhanced industrial applications. The ecosystem is growing tremendously, without any broadly recognized standards that ensure interoperability.
In this scenario, the value and quality of the solution depend on the availability and reliability of the data. Hence the answer to the challenge of lifecycle management in the context of the Internet of Things is to move the core of version management from the system level to the solution level. However, this raises new questions: who owns solution lifecycle management that spans multi-vendor applications, where is it hosted, who keeps it up to date, and how does it get deployed?
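One way to picture version management at the solution level is a solution manifest: the solution declares which data-interface versions it requires, and each connected source is validated against that manifest regardless of which vendor supplies it. The feed names, version scheme, and matching rule below are all hypothetical:

```python
# Sketch of solution-level version management: instead of tracking each
# device, the solution pins the data-interface versions it depends on
# and flags any source that does not satisfy the manifest.
# All names and version numbers are invented for illustration.

solution_manifest = {
    "vibration-feed": "2.x",     # requires any 2.* interface
    "energy-meter-feed": "1.x",  # requires any 1.* interface
}

def major_of(version: str) -> str:
    """Reduce a reported version like '2.3' to its major series '2.x'."""
    return version.split(".")[0] + ".x"

def validate_sources(manifest: dict, reported: dict) -> list:
    """Return the sources whose reported interface version does not
    satisfy the solution's manifest (missing sources also fail)."""
    return [
        src for src, required in manifest.items()
        if major_of(reported.get(src, "0.0")) != required
    ]

reported = {"vibration-feed": "2.3", "energy-meter-feed": "3.1"}
print(validate_sources(solution_manifest, reported))  # ['energy-meter-feed']
```

The open questions the article raises remain: such a manifest still needs an owner, a host, and an update process spanning every vendor in the solution.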