The Internet of Things (IoT), born from machine-to-machine (M2M) technology, has been a top priority for businesses across industries for several years now. Nevertheless, the concept still has a long way to go before it matures. Along that path, technological, economic, and social factors will converge to create a new digital stage for how we live, work, and play. This is a long-term vision, and we are only at the beginning of the journey.
Networking, intelligence, and autonomy
Most organizations view the Internet of Things (IoT) as a holistic system composed of multiple stages. The general approach is to first connect devices, then make them intelligent, and finally enable them to act autonomously; self-driving cars are a typical example of that final stage. Initially, the focus was on interconnecting devices through networks. Once you recognize the IoT as a holistic system, however, you discover the other underlying components: the broader technological and business requirements.
The reality is that not all of these components are yet mature, and some technologies have not yet found effective ways to support the intelligence and autonomy the IoT vision demands. The question of where to deploy computing resources, for example, deserves particular discussion.
Fluid computing
Determining where to deploy computing resources across the entire Internet of Things (IoT) is a crucial question. Computing power has oscillated between centralized and distributed approaches over the past four decades, with the latest trend being cloud computing—another form of centralization. There's a prevailing notion that we're returning to a distributed computing model, but I don't believe this is inevitable. Yes, we need to deploy more computing power at the network edge, but to fully realize the potential of the IoT, what we truly need is fluid computing.
For the Internet of Things (IoT) to reach its full potential, systems need to be able to access and utilize computing resources fluidly. In the cloud, enterprise-grade virtualization has enabled elastic computing, allowing enterprises to run applications on any available computing resources, regardless of where those resources are located.
Let computing power flow between the edge, fog, and cloud.
Wind River has been working with its customers to plan their future and show them how to embrace and scale the concept of fluid computing, with the goal of achieving optimal workload balancing across the edge, fog, and cloud of the Internet of Things. This involves creating a topology that lets businesses fluidly tap computing resources at any tier, deploying workloads to the best available resources as needed.
Ultimately, this will create an environment where computing resources can meet the needs of a wide range of applications and maximize resource utilization efficiency. However, this is a very complex undertaking, and for many, it represents a completely new IT vision.
For some time now, embedded systems developers have been consolidating workloads along the horizontal axis: by applying embedded virtualization technology to federated systems, they have reduced overall costs. The next challenge is coordinating and consolidating workloads along the vertical axis, between the edge, fog, and cloud.
One key aspect is leveraging cloud architecture, rather than continuing to build custom embedded solutions for customers. This is not easy to do, and the greater challenge lies in doing it effectively while maintaining the system integrity, performance, and determinism required by critical infrastructure.
Drive business model transformation with software-defined technologies
Wind River has always focused on helping customers address computing resource challenges and realize the vision of the Internet of Things (IoT), while also keeping an eye on the business models that the future market will create. As we accelerate our progress on the complete IoT ecosystem, we must consider a substantial transformation of our business models. The shift towards Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) in cloud computing and telecom data centers has already profoundly impacted the business models of traditional equipment vendors. Their past profit strategies have been tightly tied to dedicated solutions.
End users are driving the decoupling of computing platforms, software infrastructure, application software, and business processes, which will bring about disruptive changes to the supply chain, forcing enterprise solution providers to rethink how they create and capture value. Cisco's transformation, for example, is a prime example of this evolution.
This trend is also reaching traditional embedded systems, such as the industrial control market. Operators of manufacturing and process plants are striving to keep existing equipment functioning in the new IoT environment, to respond to market changes in innovative ways, and to further reduce costs. They are exploring how virtualization and new topologies can decouple these layers, enabling the flexible deployment of innovative technologies.
Achieving federated systems across heterogeneous networks
Innovative applications cannot run on traditional IT infrastructure; they require new fluid computing architectures, including novel edge computing platforms, that deliver ultra-low latency, comprehensive security spanning the edge and cloud, carrier-grade uptime, and support for minimal resource footprints. Typical edge computing platforms differ significantly from the enterprise-grade core infrastructure running in cloud data centers. Furthermore, cost-effective federated systems must be established between distributed cloud platforms and edge computing resources, with a cloud-agnostic abstraction over both.
At the recent Mobile World Congress (MWC 2018) in Barcelona, VMware and Wind River jointly demonstrated a multi-tenant service spanning multiple cloud platforms using the ONAP federated computing architecture. This is the result of collaboration under the ONAP Multi-VIM/Cloud Project, to which VMware and Wind River are leading contributors. The project aims to advance the design of ONAP and the deployment of cloud-agnostic infrastructure environments, including OpenStack and its various versions, public and private clouds, microservice containers, and other instances. In the demonstration system, Wind River Titanium Cloud sits at the edge, while VMware vCloud NFV OpenStack Edition sits at the core. The application scenarios are virtual CPE and virtual IMS. In both scenarios, the edge components run on Wind River Titanium Cloud and the core components run on VMware VIO. The ONAP orchestrator automates data flow between the edge and core clouds and controls overall service delivery and lifecycle management.
This federated architecture across heterogeneous networks (VMware VIO and Wind River Titanium Cloud) provides the high-value edge services that service providers require, whether to expand their existing networks or add new ones.
Industrial systems are learning from telecommunications systems
For the past 50 years, the service model of industrial automation systems could be summarized as "plug and pray": whenever a new piece of equipment is added to the system, people can only pray that it works properly, because the cost of repair or replacement if problems do occur is incalculable. Generator sets installed in nuclear reactors are one example. In the era of the Internet of Things, we both can and should find new ways to solve these kinds of problems.
In today's rapidly evolving technological landscape, industrial enterprises that fail to leverage these new technologies to accelerate response times and reduce operating costs will inevitably be at a competitive disadvantage. The point here is that industrial enterprises are feeling this pressure and realizing they can study the topological shifts occurring in cloud computing and telecommunications data centers to determine which new technologies apply to their own sectors.
These approaches may sit uneasily with the traditional business models of industrial-market suppliers, much as we saw in the cloud computing and telecommunications markets, but there is no need to panic. We have already seen industrial control companies actively learning from adjacent industries, adapting their value propositions, and rethinking the profit strategies that should accompany this transformation. They are beginning to realize that their value will increasingly be generated at the application and business process levels, rather than at the computing platform and software infrastructure levels.
Transforming processes is the key to the digital revolution.
As a technology service provider, the key technological challenges we are rapidly addressing include: 1) connecting different devices; 2) making them intelligent; and 3) enabling these devices to learn and act autonomously. Realizing the goals promised by the Internet of Things (IoT) vision also requires the rapid development of business models to match consumer demand with supplier capabilities. For many businesses, this is a very different and significant shift, and many companies and managers are concerned that this process will disrupt their existing revenue engines.
Of course, computing resources are only one aspect propelling the Internet of Things (IoT) forward. Another obvious aspect is data. Data is the new oil. While we've made some progress in capturing and utilizing data, we haven't yet truly experienced the enticing results promised by the IoT.
I've recently been reading a lot about the Industrial Revolution, whose biggest breakthrough was the use of steam to power the conveyor belts and gears of factories. Soon after, large DC motors replaced the steam engine. Electricity, however, didn't fundamentally change manufacturing; it merely provided a more efficient power source for the same belts and gears. The real transformation began only with the widespread, distributed adoption of many small electric motors. That was when humanity truly began to understand and change traditional manufacturing processes, fundamentally altering production patterns.
Our approach to data is at a similar stage. Data isn't a new phenomenon; it has always been there. The key is finding more efficient ways to access, store, and analyze it. If we can't deploy computing resources at the edge of the Internet of Things, we haven't done anything fundamentally new: we're still at the "large electric motor" stage, without actually changing the production process.
Machine learning is key to the Internet of Things
Machine learning is being widely discussed as a key technology with transformative significance, and it is crucial for realizing the lofty vision of the Internet of Things.
Machine learning dates back to the 1940s and the work of Alan Turing. Early methods used symbolic programming, relying on hand-written rules as the basis for learning. Today, development has shifted towards pattern recognition, applying sophisticated learning techniques to data. As machine learning has moved in this direction, we have already seen tremendous improvements in application efficiency. While still in a nascent stage, it is developing at an astonishing speed, and its impact will be transformative.
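The shift from symbolic rules to pattern recognition can be illustrated with a toy example. The sketch below contrasts a hand-written rule with a classic perceptron, one of the earliest pattern-recognition algorithms, that learns the same AND behavior purely from labeled examples. It is an illustration of the idea, not any production technique.

```python
# Symbolic era: a human encodes the logic explicitly as a rule.
def rule_based(x1: int, x2: int) -> int:
    return 1 if (x1 and x2) else 0

# Pattern-recognition era: a perceptron learns the same mapping from data.
def train_perceptron(samples, epochs=20, lr=0.1):
    """Classic perceptron learning rule: nudge weights toward each
    misclassified example until the data is separated."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Labeled examples of the AND function; no rule is ever written down.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# The learned model agrees with the hand-written rule on every input.
assert all(predict(x1, x2) == rule_based(x1, x2) for (x1, x2), _ in data)
```

Modern deep learning is vastly more sophisticated, but the underlying contrast is the same: the behavior comes from data and training rather than from rules a programmer wrote by hand.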
However, supporting machine learning inevitably requires greater computing resources. Our main challenge is pushing those resources out to the network edge where the data is generated, allowing devices to make critical decisions quickly, without the delays of round-tripping data to the cloud. Ultimately, enterprises will want a fluid computing model that lets devices be intelligent wherever and whenever they need to be. That is what will allow us to truly leverage the power of machine learning.
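One common way to realize this, sketched below with entirely hypothetical models and thresholds, is confidence-gated escalation: a small on-device model decides immediately when it is confident, and only ambiguous samples pay the latency cost of a cloud round trip.

```python
EDGE_CONFIDENCE_THRESHOLD = 0.9  # assumed tunable cutoff

def edge_infer(sample: dict) -> tuple[str, float]:
    """Stand-in for a small on-device model: returns (label, confidence).
    The vibration-based scoring here is purely illustrative."""
    score = sample["vibration"] / 10.0
    label = "fault" if score > 0.5 else "ok"
    confidence = abs(score - 0.5) * 2  # 0 near the boundary, 1 far from it
    return label, confidence

def cloud_infer(sample: dict) -> tuple[str, float]:
    """Stand-in for a larger, slower, more accurate cloud model."""
    return ("fault" if sample["vibration"] > 5 else "ok", 0.99)

def classify(sample: dict) -> tuple[str, str]:
    """Decide locally when confident; escalate ambiguous cases."""
    label, conf = edge_infer(sample)
    if conf >= EDGE_CONFIDENCE_THRESHOLD:
        return label, "edge"                 # act immediately, no round trip
    return cloud_infer(sample)[0], "cloud"   # pay the latency only when needed

print(classify({"vibration": 9.8}))  # clear-cut: decided at the edge
print(classify({"vibration": 5.2}))  # ambiguous: escalated to the cloud
```

The design goal is that the common, clear-cut cases never leave the device, so the average decision latency approaches that of the edge model alone.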
We are determined to proactively transform because competitors could arrive at any time.
Strong will is essential for a successful Internet of Things (IoT) transformation, and the extent of that determination matters. Any organization for which IoT and machine learning are new and unfamiliar territory must commit to learning them; the importance of the willingness and ability to learn cannot be overstated.
Organizations that believe they can deter new competitors from neighboring markets simply by relying on their existing domain knowledge are making a grave mistake! Given the right datasets and sufficient computing resources, programmers, even with limited domain expertise, can develop algorithms that yield significant results. Therefore, organizations must recognize the need to accelerate the development and application of machine learning technologies, or they will see new competitors from nearby markets rapidly catching up in their rearview mirrors.
Applying machine learning isn't a prize to be won today, but it is an attainable goal in the near future, generally within a year, and the development of machine learning and fluid computing will accelerate significantly over the next decade. Fluid computing, IT scalability, and machine learning are fascinating technologies, and from a strategic perspective Wind River is exploring how to incorporate all of them.
Conclusion
Fluid computing and machine learning resonate strongly with engineers, but they instill a sense of dread in business professionals, forcing them to rethink how they create value, since their roles no longer guarantee their position as they once did. From a technological perspective, this will be a refreshing era. From a business perspective, it will be an era of profound disruption.