23 February 2021, 2:00 p.m. | Harry Schubert
Data processing at the edge has advantages, but it also presents challenges.
For faster and more responsive computing, we will need powerful edge computing systems. As computing becomes smaller and more economical, edge computing will become ubiquitous. But it comes with its own challenges, according to Kevin Dallas, President and Chief Executive Officer of Wind River.
? What are the challenges facing developers of intelligent edge computing systems?
! Kevin Dallas: The next frontier of digital transformation comes at the intelligent edge: more than 50 percent of all data processing and AI will be done at the edge by 2022, and 700 billion U.S. dollars will be spent on new edge-based AI systems by 2025. This can provide faster, more responsive computing, and as computing becomes smaller and more economical, this kind of intelligent edge system will become ubiquitous. However, edge computing comes with its own set of old-world and new-world challenges.
Kevin Dallas, President & Chief Executive Officer, Wind River: With more than 25 years of experience driving digital innovation and growth at technology companies, Kevin Dallas is now responsible for all aspects of the Wind River business globally. He joined Wind River from Microsoft, where he most recently served as corporate vice president for cloud and AI business development. Prior to joining Microsoft in 1996, Dallas held roles at NVIDIA and National Semiconductor – now part of Texas Instruments – in the U.S., Europe, and the Middle East, which included microprocessor design, systems engineering, product management, and end-to-end business leadership. He holds a B.Sc. degree in electrical and electronic engineering from Staffordshire University, Stoke-on-Trent, Staffordshire, England.
We are hearing about the following challenges from the customers we serve who are developing intelligent edge computing systems in mission-critical industries:
At Wind River we are addressing these challenges with our new Studio offering, which is the industry's first and only cloud-native platform for the development, deployment, operation, and servicing of mission-critical intelligent systems at the edge, where security, safety, and reliability are required.
? Why can't moving intelligence to the edge be countered by moving cloud servers to the edge?
! Dallas: Moving intelligence to the edge is about moving the ability to run compute, data processing, and AI/ML functions to the edge. This can only be done through the adoption of a cloud-native platform at the edge.
Technically, moving intelligence to the edge could be met by moving cloud servers to the edge – that is the point of using containers on individual devices and Kubernetes to manage edge stamps. However, the size of these server images can be daunting, and it may not be economical to run them on an edge device; additionally, applications running on cloud servers may have dependencies on cloud services/APIs.
Edge devices vary in size, computing power requirements, application diversity, environment, development cost, and, most importantly, operating cost. Today's powerful embedded processors are still very capable of delivering varied workloads thanks to their multi-core architectures, hardware-supported virtualization, and application-specific compute engines. Some of the edge devices deployed will remain in commission for decades, requiring a different system lifecycle than datacenter-like hardware.
Important questions here include: How do we enable distributed intelligence from cloud to edge so that we run the right workload on the right, lowest-cost platform? How do we employ modern processors and GPUs with software? How do we adopt stateless, microservices-oriented architectures and applications at the edge? How do we provide a common development and deployment experience for developers building cloud-native applications as well as for those developing purpose-built applications?
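The "right workload on the right, lowest-cost platform" question can be sketched as a simple placement heuristic. This is an illustrative sketch, not anything from Wind River Studio; the thresholds and the `Workload` fields are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float       # hard response-time bound
    data_rate_mbps: float       # raw data the workload consumes
    needs_local_actuation: bool # must it drive hardware directly?

# Rough, illustrative figures -- a real deployment would derive these
# from measured network round trips and per-GB transfer pricing.
CLOUD_ROUND_TRIP_MS = 80.0
UPLINK_BUDGET_MBPS = 10.0

def place(w: Workload) -> str:
    """Pick the lowest-cost tier that still meets the workload's constraints."""
    if w.needs_local_actuation or w.max_latency_ms < CLOUD_ROUND_TRIP_MS:
        return "edge"   # a cloud round trip would violate the deadline
    if w.data_rate_mbps > UPLINK_BUDGET_MBPS:
        return "edge"   # cheaper to process locally than to ship raw data
    return "cloud"

print(place(Workload("robot-vision", 20, 800, True)))        # edge
print(place(Workload("fleet-analytics", 5000, 0.1, False)))  # cloud
```

In practice such decisions also weigh security, data confidentiality, and device lifetime, as the interview goes on to discuss; the point of the sketch is only that placement follows from the workload's constraints, not from a fixed preference for edge or cloud.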
Wind River is addressing these specific challenges with Wind River Studio supported by four decades of experience to ensure that these constrained devices and their developers will still be able to realize the benefits of containerized application development and deployment to the edge.
? An edge computing system can be considered a bridge between, or a hybrid of, the embedded domain and the cloud environment. Which part dominates?
! Dallas: It depends; the move to edge computing is about right-sizing workloads based on where they are most economical to deploy, operate, and support. An edge computing system can be considered a bridge, or even a cloud compute on-ramp, in most situations, and especially in a brownfield situation. In some cases, however, the edge device itself is the embedded device with cloud connectivity – take, for example, an industrial cobot connected to the cloud through 4G/5G. Such a device would at the same time be capable of making and delivering real-time decisions on the floor.
In some cases, an edge solution is the right size when it can capture 4K video at 30 frames per second so that it can identify objects and their position on a manufacturing line. To drive a robot arm to pick up objects in real time from a manufacturing line, you must run that workload on the local device. Semantic data about the object being picked up is then communicated to the ERP system running in the cloud. That ERP system is connected to hundreds of other robots doing the very same job, and the result is a macro status of the whole business. The recognition workload is right-sized on the edge. The ERP system is right-sized to a local, public, or government cloud.
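The split described here – heavy vision inference on the device, only a compact semantic summary sent upstream – can be sketched as follows. The `detect_objects` stand-in and the payload fields are hypothetical; a real system would run an on-device model on each frame and use whatever schema its ERP integration expects.

```python
import json

def detect_objects(frame):
    """Stand-in for on-device inference; a real system would run a
    (likely quantized) object detector on the captured frame here."""
    return [("widget", 0.42, 0.17)]  # (label, x, y) in normalized coords

def to_semantic_payload(frame_id, detections):
    # Only this semantic summary crosses the network --
    # the 4K frame itself (tens of MB raw) never leaves the device.
    return json.dumps({
        "frame": frame_id,
        "objects": [{"label": l, "x": x, "y": y} for (l, x, y) in detections],
    })

frame = object()  # placeholder for a captured 4K frame
payload = to_semantic_payload(1, detect_objects(frame))
print(payload)    # a few hundred bytes instead of megabytes per frame
```

At 4K/30 fps the raw stream would saturate most uplinks, which is why the recognition workload is right-sized to the edge while only its results reach the cloud-hosted ERP.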
The particular use case or industry at hand is a key determining factor, since each will have its own constraints in terms of cost, power, and weight. We know, for instance, that in some cases 4G or 5G will not be an option due to cost. And ultimately, if there are no bandwidth, cost, and/or latency requirements, everything can be put in the cloud. In some cases, you will need to take cybersecurity into account as a primary requirement. In other cases, the confidentiality of the data will be key, and you will need a private cloud – and therefore edge computing – to host that data.
The reality is we are not creating less data, we are creating more, and the only way to process that data is to leverage the horizontal scale of the cloud, such as enabling AI/ML to make decisions and identify opportunities that may not be immediately apparent. However, not all data is good data, and that is where edge compute can add significant value: by pre-filtering which data goes to the cloud, and then by orchestrating cloud insights and analytics when they come back down to the local level, without burdening the cloud with the responsibility of keeping up with the local embedded devices.
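Pre-filtering which data goes to the cloud can be as simple as forwarding only readings that deviate from a rolling local baseline. This is a minimal sketch of that idea – the window size and z-score threshold are arbitrary assumptions, not parameters from the interview.

```python
from statistics import mean, stdev

def prefilter(readings, window=20, z=3.0):
    """Forward only readings that deviate strongly from the rolling baseline;
    everything else is handled (or discarded) locally at the edge."""
    forwarded = []
    for i, r in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < 2:
            continue  # not enough context yet to judge the reading
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(r - mu) > z * sigma:
            forwarded.append((i, r))  # candidate for cloud-side analytics
    return forwarded

# Steady sensor signal with one injected fault
normal = [20.0 + 0.1 * (i % 5) for i in range(50)]
normal[30] = 95.0
print(prefilter(normal))  # only the anomaly is sent upstream
```

Here the edge device ships one reading out of fifty; the cloud sees the anomalies it needs for fleet-wide analytics without ingesting the full raw stream.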
? What skills will be required in the future from developers who have to realize an edge computing system or an embedded system for edge computing?
! Dallas: This new world of intelligent systems at the edge requires a new range of skills, ways of thinking, and knowledge. Knowing where the data is primarily coming from – the edge or the cloud – will shape the way teams should be organized and what expertise is most critical. Some of this expertise will need to include:
Adaptability is also key, given that not all requirements will be known at the start. Developers must therefore design with ambiguity in mind, as modularly as possible, so that they can adapt quickly.
? What do developers have to pay attention to when they want to design and deploy edge computing systems for industrial applications which require long-term safety features?
! Dallas: There are several considerations for developers of edge computing systems for industrial applications. With safety systems, failure could harm human life, so first and foremost knowing the safety certification requirements and/or regulations that need to be complied with is essential.
Based on research we recently conducted in Europe and the U.S., we learned that industrial companies are not only moving to 5G but that their primary concern is cybersecurity, which is now de facto a cost of entry, and can no longer be considered a “nice to have” or be an afterthought. Additionally, timely long-term security updates have become a primary area of concern in industrial applications, so working with a company that has deep expertise and proven experience in both safety and security is critical.
Because industrial applications and systems can be, and are, in the field for decades, there also has to be a plan for how they will be built, deployed and maintained for the entire life of the product – from concept to grave, the complete product lifecycle.
Companies must start thinking about how to capture “tribal knowledge” and clever solutions to complex problems. The entire lifecycle now has to be documented so these systems can be properly serviced through that lifecycle. This is why it's so important to have a platform like Wind River Studio that addresses the entire product lifecycle and is focused on safety, security, and reliability.
? Thanks for the interview.