FPGAs Conquer Industrial Applications

Edge computing for AI requires performance and flexibility

March 4, 2020, 6:00 a.m. | Patrick Dorsey

With Industry 4.0, data volumes and the requirements placed on data transmission and processing are increasing. For Patrick Dorsey of Intel, it is clear that the required computing power must move closer to the data source. But edge computing must be both energy-efficient and scalable.


For the industry as a whole, 2020 promises to be an interesting year, with the growth of industrial applications and AI driving global demand. The introduction of the term Industry 4.0 was a double-edged sword. On the one hand, the potential of the connected factory makes a lot of sense at every level, from advancements in control systems to applications such as predictive maintenance.

On the other, there is much vagueness about what Industry 4.0 really means in an actual implementation. Discussions about sharing potentially sensitive data with cloud applications, as well as about the challenges of real-time control in such a setup, have raised a large number of security concerns.


Industry 4.0 needs data and connectivity

As the concept has matured, it is becoming clearer that this transformation is being driven by two core concepts: data and connectivity. Data is collected by ‘things’ at the edge, ranging from sensors to control solutions, while connectivity covers the sharing of that data.

While some of that data is being shared with cloud services, enabling remote evaluation and collaborative analysis, this is occurring only on a limited dataset where there is definite value add. Increasingly, however, private clouds are being developed to deliver the benefits of Industry 4.0 while minimizing the risks of unwanted intruders gaining access to sensitive data.

FPGAs are making it possible to connect the unconnected, linking legacy fieldbus protocols with more modern technologies, such as time-sensitive networking (TSN) over Ethernet and OPC-UA, to share that data for analysis as desired.
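
As a concrete illustration of such a bridge, here is a minimal sketch that republishes a legacy fieldbus value over OPC UA using the open-source python-opcua library. The fieldbus read is stubbed out, and the endpoint, namespace, and node names are illustrative assumptions rather than part of any vendor reference design.

```python
# Minimal sketch: expose a legacy fieldbus value via an OPC UA server.
# Assumes the open-source python-opcua package (pip install opcua).
# The fieldbus read is a stub; a real bridge would poll Modbus, PROFIBUS, etc.
import time
import random
from opcua import Server

def read_fieldbus_register() -> float:
    """Stand-in for a legacy fieldbus read (e.g. a Modbus holding register)."""
    return 20.0 + random.random()  # simulated temperature in degrees C

server = Server()
server.set_endpoint("opc.tcp://0.0.0.0:4840/gateway/")  # illustrative endpoint
idx = server.register_namespace("http://example.com/legacy-bridge")

machine = server.get_objects_node().add_object(idx, "LegacyMachine")
temperature = machine.add_variable(idx, "Temperature", 0.0)

server.start()
try:
    while True:
        temperature.set_value(read_fieldbus_register())  # republish fieldbus data
        time.sleep(1.0)  # polling interval; TSN would bound delivery latency
finally:
    server.stop()
```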

Countering high latency: edge computing

Patrick Dorsey, Intel: »The introduction of the term Industry 4.0 was a double-edged sword.«
© Intel

By tackling the latency issue head on, edge computing is growing in significance, putting powerful algorithms next to the things that require control. It also addresses security concerns while building the infrastructure needed to collect the data.

For example, in a predictive analysis application, the data collected delivers a much-needed baseline profile of perfectly functioning equipment, allowing for collaborative analysis between supplier and user. In this way, the ethos of Industry 4.0 is definitely being lived, even if it is not being implemented in the manner defined by the hype.
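
To illustrate the baseline idea, the following sketch learns a simple mean-and-spread profile from readings taken while the equipment was known to be healthy, then flags later samples that drift outside it. The data and the deviation threshold are invented for the example.

```python
# Sketch: learn a baseline profile from healthy readings, then flag drift.
# All readings and the threshold k are illustrative assumptions.
from statistics import mean, stdev

def build_baseline(healthy_readings: list[float]) -> tuple[float, float]:
    """Profile of 'perfectly functioning' equipment: mean and spread."""
    return mean(healthy_readings), stdev(healthy_readings)

def is_anomalous(value: float, baseline: tuple[float, float], k: float = 3.0) -> bool:
    """Flag readings more than k standard deviations from the baseline."""
    mu, sigma = baseline
    return abs(value - mu) > k * sigma

# Example: vibration amplitudes recorded while the machine was known-good.
baseline = build_baseline([0.98, 1.02, 1.01, 0.99, 1.00, 1.03, 0.97])
print(is_anomalous(1.01, baseline))  # False: within the healthy profile
print(is_anomalous(1.45, baseline))  # True: candidate for maintenance
```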

Despite this, there are still many hurdles for industrial systems operators to overcome. By our very nature we humans often prefer to continue with what we know rather than implement a system based upon technology that we have not yet tested.

The key here is to invest continually in research, evaluating new technologies in the context of what needs to be achieved, to see where the real efficiencies and improvements lie. Only if they have the potential to deliver business outcomes that provide value to both the supplier and the customer should they be pursued.

Scalable data processing with programmable logic

One area of focus is scaling industrial systems rapidly, whether to meet unexpected demand temporarily or to respond to a period of rapid growth. In such cases fixed-function hardware solutions are making way for soft functions, but the switch taking place is not necessarily a hard one: hybrid approaches allow the best of the old and the new to be combined.

The old, hard-function approach often provides lower latency thanks to dedicated hardware, resulting in faster reaction times. The virtualization of control systems, however, is new: it provides the flexibility required to implement failover for critical solutions and the scalability to orchestrate processing resources to match instantaneous need. That flexibility will continue to be provided by the easy reconfiguration of FPGA-based platforms.

Patrick Dorsey, Intel: »By far the biggest trend challenging the industry is Artificial Intelligence.«
© Intel

By far the biggest trend currently challenging the industry is Artificial Intelligence (AI) or, perhaps more realistically termed, the use of neural networks and machine learning. This requires an end-to-end solution and is where Intel processors can play a role at every stage. Many of today’s FPGA-based applications focus on vision systems, utilizing toolkits such as OpenVINO, but there are many untapped areas still waiting to be assessed.
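
As a sketch of such a vision pipeline, the following uses the 2020-era OpenVINO Inference Engine Python API to run a network on a heterogeneous FPGA-plus-CPU target. The model and image paths are placeholders, and the HETERO:FPGA,CPU device string assumes a system with Intel's FPGA plugin of that era installed.

```python
# Sketch: image classification with the OpenVINO Inference Engine (2020-era API).
# Model files, image path, and the HETERO:FPGA,CPU target are illustrative.
import cv2
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")  # placeholder IR files
input_name = next(iter(net.input_info))
output_name = next(iter(net.outputs))

# Run on the FPGA where supported, falling back to CPU for unsupported layers.
exec_net = ie.load_network(network=net, device_name="HETERO:FPGA,CPU")

# Prepare one frame: resize to the network input and reorder HWC -> NCHW.
n, c, h, w = net.input_info[input_name].input_data.shape
frame = cv2.imread("frame.jpg")  # placeholder image path
blob = cv2.resize(frame, (w, h)).transpose(2, 0, 1)[np.newaxis, ...].astype(np.float32)

result = exec_net.infer(inputs={input_name: blob})
print("Top class:", int(np.argmax(result[output_name])))
```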

Here the focus really needs to be on applications in which causal relationships can be determined from the aggregate data provided by multiple processes. Hardware such as the Programmable Acceleration Card (PAC), coupled with the oneAPI toolkit, is one way of using FPGAs to process this data quickly in a solution that is power-, cost- and form-factor-constrained and almost certainly located at the edge. Vendors who can deliver this range of capability across scalar, vector, matrix and spatial (SVMS) architectures in CPU, GPU, AI and other accelerators will be essential to the success of the engineers using them.
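
oneAPI's DPC++ itself is C++-based; to stay in one language across this article's sketches, the same offload pattern is shown below with PyOpenCL, targeting whatever OpenCL device is available (the PAC was also programmable through an OpenCL board support package). The kernel and device selection are illustrative, not a PAC-specific recipe.

```python
# Sketch: offloading a data-parallel kernel to an accelerator with PyOpenCL.
# This OpenCL version stands in for a oneAPI/DPC++ offload; the pattern is
# the same (host buffers, device kernel, copy back). All names illustrative.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # picks an available device (CPU/GPU/FPGA)
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)
result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```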

Those planning to implement this powerful processing capability would be well advised to take a staged approach in order to avoid being overwhelmed. A first phase would look at complex event processing (CEP), while a second would employ neural networks to synthesize and train machine-learning models, creating machine-learned databases. From there, the final phase is to leverage those databases through AI implemented on an FPGA-enabled platform, with the goal of determining, or even automatically implementing, optimum actions.
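
As a hedged sketch of the first phase, the snippet below shows complex event processing at its simplest: correlating two sensor streams in a sliding window and firing a composite event when both are elevated. Window length, thresholds, and the event name are invented for illustration.

```python
# Sketch: minimal complex event processing (CEP) over two sensor streams.
# A composite event fires when temperature and vibration are both elevated
# within the same sliding window. All thresholds are illustrative.
from collections import deque

WINDOW = 5  # number of recent samples to correlate

temp_window: deque[float] = deque(maxlen=WINDOW)
vib_window: deque[float] = deque(maxlen=WINDOW)

def on_sample(temperature: float, vibration: float) -> str | None:
    """Process one joint sample; return a composite event name if one fires."""
    temp_window.append(temperature)
    vib_window.append(vibration)
    if max(temp_window) > 80.0 and max(vib_window) > 2.5:
        return "OVERHEAT_WITH_VIBRATION"  # candidate trigger for maintenance
    return None

# Feed a short simulated stream of (temperature, vibration) pairs.
stream = [(72, 1.1), (75, 1.3), (83, 1.2), (85, 2.9), (84, 3.1)]
for t, v in stream:
    event = on_sample(t, v)
    if event:
        print("event:", event, "at sample", (t, v))
```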


The Author


Patrick Dorsey, Intel
© Intel

Patrick Dorsey

is vice president in the Network and Custom Logic Group (NCLG) and general manager of FPGA and Power Product Marketing at Intel Corporation. He leads the Network and Custom Logic Group’s marketing efforts for Intel’s broad portfolio of programmable logic solutions, spanning Field Programmable Gate Array (FPGA) devices, software and development tools, intellectual property, and hardware acceleration platforms.

His organization’s responsibilities include defining product strategy, executing marketing plans, defining new business models with partners and customers, establishing industry thought leadership, and go-to-market engagement.

Dorsey joined Intel in 2016 as a senior director in FPGA Product Marketing, a position he held at Altera when it was acquired by Intel. He was responsible for worldwide product marketing and product management activities for multiple hardware products spanning hardware platforms and semiconductor devices. Following this, Dorsey was a senior director of Strategy and Planning in Intel’s Programmable Solutions Group (PSG) and was responsible for leading M&A activities and the overall strategic direction and business plan for PSG.

Dorsey has over three decades of experience in general management, sales, consulting, marketing, strategic planning and operational management in hardware and software technology, and professional services. Prior to joining Intel, Dorsey held leadership roles at Texas Instruments, Deloitte Consulting, Sun Microsystems, Xilinx, and Altera.

Dorsey holds a bachelor’s degree in computer engineering from the University of Michigan, Ann Arbor, and an MBA from the University of Michigan, Ross School of Business.

tanya.van.langenberg@intel.com

