
GPU Coder for Neural Networks

11 April 2018, 08:51 | Joachim Kroll

MathWorks: From design and simulation to the real system. (Image: Uwe Niklas | WEKA Fachmedien)

GPUs give machine learning algorithms a powerful performance boost. The GPU Coder from MathWorks generates the appropriate code.

At embedded world, MathWorks presented several examples of machine learning, such as the recognition and classification of objects and speech for autonomous systems. The principle of machine learning is that engineers develop models that abstract information from data, draw conclusions from it, and autonomously learn how to respond to input signals in the future.

But how does the machine acquire such intelligence? Machine-learning methods such as support vector machines usually run on microcontrollers, whereas GPUs have established themselves for implementing convolutional neural networks (CNNs) in deep learning. In both cases, the developed model must be translated into less abstract code, for example C, before it can be deployed to the hardware.

This translation is what the MathWorks GPU Coder performs. It brings increasingly intelligent MATLAB applications and deep learning networks onto GPUs by converting the MATLAB models into optimized Nvidia CUDA code, so that compute-intensive parts of the MATLAB code run faster. Changes in the MATLAB code can be carried over to the CUDA code at the push of a button, which keeps the two representations in sync.
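As an illustration, the following is a minimal sketch of this workflow, assuming GPU Coder and a CUDA-capable toolchain are installed; the entry-point function classifyFrame, the network file myNet.mat, and the 224×224×3 input size are hypothetical placeholders rather than details taken from the MathWorks demo:

    % classifyFrame.m -- entry-point function from which GPU Coder generates CUDA code
    function out = classifyFrame(in)
        persistent net;                                  % load the trained network only once
        if isempty(net)
            net = coder.loadDeepLearningNetwork('myNet.mat');
        end
        out = predict(net, in);                          % run inference on the input image
    end

    % Code generation script: configure a CUDA target and generate the code
    cfg = coder.gpuConfig('lib');                                % build a static CUDA library
    cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');  % use Nvidia cuDNN kernels
    codegen -config cfg classifyFrame -args {ones(224,224,3,'single')} -report

Re-running the codegen command after editing the MATLAB source regenerates the CUDA code, which corresponds to the push-button synchronization described above.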

Potential target systems for the generated code are Nvidia Tesla GPUs, Jetson system-on-modules (SoMs), and the Nvidia Drive PX platform. GPU Coder is particularly suitable for algorithms in image and speech recognition.

