“A consistent methodology is required”

Development of AI-based embedded software

27 February 2025, 14:07 | Andreas Knoll
Frank Graeber, MathWorks: “Embedded software increasingly relies on AI.”
© MathWorks

Embedded AI makes it easier for engineers to cope with the increasing complexity of technical systems. But how can AI-supported embedded software be developed, verified and debugged? Frank Graeber, Manager Application Engineering at MathWorks, provides the answers.


What role does AI play today in embedded software, e.g. for controls and predictive maintenance applications?

Embedded software increasingly relies on artificial intelligence (AI), which enables engineers to manage the complexity and versatility of modern technical systems efficiently. By using AI, systems can better fulfill dynamic requirements: AI models can adapt to changing conditions and often make more accurate predictions than would be achievable with conventional methods. Examples include reinforcement learning agents for control tasks on hardware, e.g. for gear selection in a gearbox. Another popular application is AI-based virtual sensors (also known as “soft sensors”), which analyze data from existing equipment sensors to forecast potential failures and optimize maintenance schedules, thereby reducing downtime and costs.
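
To make the idea of a virtual sensor concrete, the following minimal sketch trains a small regression model that estimates a quantity which is hard to measure directly, here a hypothetical winding temperature, from signals that are already available on the device. It is a generic illustration in Python, not a MathWorks workflow; all signal names, coefficients and data are invented.

    # Minimal "soft sensor" sketch: estimate an unmeasured quantity from
    # existing sensor channels. All data below is synthetic and purely illustrative.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    current = rng.uniform(0, 50, n)      # measured motor current in A
    speed = rng.uniform(0, 3000, n)      # measured shaft speed in rpm
    ambient = rng.uniform(10, 40, n)     # measured ambient temperature in °C

    # Hypothetical "true" relation plus measurement noise (stand-in for recorded plant data).
    winding_temp = ambient + 0.8 * current + 0.005 * speed + rng.normal(0, 1.0, n)

    X = np.column_stack([current, speed, ambient])
    X_train, X_test, y_train, y_test = train_test_split(X, winding_temp, random_state=0)

    # Small network so that deployment on an embedded target remains plausible.
    soft_sensor = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
    soft_sensor.fit(X_train, y_train)
    print(f"R^2 on held-out data: {soft_sensor.score(X_test, y_test):.3f}")

In a real project, the same pattern would be applied to recorded plant data, and the trained model would then be prepared for deployment on the target hardware.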


How can corresponding AI-based embedded software be developed?

When developing AI for embedded systems, it is crucial to consider the characteristics of the target platform right from the start: How much memory can I use on the embedded system? What are the requirements for the speed of the predictions? In addition, a methodology is needed to ensure that the models behave on the hardware as closely as possible to how they behave in the training environment, and that they can be easily integrated into the rest of the software, for example in a hardware-in-the-loop (HIL) test. After all, AI is often "just" one component in a more complex software package. In summary, engineers require software environments that meet all these requirements to develop embedded AI as efficiently as possible. This can be achieved with Model-Based Design.
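
As a rough illustration of such early checks against the target platform, the following Python sketch compares the weight memory and the host-side inference time of a stand-in model against purely hypothetical budgets. The real figures would come from the platform datasheet and, ultimately, from processor-in-the-loop measurements on the target.

    # Early sanity check of an AI model against embedded constraints.
    # RAM budget and latency target are hypothetical placeholders.
    import time
    import numpy as np

    RAM_BUDGET_BYTES = 64 * 1024   # assumed: 64 KiB available for model weights
    LATENCY_TARGET_S = 1e-3        # assumed: one prediction per millisecond

    # Stand-in for a trained model: two small dense layers in float32.
    rng = np.random.default_rng(0)
    W1, b1 = rng.standard_normal((8, 32)).astype(np.float32), np.zeros(32, np.float32)
    W2, b2 = rng.standard_normal((32, 1)).astype(np.float32), np.zeros(1, np.float32)

    def predict(x):
        h = np.maximum(x @ W1 + b1, 0.0)   # ReLU layer
        return h @ W2 + b2

    weight_bytes = sum(a.nbytes for a in (W1, b1, W2, b2))
    x = rng.standard_normal((1, 8)).astype(np.float32)
    start = time.perf_counter()
    for _ in range(1000):
        predict(x)
    latency = (time.perf_counter() - start) / 1000   # host time, not target time

    print(f"weights: {weight_bytes} B (budget {RAM_BUDGET_BYTES} B)")
    print(f"host latency: {latency * 1e6:.1f} µs (target {LATENCY_TARGET_S * 1e6:.0f} µs)")
    assert weight_bytes <= RAM_BUDGET_BYTES, "model too large for the assumed RAM budget"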


What does Model-Based Design (MBD) mean in this context?

Model-Based Design is a methodical approach that structures and optimizes the entire development process of embedded systems. MBD integrates requirements, the actual implementation and test artifacts in a uniform digital thread, which enables a consistent and traceable development process. This approach includes the early design of system models, prototyping, validation through HIL tests and automatic C/C++ source code generation – capabilities that are also relevant in the development of AI models. MBD therefore facilitates the development of robust AI-supported systems by allowing engineers to test and optimize their designs in a virtual environment before physical prototypes are created. Throughout the development process, engineers can benchmark their AI models within their designs to ensure they meet their requirements for memory usage and inference speed. Overall, this leads to more efficient development workflows and higher quality end products.


How can verification and debugging of AI-based embedded software be accomplished?

Verification and debugging of AI-based embedded software require rigorous testing and verification processes to ensure product quality. Tools such as “Polyspace” from MathWorks provide robust static and dynamic testing for C/C++ code within continuous integration pipelines, enabling developers to detect errors early and collaborate efficiently. With automatically generated C/C++ code, debugging can be carried out in the context of SIL (software-in-the-loop) or PIL (processor-in-the-loop) tests using a suitable debugger for the host or target platform; this also covers the prediction step of the AI models.
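
The comparison behind such SIL tests can be pictured with the following Python sketch: a floating-point reference model is run back-to-back against a “deployed” variant over shared test vectors, and the maximum deviation is checked against a tolerance. Here the deployed path is merely a float32 re-implementation standing in for the generated C/C++ code; the models, data and tolerance are invented for illustration.

    # Back-to-back (SIL-style) comparison pattern, suitable for a CI pipeline.
    import numpy as np

    rng = np.random.default_rng(1)
    W = rng.standard_normal((8, 1))

    def reference_model(x):
        return np.tanh(x @ W)                       # float64 reference

    def deployed_model(x):
        Wq = W.astype(np.float32)                   # stand-in for the target build
        return np.tanh(x.astype(np.float32) @ Wq)

    test_vectors = rng.standard_normal((256, 8))
    max_err = np.max(np.abs(reference_model(test_vectors) - deployed_model(test_vectors)))

    print(f"max deviation: {max_err:.2e}")
    assert max_err < 1e-5, "deployed model deviates from the reference beyond tolerance"

In practice, the second path would execute the generated code on the host (SIL) or on the target processor (PIL) rather than a Python re-implementation.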


What examples of this can be seen at the MathWorks booth at embedded world 2025?

At embedded world 2025, MathWorks will present a series of demos that illustrate the application of AI and embedded software in various scenarios. One demo shows a workflow for AI-supported fault classification on edge devices, which uses physics-based models to generate data sets for fault-free and faulty states. Another demo shows the implementation of a battery management system (BMS) for electric vehicles. The BMS consists of several components and includes an AI model for estimating the state of charge, which acts as a virtual sensor; it runs as an embedded AI model on NXP hardware. In a further demo, a neural network acting as a virtual AI sensor estimates the rotor position of a motor in real time. This virtual sensor runs on the Parallel Processing Unit (PPU) of an Infineon AURIX microcontroller of the TC4x series.
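
The idea of generating training data from physics-based models, as in the first demo, can be sketched generically as follows. This is a Python illustration, not the booth demo itself: a simple signal model produces vibration-like data for fault-free and faulty states, a few features are extracted, and a classifier is trained that could later be deployed to an edge device. All frequencies, noise levels and features are made up.

    # Physics-inspired data generation plus fault classification (illustrative only).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    fs = 1000                          # assumed sampling rate in Hz
    t = np.arange(0, 1.0, 1 / fs)      # 1 s of samples per measurement

    def simulate(faulty):
        base = np.sin(2 * np.pi * 50 * t)                              # fundamental
        fault = 0.4 * np.sin(2 * np.pi * 120 * t) if faulty else 0.0   # fault harmonic
        return base + fault + 0.2 * rng.standard_normal(t.size)

    def features(sig):
        spectrum = np.abs(np.fft.rfft(sig))
        return [sig.std(), spectrum[120], np.max(np.abs(sig))]   # 1 Hz bins: index 120 = 120 Hz

    labels = rng.integers(0, 2, 400).astype(bool)
    X = np.array([features(simulate(f)) for f in labels])
    y = labels.astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
    print(f"test accuracy on synthetic data: {clf.score(X_te, y_te):.2f}")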

The questions were asked by Andreas Knoll.

MathWorks at the embedded world trade fair: Hall 4, Booth 110
