Header image for Simulink to Cloud blog post

Simulink to Cloud: Deploying Digital Twins with Docker and UDP

simulink udp digital-twin docker python

Today, I’m building a Simulink model, deploying it to a Dockerized environment, and integrating it with other components into an application that runs 24/7. Although the model is simple, the architecture is a good starting point for building hybrid digital twins for predictive maintenance and monitoring applications. I’ll show you how to transform a Simulink model into a portable binary that acts as a lightweight live digital twin within a Docker composition.

My model is designed to run continuously in the cloud, using sensor data and control setpoints streamed in externally via UDP. It processes the data and pushes the results to downstream Python services, allowing users to monitor outputs through a dashboard and control the twin by adjusting the setpoint value.
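The downstream Python services can be as small as a UDP listener. Below is a minimal sketch of such a subscriber; the port number and the one-little-endian-float64-per-datagram payload format are my assumptions for illustration, not the repository's actual values.

```python
import socket
import struct

TWIN_OUT_PORT = 26001  # assumed port; must match the model's UDP Send block


def run_subscriber(handle, *, host="0.0.0.0", port=TWIN_OUT_PORT, max_msgs=None):
    """Receive float64 samples from the twin and pass each one to `handle`.

    `handle` could append to a list, write to a database, or update a
    dashboard. `max_msgs` bounds the loop for testing; by default it runs
    forever, matching the 24/7 deployment described above.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    received = 0
    while max_msgs is None or received < max_msgs:
        payload, _addr = sock.recvfrom(8)          # one sample per datagram
        (value,) = struct.unpack("<d", payload)    # little-endian double
        handle(value)
        received += 1
    sock.close()
```

In the actual stack, `handle` would be the database write; the same loop shape works for any per-sample sink.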

This is a classic blueprint for cloud-native digital twins featuring live data ingestion, high-fidelity physical models and IT deployment. I'm sharing the full overview and source code so you can use this pattern to build your own.

Architecture diagram of Simulink Digital Twin with UDP and Docker

The publisher acts as a sensor data provider, continuously generating synthetic data and control input. The control input, which typically comes through a PLC in industrial settings, is provided manually via the dashboard. The Simulink model processes both the sensor and control inputs, then forwards the results to a subscriber, which stores data in a database. Finally, end users monitor the data through a real-time dashboard.
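As a concrete illustration, the publisher side can be sketched as a small Python loop that streams synthetic sinusoid samples over UDP. The host name, port, pacing value, and one-float64-per-datagram payload here are assumptions for the sketch, not the project's actual settings.

```python
import math
import socket
import struct
import time

SENSOR_PORT = 25000   # assumed; must match the model's UDP Receive block
PACING_RATE = 10.0    # samples per second; stand-in for the pacing setting


def publish_sensor(host="twin", port=SENSOR_PORT, rate=PACING_RATE, n_samples=None):
    """Stream a synthetic sinusoid to the twin, one float64 per datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    t = 0.0
    sent = 0
    while n_samples is None or sent < n_samples:
        value = math.sin(2 * math.pi * 0.1 * t)   # synthetic sensor reading
        sock.sendto(struct.pack("<d", value), (host, port))
        t += 1.0 / rate
        sent += 1
        time.sleep(1.0 / rate)                    # pace the stream
    sock.close()
```

The `host="twin"` default assumes the Simulink container is reachable under that service name on the Docker network; inside a compose file, service names double as hostnames.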

Simulink Model

Simulink model showing UDP Sensor and SetPressure blocks

The UDP_Sensor signal is a sinusoid streamed continuously over UDP. The UDP_SetPressure signal is the setpoint value provided through the dashboard, and the digital twin subsystem computes the sum of the two input signals.

Embedded Coder and Simulink Coder are used to generate C code from the model, which is then compiled into a binary. Once the binary is compiled, it is integrated into a Docker image. The speed at which data is streamed to the dashboard is controlled by the pacing_rate parameter defined in the project-wide configuration. The final executable, as well as its dependencies, is packaged into the image.
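A packaging step along these lines could look like the following Dockerfile sketch. The base image, paths, and port numbers are illustrative assumptions, not the repository's actual files.

```dockerfile
# Hypothetical layout: the Simulink Coder build leaves a self-contained
# executable (here ./build/twin) next to any shared-library dependencies.
FROM debian:bookworm-slim
WORKDIR /app
COPY build/twin ./twin
# UDP ports for sensor input and twin output (assumed values)
EXPOSE 25000/udp 26001/udp
CMD ["./twin"]
```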

Real-time dashboard monitoring the digital twin

Deployment into Docker Composition

To put all the containers together, I use Docker Compose, pointing it at the Docker images built in the previous steps. Running docker compose up starts the entire stack, and localhost:5000 hosts the dashboard.
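For orientation, a compose file for this stack might look roughly like the sketch below. The service names and image tags are assumptions; only the dashboard port 5000 comes from the text.

```yaml
# Illustrative compose file, not the repository's actual configuration.
services:
  publisher:
    image: example/publisher:latest    # synthetic sensor data over UDP
  twin:
    image: example/twin:latest         # compiled Simulink binary
  subscriber:
    image: example/subscriber:latest   # stores twin output in the database
    depends_on: [twin]
  dashboard:
    image: example/dashboard:latest
    ports:
      - "5000:5000"                    # dashboard at http://localhost:5000
```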

Docker Compose starting the digital twin application

Source Code

The full source code, including the MATLAB scripts, Dockerfiles, and Python services, is available on GitHub: