Your Guide to MATLAB and Python for AI - MATLAB & Simulink

Pocket Guide

MATLAB with Python for Artificial Intelligence

Explore how to integrate MATLAB with PyTorch and TensorFlow and leverage LLMs in your AI workflows.

This is your go-to guide for combining MATLAB and Python®-based models into your artificial intelligence (AI) workflows. Discover how to convert between MATLAB, PyTorch®, and TensorFlow™ using Deep Learning Toolbox.

By integrating with PyTorch and TensorFlow, MATLAB enables you to:

  •   Facilitate cross-platform and cross-team collaboration
  •   Test model performance and system integration
  •   Access MATLAB and Simulink tools for engineered system design
Diagram of conversion flow.

Convert Between MATLAB, PyTorch, and TensorFlow

With Deep Learning Toolbox and MATLAB, you can access pretrained models and design all types of deep neural networks. But not all AI practitioners work in MATLAB. To facilitate cross-platform and cross-team collaboration when designing AI-enabled systems, Deep Learning Toolbox integrates with PyTorch and TensorFlow.

Why import PyTorch and TensorFlow models into MATLAB

When you convert a PyTorch or TensorFlow model to a MATLAB network, you can use the converted network with all of MATLAB's built-in AI tools, such as functions and apps for transfer learning, explainable AI and verification, system-level simulation and testing, network compression, and automatic code generation for deployment to target hardware.

Prepare PyTorch and TensorFlow models for import

Before importing PyTorch and TensorFlow models into MATLAB, you must prepare and save the models in the correct format. You can use the following Python code to prepare your models.

The PyTorch importer expects a traced PyTorch model. After tracing the model, save it. For more information on how to trace a PyTorch model, see the PyTorch documentation on tracing.

import torch

model.eval()  # switch to inference mode before tracing
X = torch.rand(1, 3, 224, 224)  # example input with the model's expected size
traced_model = torch.jit.trace(model.forward, X)
traced_model.save("torch_model.pt")

Your TensorFlow model must be saved in the SavedModel format.

model.save("myModelTF")  # saves to the myModelTF folder in SavedModel format

How to import PyTorch and TensorFlow models

Diagram of how PyTorch and TensorFlow can be integrated into MATLAB.

You can import models from PyTorch and TensorFlow into MATLAB, converting them into MATLAB networks with just one line of code.

Use the importNetworkFromPyTorch function and specify PyTorchInputSizes with the correct input size for the specific PyTorch model. This allows the function to create an image input layer for the imported network because PyTorch models do not inherently have input layers. For more information, see Tips on Importing Models from PyTorch and TensorFlow.

net = importNetworkFromPyTorch("mnasnet1_0.pt",PyTorchInputSizes=[NaN,3,224,224])

To import a network from TensorFlow, use the importNetworkFromTensorFlow function and specify the SavedModel folder:

net = importNetworkFromTensorFlow("myModelTF")

Import PyTorch and TensorFlow models interactively

You can import models from PyTorch interactively with the Deep Network Designer app. Then, you can view, edit, and analyze the imported network from the app. You can even export the network directly to Simulink from the app. 

How to export models from MATLAB to PyTorch and TensorFlow

Diagram illustrates that models from MATLAB can be exported to PyTorch as well as ONNX and TensorFlow.

You can export and share your MATLAB networks with TensorFlow and PyTorch. Use the exportNetworkToTensorFlow function to export directly to TensorFlow, and the exportONNXNetwork function to export to PyTorch via ONNX™.

exportNetworkToTensorFlow(net,"myModel")

Try it in your browser

Import a TensorFlow model and explain its prediction.

LIME explainability result (peacock image).

Co-Execute Python-Based Models in MATLAB and Simulink

Use Python-based AI models directly within your MATLAB workflow or Simulink system to test model performance and system integration.

Comparison: Co-execution vs model conversion

First, compare co-executing a PyTorch or TensorFlow model in your MATLAB environment with converting it into a MATLAB network, so you can decide which workflow best suits your task.

 

| Capability | Co-Execution | Model Conversion |
| --- | --- | --- |
| Works for every PyTorch and TensorFlow model | Yes | No |
| Simulate model in Simulink | Yes | Yes |
| Automatically generate code | No | Yes |
| Apply explainability techniques | Only for object detection | Yes |
| Verify robustness and reliability | No | Yes |
| Use low-code AI apps | No | Yes |
| Compress network | No | Yes |

Co-execute Python-based models in MATLAB

Call PyTorch and TensorFlow models, or any Python code, directly from MATLAB. This enables you to compare Python-based models, for example, to find the model yielding the highest accuracy, as part of the AI workflow you built in MATLAB.
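As a sketch of what this looks like on the Python side, the hypothetical module below (compare_models.py, with illustrative model names and accuracy values) exposes a helper that MATLAB could call through the py. prefix to pick the model with the highest accuracy:

```python
# compare_models.py -- hypothetical helper module for MATLAB co-execution.
# From MATLAB, a function like this could be called as:
#   py.compare_models.best_model(py.dict(pyargs("resnet", 0.91, "mobilenet", 0.89)))

def best_model(accuracies):
    """Return the name of the model with the highest accuracy.

    accuracies: any mapping of model name -> accuracy score.
    """
    accuracies = dict(accuracies)  # accept any mapping passed in
    return max(accuracies, key=accuracies.get)

# Illustrative accuracies for two hypothetical candidate models
scores = {"resnet18_pytorch": 0.91, "mobilenet_tensorflow": 0.89}
print(best_model(scores))  # -> resnet18_pytorch
```

MATLAB can reach any Python module on its path through the py. prefix, so comparison logic like this can sit alongside the Python-based models it evaluates.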

Diagram of calling PyTorch and TensorFlow models, or any Python code, directly from MATLAB.

Co-execute Python-based models in Simulink

Simulate and test PyTorch, TensorFlow, ONNX, and custom Python models within systems using the Simulink co-execution blocks. This enables you to iterate on your design, assess model behavior, and test system performance.

Diagram showing how to simulate and test PyTorch, TensorFlow, ONNX, and custom Python models within systems using the Simulink co-execution blocks.

Why Simulink for AI models

By combining AI with Model-Based Design, engineers can not only test the integration of deep learning models into larger systems but also accelerate and enhance the design of complex systems for applications such as virtual sensor design.

Lane and Vehicle Detection in Simulink Using Deep Learning

Call MATLAB from Python

Diagram depicts calling MATLAB from Python.

Another option for combining MATLAB with Python in your AI workflow is to call MATLAB from your Python environment. Explore this option to prepare the data you input to a Python-based model using MATLAB domain-specific tools, or to call MATLAB AI tools for visualizing and interpreting the decisions of a Python-based AI model.

Open repository in MATLAB Online

You can work with LLMs in MATLAB Online. File Exchange and GitHub® repositories with MATLAB code have an Open in MATLAB Online button. Clicking the button opens the repository directly in MATLAB Online.


Access LLMs from MATLAB

You can access popular large language models (LLMs), such as gpt-4, llama3, and mixtral, from MATLAB through an API or by installing the models locally. Then, use your preferred model to generate text. Alternatively, you can use a pretrained BERT model, which is included in MATLAB.

LLMs for natural language processing

LLMs have revolutionized natural language processing (NLP) because they can capture complex relationships between words and nuances present in human language. Using an LLM from MATLAB is only part of the NLP pipeline (see MATLAB AI for NLP). Take advantage of tools in MATLAB to build the complete pipeline. For example, you can use Text Analytics Toolbox functions to access and prepare text data.


Natural language processing pipeline.

Repository: LLMs with MATLAB

You can connect MATLAB to the OpenAI® Chat Completions API (which powers ChatGPT™), Ollama™ (for local LLMs), and Azure® OpenAI services using the LLMs with MATLAB repository.

Access LLMs via OpenAI API

Using the code in the repository, you can interface with the OpenAI API from your MATLAB environment and use models (such as GPT-4 and GPT-4 Turbo) for various NLP tasks, including building your own chatbot and performing sentiment analysis. To use the OpenAI API, you must obtain an OpenAI API key. See OpenAI API for more details on keys and charges.
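Under the hood, a Chat Completions call is an HTTP POST of a JSON body containing a model name and a list of role/content messages; the repository's MATLAB code assembles this for you. The Python sketch below (model name and prompt are illustrative) shows the shape of that body:

```python
import json

def build_chat_request(user_prompt, model="gpt-4"):
    """Build the JSON body for an OpenAI Chat Completions call.

    The endpoint is https://api.openai.com/v1/chat/completions; a real
    call also sends your API key in the Authorization header.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
    }

body = build_chat_request("Summarize this text in one sentence.")
print(json.dumps(body, indent=2))
```

The same role/content message structure is what you manipulate when building a chatbot: each turn of the conversation is appended to the messages list before the next request.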

Illustration of a user query going through a chatbot to generate a response.

Building a chatbot.

Access LLMs via Azure OpenAI Service

To use Azure OpenAI Service, you must first deploy a model on your Azure account and obtain a key for the model. Azure OpenAI co-develops its APIs with OpenAI and adds the security capabilities of Microsoft® Azure. The LLMs with MATLAB repository provides code to connect to Azure OpenAI Service from MATLAB.

Access local LLMs via Ollama

Using the code in the repository and connecting MATLAB to a local Ollama server, you can access popular local LLMs, such as llama3, mistral, and gemma. You can use local LLMs for NLP tasks such as retrieval-augmented generation (RAG), which can improve the accuracy of an LLM's responses by grounding them in your own data.


Workflow for retrieval-augmented generation.
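To make the retrieval step in this workflow concrete, here is a toy Python sketch: it scores each document by word overlap with the query (a stand-in for the embedding similarity a real RAG system would use) and builds an augmented prompt from the best match. The document store is illustrative.

```python
def retrieve(query, documents):
    """Return the document sharing the most words with the query.

    A real RAG pipeline would rank documents by embedding similarity;
    word overlap keeps this sketch dependency-free.
    """
    query_words = set(query.lower().split())
    return max(documents, key=lambda d: len(query_words & set(d.lower().split())))

# Illustrative document store
docs = [
    "Deep Learning Toolbox converts PyTorch and TensorFlow models.",
    "Ollama serves local LLMs such as llama3, mistral, and gemma.",
    "Simulink co-execution blocks run Python models inside a system.",
]

question = "Which local LLMs can Ollama serve?"
context = retrieve(question, docs)

# The retrieved text is prepended to the prompt sent to the LLM
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(context)
```

Sending the augmented prompt, rather than the bare question, is what lets the LLM answer from your own data instead of only its training data.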

Try it in your browser

Create a simple ChatBot with MATLAB and OpenAI API.



Tutorial

Deep Learning Onramp

Learn the basics of deep learning for image classification problems in MATLAB.