If you train your deep learning network in MATLAB, you can use OpenVINO to accelerate your solution on Intel®-based accelerators (CPUs, GPUs, FPGAs, and VPUs). However, this script does not compare OpenVINO with MATLAB's deployment options (MATLAB Coder, HDL Coder); instead, it only gives you a rough idea of how to complete the MATLAB-to-OpenVINO workflow from a technical perspective.
Refer to the links below to understand OpenVINO:
Deep Learning and Prediction
How to export deep learning model to ONNX format
How to deploy a simple classification application in OpenVINO R4 (third-party software)
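As a sketch of the export step above: Deep Learning Toolbox provides exportONNXNetwork for saving a trained network in ONNX format (the network and file name below are illustrative, and the function requires the "Deep Learning Toolbox Converter for ONNX Model Format" support package):

```matlab
% Sketch of the MATLAB-side export step (network and file name are
% illustrative; requires the ONNX converter support package).
net = squeezenet;                      % any trained SeriesNetwork/DAGNetwork
exportONNXNetwork(net, 'mynet.onnx');  % writes the network in ONNX format
```

The resulting .onnx file is what the OpenVINO Model Optimizer consumes in the next step.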
Product Focus:
Deep Learning Toolbox
OpenVINO R4 (third-party software)
Written on 28 January 2018
Kevin Chng (2019). MATLAB to OpenVINO (Intel-Inteference) (https://www.mathworks.com/matlabcentral/fileexchange/70330-matlab-to-openvino-intel-inteference), MATLAB Central File Exchange. Retrieved .
When I used the "openvino_2019.1.148" version with Python 3.7 and numpy 1.16, the previous steps worked fine, but when I reached the step "Convert the ONNX file with the Model Optimizer (ONNX to xml and bin files)", I got this error:
File "mo.py", line 28, in <module>
from mo.main import main
File "C:\Program Files (x86)\IntelSWTools\openvino_2019.1.148\deployment_tools\model_optimizer\mo\main.py", line 16, in <module>
import numpy as np
File "D:\anaconda3\lib\site-packages\numpy\__init__.py", line 140, in <module>
from . import _distributor_init
File "D:\anaconda3\lib\site-packages\numpy\_distributor_init.py", line 34, in <module>
from . import _mklinit
ImportError: DLL load failed: The specified module could not be found.
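For context, the Model Optimizer step that triggers this error is invoked roughly as follows (a sketch only; the install path matches the traceback above, while the model and output paths are illustrative):

```shell
:: Hypothetical invocation on Windows; adjust paths to your own install.
cd "C:\Program Files (x86)\IntelSWTools\openvino_2019.1.148\deployment_tools\model_optimizer"
python mo.py --input_model C:\work\mynet.onnx --output_dir C:\work\ir
```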
I googled it but could not find an answer. My final solution was to lower the numpy version: uninstall numpy and reinstall numpy 1.14.5, and it works!
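A minimal pre-flight check (a sketch, not part of OpenVINO) can surface this numpy/MKL problem before launching mo.py, with a hint at the downgrade that fixed it:

```python
# Sketch: verify numpy imports cleanly before running the Model Optimizer.
# The ImportError above ("DLL load failed") happens at "import numpy";
# if the import fails here, suggest the numpy==1.14.5 downgrade.
import sys

try:
    import numpy as np
except ImportError as err:
    sys.exit("numpy import failed (%s); try: pip install numpy==1.14.5" % err)

print("numpy %s imported OK" % np.__version__)
```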