ONNX export yields error in Windows ML

Gabriel Mittag
Gabriel Mittag on 18 Apr 2019
Answered: Ting Su on 24 May 2019
I want to export my deep neural network with the ONNX export function and use it with Windows ML. To test it, I use WinMLRunner, which simply checks an .onnx file. This seems to work for CNN networks; however, for LSTM networks I receive the error: "First input does not have rank 2". If the network contains more than one LSTM layer, the error message changes to: "First input tensor must have rank 3".
I attached a file with the following simple LSTM network and another file with the corresponding ONNX export in opset version 7 (Windows ML needs opset v7 or v8):
layers = [ ...
    sequenceInputLayer(14)
    lstmLayer(100, 'OutputMode', 'last')
    fullyConnectedLayer(1)
    regressionLayer];
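(For context, the variable lstmNet used in the export call below is assumed to come from training these layers. The following is a minimal sketch on dummy data, not the attached file; the data sizes and variable names are placeholders.)
% Minimal sketch, not the attached file: train the layers on dummy
% sequence data so that a network object (lstmNet) exists for export.
% Data sizes and values here are placeholders.
numObs = 50;
XTrain = cell(numObs, 1);
for k = 1:numObs
    XTrain{k} = rand(14, 20);   % one sequence: 14 features x 20 time steps
end
YTrain = rand(numObs, 1);       % one regression response per sequence
options = trainingOptions('adam', 'MaxEpochs', 5, 'Verbose', false);
lstmNet = trainNetwork(XTrain, YTrain, layers, options);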
I exported the network as follows:
exportONNXNetwork(lstmNet, 'lstm_net_onset_v7.onnx', 'OpsetVersion', 7)
Then tested it with WinMLRunner:
WinMLRunner.exe -model C:\lstm_net_onset_v7.onnx -cpu
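For reference, a quick local sanity check is to re-import the exported file into MATLAB. The sketch below assumes the same ONNX support package's importONNXNetwork function is available; whether it accepts this LSTM model in this release is an assumption, not something I have verified.
% Sketch only: round-trip the exported file back into MATLAB.
% Assumes importONNXNetwork (same ONNX support package) is installed;
% whether it handles this LSTM model in R2019a is not guaranteed.
importedNet = importONNXNetwork('lstm_net_onset_v7.onnx', ...
    'OutputLayerType', 'regression');
analyzeNetwork(importedNet)   % inspect layers and input sizes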
Thanks for any suggestions on how to solve this. I have already posted this problem on the File Exchange page of the ONNX converter.

Accepted Answer

Ting Su
Ting Su on 24 May 2019
Hi Gabriel,
A new release of the ONNX export function is coming soon, with improved LSTM support.

More Answers (0)

Version

R2019a
