ONNX export yields error in Windows ML
Gabriel Mittag on 18 Apr 2019
I want to export my deep neural network with the ONNX export function and use it with Windows ML. To test it, I use WinMLRunner, which simply checks an .onnx file. This seems to work with CNN networks; however, for LSTM networks I receive the error: "First input does not have rank 2". If I have more than one LSTM layer in the network, the error message somehow changes to: "First input tensor must have rank 3".
I attached a file with the following simple LSTM network and another file with the corresponding ONNX export in opset version 7 (Windows ML needs opset v7 or v8):
layers = [ ...
    sequenceInputLayer(14)
    lstmLayer(100, 'OutputMode', 'last')
    fullyConnectedLayer(1)
    regressionLayer];
I exported the network as follows:
exportONNXNetwork(lstmNet, 'lstm_net_onset_v7.onnx', 'OpsetVersion', 7) 
Then tested it with WinMLRunner:
WinMLRunner.exe -model C:\lstm_net_onset_v7.onnx -cpu
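For completeness, a minimal self-contained reproduction could look like the sketch below; the dummy training data and the training options are placeholders, not my actual setup:
% Dummy sequence data: 50 random sequences, 14 features per time step, scalar targets
numObs = 50;
XTrain = cell(numObs, 1);
for i = 1:numObs
    XTrain{i} = rand(14, 20);   % 14 features x 20 time steps
end
YTrain = rand(numObs, 1);
layers = [ ...
    sequenceInputLayer(14)
    lstmLayer(100, 'OutputMode', 'last')
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('adam', 'MaxEpochs', 5, 'Verbose', false);
lstmNet = trainNetwork(XTrain, YTrain, layers, options);
% Export with opset 7, as above
exportONNXNetwork(lstmNet, 'lstm_net_onset_v7.onnx', 'OpsetVersion', 7)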
Thanks for any suggestions on how to solve this. I already posted this problem on the File Exchange page of the ONNX converter.
Accepted Answer
Ting Su on 24 May 2019
Hi Gabriel,
A new release of the ONNX export function will be available soon. LSTM support will be better in the new release.
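Until the updated converter is available, a rough way to sanity-check an exported file from MATLAB itself is to round-trip it through importONNXNetwork, provided the importer in your release supports the operators in the file. This is only a sketch, it does not exercise Windows ML, and the output-layer choice below is an assumption for this regression network:
% Re-import the exported file and inspect the resulting layers and input sizes
importedNet = importONNXNetwork('lstm_net_onset_v7.onnx', ...
    'OutputLayerType', 'regression');
analyzeNetwork(importedNet)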