Dealing with large training datasets saved in a number of .mat files
Ruohao Zhang
on 10 Aug 2020
Commented: Ruohao Zhang
on 21 Aug 2020
Hello all,
I have run into a problem where I need to train an LSTM signal classifier on a huge amount of data.
Each 1D signal is around 100k samples, and every 48 signals are saved in one .mat file. There are around 2,000 .mat files in total.
The labels are saved in corresponding .mat files in a separate folder.
Is there a way to train the network without loading the whole dataset into memory? (With 64 GB of RAM I can only load ~1,300 files at once.)
Your help will be very much appreciated.
Accepted Answer
Divya Gaddipati
on 13 Aug 2020
trainData = fileDatastore('/path/to/data', 'ReadFcn', @load, 'FileExtensions', '.mat');
You can either use "load" or your own custom function that defines how the data should be loaded.
You can also refer to this link for more information on training an LSTM while loading data with fileDatastore.
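A minimal sketch of the custom-ReadFcn approach: two fileDatastores (one for signals, one for labels) combined so each read yields one file's worth of predictors and responses, which trainNetwork can consume on demand. The variable names "signals" and "labels" inside the .mat files are assumptions; replace them with the names used in your own files.

```matlab
% Assumed layout: each data file holds "signals" (a 48x1 cell of 1D
% sequences) and each label file holds "labels" (a 48x1 categorical).
sigDs = fileDatastore('/path/to/data', ...
    'ReadFcn', @readSignals, 'FileExtensions', '.mat');
lblDs = fileDatastore('/path/to/labels', ...
    'ReadFcn', @readLabels, 'FileExtensions', '.mat');
ds = combine(sigDs, lblDs);   % each read returns {signals, labels}

% net = trainNetwork(ds, layers, options);  % files are read on demand

function s = readSignals(filename)
    data = load(filename, 'signals');  % variable name is an assumption
    s = data.signals;
end

function l = readLabels(filename)
    data = load(filename, 'labels');   % variable name is an assumption
    l = data.labels;
end
```

Because only one file pair is held in memory per read, memory use stays roughly constant regardless of how many .mat files the dataset contains.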
More Answers (1)
Frantz Bouchereau
on 20 Aug 2020
Edited: Frantz Bouchereau
on 20 Aug 2020
Ruohao,
You can use two signalDatastores: one to read your signal files and another to read your labels. You can then combine them using combine(), split the combinedDatastore into training and test sets using subset(), and then feed the combined datastores into the training function of the LSTM network.
With signalDatastore you do not need to write a load function. You specify the variable names you want read from the .mat file, and those are returned on every read.
HTH
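The steps above can be sketched as follows. The signal variable names "sig" and "lbl" and the 80/20 split are assumptions for illustration; substitute the variable names actually stored in your .mat files.

```matlab
% One signalDatastore per folder; SignalVariableNames tells each
% datastore which variable to pull out of every .mat file.
sds = signalDatastore('/path/to/data',   'SignalVariableNames', 'sig');
lds = signalDatastore('/path/to/labels', 'SignalVariableNames', 'lbl');

% Pair each signal with its label so one read yields {signal, label}.
cds = combine(sds, lds);

% Split the combined datastore into training and test sets by file
% index (80/20 split here, chosen arbitrarily).
n   = numel(sds.Files);
idx = randperm(n);
trainDs = subset(cds, idx(1:round(0.8*n)));
testDs  = subset(cds, idx(round(0.8*n)+1:end));

% net = trainNetwork(trainDs, layers, options);
```

Since the datastores read files lazily, this keeps memory use bounded by the size of a single file pair rather than the full 2,000-file dataset.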