Dealing with Differences in Classification Results when Using Different Mini-Batch Sizes in an LSTM-based Hybrid GoogLeNet Architecture
2 views (last 30 days)
I am currently working on a complex neural network architecture that combines a hybrid GoogLeNet with an LSTM layer. My goal is to train this model on a large dataset of over 4 million images. During training, I have found that a larger mini-batch size significantly improves training speed and data coverage.
However, I have encountered an issue during the testing and real-time classification phases. In these stages, I want to classify individual samples that represent the latest state of the FOREX markets (not a full mini-batch of 1124 samples, as used during training). To achieve this, I need to classify each sample separately rather than in a mini-batch. Surprisingly, I have observed substantial differences in the classification results compared to the training phase.
Upon investigating this matter, I learned that varying the mini-batch size at prediction time can change the classification outcomes. The reason is that an LSTM requires uniform sequence lengths within a mini-batch, so padding is applied to equalize the sequence sizes. The amount of padding therefore depends on how the mini-batch is composed, which leads to discrepancies in the classification results.
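The padding effect described above can be seen, and largely avoided, through the name-value options of `classify` in Deep Learning Toolbox. The sketch below assumes `net` is your trained network and `XTest` is a cell array of sequences (features-by-timesteps); both names are placeholders for your own variables:

```matlab
% Batched inference: within each mini-batch of 1124, sequences are
% padded up to the longest sequence in that batch, so the padding a
% given sample receives depends on which samples it is batched with.
YBatched = classify(net, XTest, ...
    'MiniBatchSize', 1124, ...
    'SequenceLength', 'longest');

% Single-sample inference: with MiniBatchSize = 1 there is nothing to
% pad against, so each sample is classified on its own, matching the
% real-time setting.
YSingle = classify(net, XTest, 'MiniBatchSize', 1);

% If batched inference must be kept for throughput, left padding keeps
% the padded timesteps away from the final timestep that drives the
% LSTM's classification output, reducing the discrepancy:
YLeftPad = classify(net, XTest, ...
    'MiniBatchSize', 1124, ...
    'SequencePaddingDirection', 'left', ...
    'SequencePaddingValue', 0);
```

Comparing `YBatched` against `YSingle` on the same `XTest` should reproduce the discrepancy you observed; using the same padding settings at training and inference is what removes it.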
While I understand that keeping the mini-batch size consistent between training and testing is generally recommended, my requirements necessitate classifying individual samples in real time. I would greatly appreciate any expert guidance on how to address this situation effectively, given the characteristics of my network architecture and dataset.
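For the real-time requirement above, one common pattern is simply to run inference with a mini-batch size of 1, which sidesteps padding entirely regardless of the training batch size. A minimal sketch of such a loop, where `getLatestForexSample` is a hypothetical placeholder for your market data feed:

```matlab
% Real-time classification loop: each new sample is classified on its
% own with MiniBatchSize = 1, so no padding is applied and the result
% does not depend on the training mini-batch size of 1124.
while true
    X = getLatestForexSample();   % hypothetical feed: features x timesteps
    [label, scores] = classify(net, {X}, 'MiniBatchSize', 1);
    fprintf('Predicted: %s (score %.3f)\n', string(label), max(scores));
end
```

On the training side, sorting the training sequences by length before batching (or setting `'SequenceLength'` to `'shortest'` in `trainingOptions`) minimizes the padding the network sees during training, which further reduces the train/inference mismatch.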
Thank you for your time and support.
0 comments
Answers (0)