Do more hidden units in an LSTM layer mean the network requires more training time?
24 views (last 30 days)
debojit sharma
on 22 Feb 2023
Answered: Himanshu
on 3 Mar 2023
I have the following questions about the number of hidden units in an LSTM layer:
Does a larger number of hidden units in the LSTM layer mean that the network requires more training time?
That is, how does the number of hidden units in the LSTM layer affect the network's training time and computational complexity?
Does a larger number of hidden units help the LSTM network remember more of the previous data?
Accepted Answer
Himanshu
on 3 Mar 2023
Hello Debojit,
I understand that you have some queries regarding the hidden units in the LSTM layer.
The training time of the network depends on various factors, like the number of layers used in the network architecture, the complexity of the network architecture, the size of the dataset, etc.
Increasing the number of hidden units in an LSTM layer can increase the network's training time and computational complexity as the number of computations required to update and propagate information through the layer increases.
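To see why the cost grows faster than linearly, consider the parameter count of a standard LSTM layer: each of the four gates (input, forget, cell candidate, output) has an input weight matrix, a recurrent weight matrix, and a bias, so the recurrent term scales with the square of the number of hidden units. A minimal sketch of this arithmetic (illustrative, not tied to any particular toolbox):

```python
def lstm_param_count(input_size, hidden_units):
    """Learnable parameters in one standard LSTM layer.

    Each of the 4 gates has an input weight matrix
    (hidden_units x input_size), a recurrent weight matrix
    (hidden_units x hidden_units), and a bias (hidden_units).
    """
    return 4 * (hidden_units * input_size + hidden_units ** 2 + hidden_units)

# The recurrent hidden_units**2 term dominates, so doubling the hidden
# units roughly quadruples the parameters (and the per-step compute):
for h in (64, 128, 256):
    print(h, lstm_param_count(10, h))
```

Since the number of multiply-adds per time step is proportional to the parameter count, this is a reasonable first-order estimate of how training cost scales with the hidden-unit count.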
Increasing the number of hidden units also increases the network's capacity to store and learn from past data. However, there is a trade-off between capacity and generalization performance: a larger network may remember more about past data, but it is also more prone to overfitting, which can hurt its performance on unseen data.
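The capacity/overfitting trade-off is not specific to LSTMs. A minimal sketch (using polynomial regression with NumPy as a stand-in for "small model vs. large model"): the high-capacity model fits the training data almost perfectly, yet its gap between training and validation error is much larger.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of an underlying linear relationship y = 2x.
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.2, x_train.size)
x_val = np.linspace(0.05, 0.95, 10)
y_val = 2 * x_val + rng.normal(0, 0.2, x_val.size)

def mse(coeffs, x, y):
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

small = np.polyfit(x_train, y_train, 1)  # low capacity
large = np.polyfit(x_train, y_train, 9)  # enough capacity to memorize

# High capacity: near-zero training error (it memorizes the noise)...
print(mse(small, x_train, y_train), mse(large, x_train, y_train))
# ...but a larger train/validation gap, i.e. worse generalization.
print(mse(small, x_val, y_val), mse(large, x_val, y_val))
```

The same reasoning applies to an oversized LSTM: more hidden units let it memorize the training sequences, including their noise, which is why validation-based regularization (early stopping, dropout) is usually paired with larger networks.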
You can refer to the Deep Learning Toolbox documentation to learn more about LSTM networks.
More Answers (0)