CNN Deep Learning: Data Size vs. Iterations per Epoch

12 views (last 30 days)
Gobert on 17 Sep 2020
Commented: Gobert on 21 Sep 2020
I need your help to understand why the training data size affects the number of iterations per epoch. See figures A and B below.
[Figures A and B: training-progress plots]
In figure A, the training data size is 3700 images; in figure B, the training data size is 57000 images. I did not change any settings in my CNN network, and in both cases the input images had the same size. Can you please explain why increasing the data size has increased the number of iterations per epoch? In other words, what is the relationship between the data size and the number of iterations per epoch?
  2 Comments
Ritu Panda on 21 Sep 2020
The number of iterations per epoch depends on the number of training samples that the model is trained on in each epoch.
In each epoch, your training data is divided into mini-batches (specified by the MiniBatchSize parameter in the trainingOptions argument). The model trains on every mini-batch and updates the weight parameters.
Hence, iterations per epoch = number of training samples ÷ MiniBatchSize,
i.e., the number of forward and backward passes performed within one epoch while training the network.
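For illustration, here is a minimal MATLAB sketch of that relationship. The dataset sizes come from the question; the MiniBatchSize of 128 is an assumed example value, and the result is rounded down because, to my understanding, trainNetwork discards observations that do not fill a complete mini-batch:

% Minimal sketch, assuming MiniBatchSize = 128 purely for illustration.
miniBatchSize = 128;

% One iteration processes one mini-batch, so iterations per epoch is the
% number of complete mini-batches that fit into the training set.
iterationsSmall = floor(3700  / miniBatchSize)   % dataset from figure A
iterationsLarge = floor(57000 / miniBatchSize)   % dataset from figure B

% MiniBatchSize itself is set in the training options, for example:
options = trainingOptions('sgdm', ...
    'MiniBatchSize', miniBatchSize, ...
    'MaxEpochs', 10);

With the same MiniBatchSize, the larger training set simply contains many more mini-batches, so more iterations are needed to pass over the data once in each epoch.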
Gobert on 21 Sep 2020
I agree with you, @Ritu Panda


Answers (0)
