In the R2024a version of MATLAB, I can't see my training and validation accuracy in the training progress plot.
ZEBA on 4 Apr 2024
Commented: Muharrem Tumcakir on 11 Jun 2024
I am using the MATLAB R2024a version. Some functions and features are a little bit different in this version. I am experimenting with a CNN architecture built from scratch to classify images. In earlier versions of MATLAB, the final validation accuracy could be seen in the training progress plot, but in this version I can't see it. I have checked the MATLAB documentation but could not find any solution.
Also, I cannot use the 'classify' function here, and I am confused about the 'forward' function in this version.
0 Comments
Accepted Answer
cui,xingxing on 5 Apr 2024
Edited: cui,xingxing on 4 May 2024
Hi @ZEBA,
As far as I can guess, you are using trainnet to train the network, which is indeed slightly different from the previous version. However, this doesn't affect your ability to see the final training and validation accuracy; you should check whether "ValidationData" in your "trainingOptions" contains your validation data.
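For reference, here is a minimal sketch of what such a setup could look like. The names imdsTrain, imdsVal, and layers are placeholders for your own datastores and network, and if I understand the R2024a options correctly, the Metrics option is what adds accuracy curves to the training progress plot:

```matlab
% Minimal sketch with placeholder names (imdsTrain, imdsVal, layers).
options = trainingOptions("adam", ...
    MaxEpochs=10, ...
    MiniBatchSize=128, ...
    ValidationData=imdsVal, ...       % needed to get validation curves
    Metrics="accuracy", ...           % adds accuracy to the progress plot
    Plots="training-progress", ...
    Verbose=false);

% trainnet returns a dlnetwork object
net = trainnet(imdsTrain, layers, "crossentropy", options);
```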
Since R2024a, it is recommended to use trainnet instead of trainNetwork; trainnet is largely compatible with the old workflow and offers more advantages:
- trainnet supports dlnetwork objects, which support a wider range of network architectures that you can create or import from external platforms. For example, you can create and train neural networks with multiple inputs and multiple outputs.
- trainnet enables you to easily specify loss functions. You can select from built-in loss functions or specify a custom loss function (see the sketch after this list).
- trainnet outputs a dlnetwork object, which is a unified data type that supports network building, prediction, built-in training, visualization, compression, verification, Simulink, code generation, and custom training loops.
- trainnet is typically faster than trainNetwork.
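As a sketch of the loss-function point above (not part of the original answer), the loss can be passed either as a built-in name or as a function handle of the predictions Y and targets T:

```matlab
% Built-in loss selected by name:
net = trainnet(imdsTrain, layers, "crossentropy", options);

% Hypothetical custom loss as a function handle (here it simply wraps the
% built-in crossentropy, but any differentiable dlarray expression works):
lossFcn = @(Y,T) crossentropy(Y, T);
net = trainnet(imdsTrain, layers, lossFcn, options);
```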
BTW, you can't use the "classify" function on dlnetwork objects, because it only works with the older SeriesNetwork and DAGNetwork objects.
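Assuming net is your trained dlnetwork, imdsTest a test datastore, and classNames your class labels (placeholder names), classification would look roughly like this; use predict/minibatchpredict for inference, and forward only inside custom training loops, where it enables training-mode behavior such as dropout:

```matlab
% Minimal sketch with placeholder names (net, imdsTest, imdsTrain).
classNames = categories(imdsTrain.Labels);   % however you obtain your class names

scores = minibatchpredict(net, imdsTest);    % inference with a dlnetwork
YPred  = scores2label(scores, classNames);   % convert scores to class labels
```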
-------------------------Off-topic interlude-------------------------------
I am currently looking for a job in the field of CV algorithm development, based in Shenzhen, Guangdong, China. I would be very grateful if anyone is willing to offer me a job or make a recommendation. My preliminary resume can be found at: https://cuixing158.github.io/about/ . Thank you!
Email: cuixingxing150@gmail.com
6 Comments
Muharrem Tumcakir on 11 Jun 2024
Hello Cui,
Thanks for the answer. I have a related issue with the R2024a "trainnet" progress window:
I have built a dlnetwork with approx. 150k learnables and a custom sequence datastore using the documentation and code found in MATLAB Help, and created approx. 3 million sequences (observations) in 2 classes (in 2 folders) for training and another 1.5 million sequences for validation.
I specified these in the training options with a mini-batch size of 150 sequences.
I have 60 GB of RAM and started my training with the "cpu" option because I didn't manage to run it in parallel, since the custom datastore does not support partitioning/parallel operations.
My code fills approx. 55 of 60 GB of RAM at peak levels right now, and I think it is proceeding. My issue starts here: the "training-progress" window and "verbose" output (even though I set the verbose frequency to only 10 iterations) are not shown during my "trainnet" runs unless I hit Ctrl+C or the "Stop" button in the debugger window. After that, the "training-progress" window and verbose output start flowing until the training-stop procedure finishes.
Both my training-progress window and verbose output seem to be suppressed. Why do you think this issue appears? Is it due to busy memory or something else? Is there any way to see this information at runtime in my heavy memory/CPU case? By the way, my CPU load is 65% on average, unlike the RAM.
Thanks in advance,
More Answers (0)