Neural network classification improvements
16 views (last 30 days)
hi all,
I have trained a neural network using fitcnet. The data is a simple classification of hole numbering, as per the simplified attached picture: essentially, for each hole, the x-y centre point, the angle and distance relative to the centre of the whole blade, the blade's major axis, and finally the hole number, a value from 1 to 319.
The problem I have found is two-fold:
1. I think it may be over-fitted: it works well on the synthetic data but performs worse on real data.
2. It routinely identifies multiple holes as the same number, e.g. there would be four hole number 3s. In the real data a single hole can sometimes split into a pair of much smaller holes, which stay roughly within the bounds of the original hole.
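One way to stop several holes receiving the same number is to treat labelling the whole blade as a one-to-one assignment over the classifier's posterior scores, rather than taking the top class per hole independently. A minimal sketch, assuming a trained model `mdl` and a feature matrix `Xnew` with one row per detected hole (both hypothetical names), using MATLAB's `matchpairs`:

```matlab
% Hypothetical: mdl is the trained fitcnet model, Xnew holds one row of
% features per hole detected on a single blade.
[~, scores] = predict(mdl, Xnew);   % scores: nHoles x 319 posteriors

% Convert posteriors to costs and solve the linear assignment so that
% each hole number is used at most once across the blade.
cost = 1 - scores;                  % low cost = high confidence
assignments = matchpairs(cost, 1);  % rows: [holeIndex, holeNumber]
holeNumbers = assignments(:, 2);    % one unique class per hole
```

With costs in [0, 1] and an unmatched cost of 1, every hole is matched as long as there are fewer detected holes than classes; the point of the design is that a second hole competing for number 3 is pushed to its next-best class instead of duplicating the label.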
The training data was a single row of information per hole, listing the above features together with its class. Each time I then ask it to predict the classification of a new hole it is effectively a clean sheet, with no knowledge of already-predicted holes.
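For the over-fitting side, fitcnet itself exposes a ridge penalty and early stopping against held-out data, so a small labelled set of real holes can act as the validation set while training on the synthetic data. A hedged sketch with hypothetical variable names (`Xtrain`/`Ytrain` synthetic, `Xreal`/`Yreal` real) and illustrative parameter values:

```matlab
% Hypothetical: Xtrain/Ytrain are the synthetic training rows,
% Xreal/Yreal a small labelled set of real holes held out for validation.
mdl = fitcnet(Xtrain, Ytrain, ...
    'Standardize', true, ...             % scale features before training
    'Lambda', 1e-3, ...                  % ridge penalty to curb overfitting
    'LayerSizes', [25 25], ...           % a smaller net generalises better
    'ValidationData', {Xreal, Yreal}, ...
    'ValidationPatience', 10);           % stop when real-data loss stalls

trainLoss = loss(mdl, Xtrain, Ytrain);
realLoss  = loss(mdl, Xreal, Yreal);     % a large gap indicates overfitting
```

Comparing the two losses gives a concrete measure of the synthetic-to-real gap; sweeping `Lambda` (e.g. over `10.^(-5:0)`) is a common way to pick the penalty.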
Does anyone have any suggestions for possible improvements?
Many thanks, Tim

3 Comments
Rik
on 4 Sep 2025
If these are holes you're tracking over time, would it be safe to assume they will not move much? That would mean you can crop the image to a small patch surrounding the original/previous location of the hole and then detect which hole is closest to the centre. That way you can keep the identification, while still exploiting the fact that it is an emergent property.
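The idea above amounts to nearest-neighbour matching against the previously known hole positions. A minimal sketch, assuming `prevXY` holds the known centres from the last inspection, `newXY` the centres detected now, and `maxDrift` an assumed upper bound on how far a hole can move (all hypothetical names):

```matlab
% Hypothetical: prevXY is 319x2 (known hole centres from the previous
% inspection), newXY is Mx2 (centres detected now, possibly split holes).
D = pdist2(newXY, prevXY);         % M x 319 pairwise distances
[dist, holeID] = min(D, [], 2);    % nearest previous hole per detection
tooFar = dist > maxDrift;          % detections outside the expected
holeID(tooFar) = NaN;              %   drift radius are left unidentified
```

Note this naturally handles the split-hole case, since both fragments of a split hole map back to the same previous hole ID, and you can decide downstream whether to merge or flag them.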
But with the rest of your question it is still not clear to me how this is a question I can help with. I would strongly suggest playing around with a chatbot first and then post a more refined question on a more specialized forum.
Answers (0)