I have been gradually increasing the number of classes in my dataset, from 16 to 27 to 36 to 40.
Since increasing the number of classes to 36 or 40, I have noticed that some classes start learning very late or (seemingly) not at all. Looking at the last layer, I see that the final convolution not only upsamples the patch but also increases the feature channels from 32 to 40. Is this why multiple classes learn very late or not at all? It is as if the 40 classes are predicted from a linear combination of 32 basis vectors. Should I increase the channels to …
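To make the "linear combination of 32 basis vectors" concern concrete, here is a minimal NumPy sketch (all shapes are illustrative, not the actual network): if the final layer maps 32 decoder features to 40 logits via a weight matrix `W` of shape `(40, 32)`, then the per-voxel logit vectors are confined to a subspace of dimension at most 32.

```python
import numpy as np

# Assumed setup: at each voxel the final layer computes 40 class
# logits as a linear map of 32 decoder features, logits = W @ f.
rng = np.random.default_rng(0)
W = rng.standard_normal((40, 32))        # final-layer weights (40 classes, 32 features)
feats = rng.standard_normal((32, 1000))  # 32 features at 1000 sampled voxels
logits = W @ feats                       # shape (40, 1000)

# The rank of the logit matrix is capped by the 32 feature channels,
# even though there are 40 output classes.
rank = np.linalg.matrix_rank(logits)
print(rank)  # at most 32
```

Note that a rank cap on the logits does not by itself prevent 40 classes from being separated (argmax regions can still carve out 40 distinct cells); it just means the logits are not free to vary independently.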
Replies: 1 comment 1 reply
Hi @dyollb,
The ability of a U-Net (or any neural network) to distinguish between different classes is not necessarily limited by the number of output channels or strides. Rather, it is often determined by the complexity and variability of the data, as well as the capacity of the network itself (depth, width, etc.).
Increasing the number of classes naturally makes the learning task more challenging, as the model needs to learn to distinguish between a larger number of groups. Your observation that some classes start learning very late or not at all is aligned with the intuition that some classes may be more difficult to learn than others. There could be several reasons for this phenomenon. …
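One reason commonly behind late-learning classes (an assumption here, since the list above is truncated) is class imbalance: with 40 labels, rare structures can occupy a tiny fraction of the voxels and receive little gradient signal. A cheap first diagnostic is to check label frequencies in the training set; the sketch below uses made-up data and shapes.

```python
import numpy as np

# Hypothetical label volume: class 0 ("background") dominates,
# the remaining 39 classes are equally rare.
rng = np.random.default_rng(0)
p = np.ones(40)
p[0] = 400.0            # pretend background is ~400x more frequent
p /= p.sum()
labels = rng.choice(40, size=200_000, p=p)

counts = np.bincount(labels, minlength=40)
freq = counts / counts.sum()
print(freq.max(), freq.min())  # dominant vs. rarest class frequency
```

If the frequencies are this skewed, per-class weighting or a Dice-style loss is the usual first thing to try before changing the architecture.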