Overfitting and underfitting are usually presented as a function of training progress (not enough training → underfit, too much training → overfit) or of model complexity (not enough complexity → underfit, too much complexity → overfit). Images like this seem to suggest that underfitting and overfitting can't happen at the same time, but in theory, shouldn't it be possible for a model to be both at once?
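To make the question concrete, here's a rough sketch (assuming scikit-learn/NumPy and a made-up noisy sine dataset) of how I'd measure the two symptoms separately: high training error as the underfitting signal, and a large test-minus-train gap as the overfitting signal. Nothing about those two numbers seems to force them to be mutually exclusive:

```python
# Hypothetical setup: noisy sine curve, few training points, polynomial regression.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Small, noisy training sample drawn from a sine curve.
X_train = rng.uniform(0, 6, size=(15, 1))
y_train = np.sin(X_train).ravel() + rng.normal(0, 0.3, size=15)
X_test = np.linspace(0, 6, 200).reshape(-1, 1)
y_test = np.sin(X_test).ravel()

for degree in (1, 3, 12):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_mse = mean_squared_error(y_train, model.predict(X_train))
    test_mse = mean_squared_error(y_test, model.predict(X_test))
    # High train MSE -> underfitting symptom; large (test - train) gap -> overfitting symptom.
    print(f"degree={degree:2d}  train MSE={train_mse:.3f}  test MSE={test_mse:.3f}")
```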
Wouldn't that just be collapse?