Overfitting and underfitting are usually presented as a function of training progress (too little training gives underfitting, too much gives overfitting) or as a function of model complexity (too little complexity gives underfitting, too much gives overfitting). Diagrams like this image seem to suggest that underfitting and overfitting can't happen at the same time, but in theory, shouldn't it be possible for a model to be both at once?
That's a good point, and yes, it can happen, though probably only with certain types of datasets (like ones prone to model bias). In bias-variance terms, nothing stops a model from having high bias and high variance simultaneously: it can miss the true pattern (underfitting) while also fitting noise in the training data (overfitting).
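Here's a minimal sketch of what that can look like in practice. The setup is my own illustration, not something from the question: the sine target, the 40 pure-noise columns, and the sample sizes are all assumptions. A plain linear model can't follow the sine, so its training error stays well above the noise floor (underfitting), while the noise columns get fit to the small training set, so the test error ends up several times the training error (overfitting), both at the same time:

```python
# Illustrative sketch (assumed setup): linear regression on sine data padded
# with irrelevant noise features, trained on few samples.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.uniform(0, 2 * np.pi, size=(n, 1))
    noise_features = rng.normal(size=(n, 40))   # irrelevant columns
    X = np.hstack([x, noise_features])
    y = np.sin(x[:, 0]) + rng.normal(scale=0.1, size=n)  # noise floor MSE ~ 0.01
    return X, y

X_train, y_train = make_data(100)   # few samples relative to 41 features
X_test, y_test = make_data(2000)

model = LinearRegression().fit(X_train, y_train)

train_mse = mean_squared_error(y_train, model.predict(X_train))
test_mse = mean_squared_error(y_test, model.predict(X_test))

# Expected pattern (exact numbers vary with the seed):
# - train MSE well above the ~0.01 noise floor -> the sine is missed (underfit)
# - test MSE several times the train MSE       -> noise was memorized (overfit)
print(f"train MSE: {train_mse:.3f}")
print(f"test MSE:  {test_mse:.3f}")
```

Regularizing or dropping the noise features would shrink the train/test gap, but the gap between training error and the noise floor only closes if the model also gets a more flexible functional form, which is why the two problems have to be addressed separately.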