Underfitting and Overfitting

Underfitting:

  • Underfitting is the opposite of overfitting: the model performs poorly on the training data as well as on the test data, because it has not understood the underlying pattern. This is similar to a student who sits an exam with too little preparation.
  • To remove underfitting, give the model more capacity or more informative features so that it can actually learn the pattern; simply adding more of the same data will not help a model that is too simple.
  • High bias (and typically low variance)
    1. High bias: the error on the training data is high.
    2. Because the model is too simple, the error on the test data is also high; this comes from the bias, not from variance.
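The high-bias picture can be made concrete with a toy example (not from the original post): fitting a straight line to clearly curved data. This is a minimal NumPy-only sketch in which the underfit model has a high error on the training data and on held-out data alike.

```python
import numpy as np

rng = np.random.default_rng(0)

# Quadratic ground truth with a little noise
x_train = np.linspace(-3, 3, 30)
y_train = x_train**2 + rng.normal(0, 0.3, x_train.size)
x_test = np.linspace(-3, 3, 50)
y_test = x_test**2

# A straight line (degree-1 polynomial) is too simple for this data
coefs = np.polyfit(x_train, y_train, deg=1)
train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)

# Both errors are large compared to the noise level (0.3**2 = 0.09):
# that is the signature of high bias, i.e. underfitting.
print(f"train MSE: {train_mse:.2f}, test MSE: {test_mse:.2f}")
```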

Methods to avoid Underfitting:

  1. Increase model complexity.
  2. Increase the number of features, for example by performing feature engineering.
  3. Remove noise from the data.
  4. Increase the number of epochs, i.e. train for longer.
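Method 1 above, increasing model complexity, can be illustrated with the same kind of toy quadratic dataset (a NumPy-only sketch, not from the original post): going from a line to a parabola gives the model enough capacity to fit the pattern, and the error drops sharply.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 40)
y = x**2 + rng.normal(0, 0.2, x.size)  # quadratic target with noise

# Raising the polynomial degree = increasing model complexity
errors = {}
for deg in (1, 2):
    coefs = np.polyfit(x, y, deg=deg)
    errors[deg] = np.mean((np.polyval(coefs, x) - y) ** 2)
    print(f"degree {deg}: training MSE = {errors[deg]:.3f}")
```

The degree-2 model brings the error down to roughly the noise level, which is the best any model can do on this data.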

Overfitting:

  • The clear sign of overfitting is that the model's accuracy is high on the training set but drops significantly on new data or on the test set. The model knows the training data very well but cannot generalize, which makes it useless in production.
  • Low bias & high variance
    1. Low bias: the error on the training data is low.
    2. High variance: the error on the test data is high.
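The low-bias / high-variance pattern can be demonstrated with a deliberately overcomplex toy model (a NumPy-only sketch under assumed data, not from the original post): a degree-11 polynomial through 12 noisy points memorizes the training set, noise included, but misses badly between the points.

```python
import warnings

import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(-3, 3, 12)
y_train = np.sin(x_train) + rng.normal(0, 0.3, x_train.size)
x_test = np.linspace(-2.9, 2.9, 100)  # unseen points in the same range
y_test = np.sin(x_test)

# With 12 points and 12 coefficients, the polynomial passes
# through every training point, noise included.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # suppress a possible ill-conditioning warning
    coefs = np.polyfit(x_train, y_train, deg=11)

train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)

# Near-zero training error, much larger test error: the overfitting signature.
print(f"train MSE: {train_mse:.2e}, test MSE: {test_mse:.2e}")
```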

Methods to avoid Overfitting:

  1. Cross-Validation (evaluate on held-out folds, not on the training data)
  2. Early Stopping (stop training when validation error stops improving)
  3. Pruning (remove parts of the model, e.g. tree branches, that add little value)
  4. Regularization (penalize large weights to keep the model simple)
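As an illustration of method 1, cross-validation, here is a NumPy-only sketch (the dataset and the `cv_mse` helper are assumptions for the example, not part of the original post). It scores each candidate model on held-out folds rather than on the training data, so the overcomplex model no longer looks artificially good:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 60)
y = x**2 + rng.normal(0, 0.3, x.size)  # quadratic target with noise

def cv_mse(x, y, deg, k=5):
    """k-fold cross-validation MSE for a polynomial of the given degree."""
    idx = np.arange(x.size)
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coefs = np.polyfit(x[train], y[train], deg)
        errs.append(np.mean((np.polyval(coefs, x[fold]) - y[fold]) ** 2))
    return float(np.mean(errs))

# Compare an underfit (1), well-matched (2), and overcomplex (8) model
scores = {d: cv_mse(x, y, d) for d in (1, 2, 8)}
best = min(scores, key=scores.get)
print(scores, "-> best degree:", best)
```

Picking the degree with the lowest cross-validation error, instead of the lowest training error, is what protects against overfitting here: training error alone would always favour the highest degree.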
