Why does retraining my model only on recent data worsen performance on old data?
This is a classic case of catastrophic forgetting.
When retraining only on recent data, the model adapts to new patterns while losing performance on older distributions. This is common in incremental learning setups.
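A minimal sketch of how to make the drop visible, assuming scikit-learn and two synthetic distributions standing in for the old and new data (all names here are illustrative, not from the question):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: the "new" data has a shifted distribution and a
# conflicting label rule, so adapting to it hurts the old task.
X_old = rng.normal(loc=0.0, size=(1000, 20))
y_old = (X_old[:, 0] > 0).astype(int)
X_new = rng.normal(loc=3.0, size=(1000, 20))
y_new = (X_new[:, 1] > 3.0).astype(int)

model = SGDClassifier(loss="log_loss", random_state=0)
model.fit(X_old, y_old)  # original training run
before = accuracy_score(y_old, model.predict(X_old))

# Incremental updates on recent data only -- the problematic setup.
for _ in range(20):
    model.partial_fit(X_new, y_new)
after = accuracy_score(y_old, model.predict(X_old))

# A large drop here is the forgetting signal worth monitoring.
print(f"old-data accuracy: {before:.3f} -> {after:.3f}")
```

Keeping a frozen held-out slice of the old distribution and re-running this check after every retrain is the cheapest way to catch the regression early.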
To fix it, mix a representative sample of historical data into the retraining set (a rehearsal/replay strategy, sketched below), or regularize the new weights toward the previous ones (e.g., elastic weight consolidation).
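A minimal rehearsal sketch, assuming NumPy arrays and a hypothetical `rehearsal_mix` helper (the function name and the 30% default are my choices, not a standard API):

```python
import numpy as np

def rehearsal_mix(X_new, y_new, X_old, y_old, old_fraction=0.3, seed=0):
    """Return new data blended with a random sample of historical data,
    so each retraining run keeps seeing the old distribution."""
    rng = np.random.default_rng(seed)
    n_old = min(int(len(X_new) * old_fraction), len(X_old))
    idx = rng.choice(len(X_old), size=n_old, replace=False)
    X_mix = np.vstack([X_new, X_old[idx]])
    y_mix = np.concatenate([y_new, y_old[idx]])
    shuffle = rng.permutation(len(X_mix))  # avoid ordered old/new blocks
    return X_mix[shuffle], y_mix[shuffle]

# Usage: retrain on the blend instead of the new window alone.
# model.fit(*rehearsal_mix(X_new, y_new, X_old, y_old))
```

The `old_fraction` knob trades adaptation speed against retention; tuning it against the old-data holdout from the check above is a reasonable starting point.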
Common mistakes:
- Training only on the latest data window
- Assuming more recent data is always better
- Dropping legacy edge cases
Retraining should expand knowledge, not replace it.