I added thousands of new user interactions to my training dataset.
Instead of improving, the recommendation quality dropped.
Users are now getting irrelevant suggestions.
It feels like more data made the model less accurate.
This happens when the new data has a different distribution than the old data. If recent user behavior differs from historical patterns, the model starts optimizing for conflicting signals.
Neural networks are sensitive to data distribution shifts. When you mix old and new behaviors without proper weighting, the model may lose previously learned structure and produce worse recommendations.
Using time-aware sampling, recency weighting, or retraining with sliding windows helps the model adapt without destroying prior knowledge.
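Recency weighting can be sketched in a few lines. The half-life value and function name below are illustrative assumptions, not from the original post; the idea is just that each interaction's training weight halves as it ages:

```python
import numpy as np

def recency_weights(timestamps, half_life_days=30.0):
    """Exponentially down-weight older interactions.

    `timestamps` are event times in days; the newest event gets
    weight 1.0 and the weight halves every `half_life_days`.
    (The 30-day half-life is an illustrative assumption.)
    """
    timestamps = np.asarray(timestamps, dtype=float)
    age = timestamps.max() - timestamps  # days since each event
    return 0.5 ** (age / half_life_days)

# Interactions logged 60, 30, and 0 days ago (newest last):
w = recency_weights([0.0, 30.0, 60.0], half_life_days=30.0)
```

These weights can then be passed as per-sample weights to whatever loss or `fit(..., sample_weight=...)` interface the training framework exposes, so new behavior dominates without the old patterns being discarded outright.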
Common mistakes:
Mixing old and new data blindly
Not tracking data drift
Overwriting historical patterns
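For the drift-tracking point, one common (though not the only) check is the Population Stability Index between an old and a new feature sample. The thresholds and binning below are widely used conventions, not something from the original post:

```python
import numpy as np

def population_stability_index(old, new, bins=10):
    """Rough data-drift check: PSI between old and new samples of a feature.

    Conventional reading (an assumption, not a hard rule):
    PSI < 0.1 negligible drift, 0.1-0.25 moderate, > 0.25 significant.
    """
    edges = np.histogram_bin_edges(np.concatenate([old, new]), bins=bins)
    p, _ = np.histogram(old, bins=edges)
    q, _ = np.histogram(new, bins=edges)
    # Laplace-style smoothing so empty bins don't blow up the log
    p = (p + 1) / (p.sum() + bins)
    q = (q + 1) / (q.sum() + bins)
    return float(np.sum((q - p) * np.log(q / p)))

rng = np.random.default_rng(0)
same = population_stability_index(rng.normal(0, 1, 5000), rng.normal(0, 1, 5000))
shifted = population_stability_index(rng.normal(0, 1, 5000), rng.normal(1, 1, 5000))
# `same` stays near 0; `shifted` (a one-sigma mean shift) is clearly large
```

Running a check like this before each retraining run would have flagged the mismatch between the old and new user behavior before it degraded the recommendations.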
The practical takeaway is that more data only helps if its distribution is consistent with what the model is learning; data from a shifted distribution has to be weighted, windowed, or filtered before it improves anything.