Model drift is the degradation of AI model performance over time as the distribution of production data diverges from training data. Unlike traditional software, which performs consistently until code changes, AI models can decline in accuracy without any code modifications because the world they operate in evolves.
A customer support chatbot trained on January support tickets learns patterns tied to the product as it existed in January. By April, the product has shipped new capabilities and changed its UI terminology. Users ask questions the training data never covered, and the model's answers go stale.
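One common way to catch this kind of divergence is to compare the distribution of a feature at training time against its distribution in production. The sketch below uses the Population Stability Index (PSI), a standard drift metric; the `psi` function, the bin count, and the rule-of-thumb thresholds are illustrative choices, not part of any specific library.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between two numeric samples.
    Rule of thumb: < 0.1 stable, 0.1-0.2 moderate drift, > 0.2 significant drift."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0

    def bucket(xs):
        counts = [0] * bins
        for x in xs:
            i = int((x - lo) / width)
            i = min(max(i, 0), bins - 1)  # clamp values outside the training range
            counts[i] += 1
        total = len(xs)
        # smooth empty buckets to avoid division by zero and log(0)
        return [max(c / total, 1e-6) for c in counts]

    e, a = bucket(expected), bucket(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# Hypothetical feature values: training-time vs. production
train = [0.1 * i for i in range(100)]             # roughly uniform on [0, 10)
prod_same = [0.1 * i for i in range(100)]         # same distribution: PSI near 0
prod_shifted = [0.1 * i + 5 for i in range(100)]  # shifted distribution: large PSI

print(psi(train, prod_same))
print(psi(train, prod_shifted))
```

In practice a monitoring job would run a check like this on each tracked feature at a regular cadence and alert when the score crosses a threshold, prompting retraining on fresher data.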