Model Drift
Definition: Model drift happens when an AI system’s performance degrades over time because the data it encounters in the real world diverges from the data it was trained on. As the world changes, the patterns the model learned stop matching new situations, and its accuracy gradually declines.
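In practice, drift is often caught by comparing the statistical distribution of incoming data against the data the model was trained on. The sketch below is a minimal, hypothetical illustration in Python using the SciPy library; the feature (claim amounts), the simulated values, and the alert threshold are all invented for this example, not taken from any real system.

```python
# A minimal sketch of detecting data drift, assuming the model uses a
# numeric feature such as claimed damages. All values and thresholds
# here are hypothetical illustrations.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(seed=0)

# Feature values the model saw during training (e.g., claim amounts).
training_values = rng.normal(loc=50_000, scale=10_000, size=1_000)

# Feature values arriving in production after the world has changed
# (e.g., a new policy shifts typical claim sizes upward).
live_values = rng.normal(loc=65_000, scale=12_000, size=1_000)

# The Kolmogorov-Smirnov test compares the two distributions; a small
# p-value suggests the live data no longer looks like the training data.
statistic, p_value = ks_2samp(training_values, live_values)
if p_value < 0.01:  # hypothetical alert threshold
    print(f"Possible drift detected (KS statistic={statistic:.3f}, p={p_value:.2e})")
else:
    print("No significant drift detected")
```

Teams typically run checks like this on a schedule across many input features, since drift in any one of them can quietly erode the model’s predictions.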
Example
An AI tool that helps predict case outcomes in personal injury law might work well at first, but if new laws, policies, or court decisions change how cases are decided, its predictions will become less reliable. The model has drifted because the data it learned from no longer reflects how cases are actually decided.
Why It Matters
Model drift matters because decisions made by AI systems can affect real people. In law, drift can lead to inaccurate predictions, unfair outcomes, or bad advice if no one notices that the model’s performance has changed. Regularly monitoring and retraining AI models helps keep them accurate and trustworthy.
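One simple way to put regular monitoring into practice is to score the model’s recent predictions against known outcomes and flag when accuracy falls below the baseline measured at deployment. The sketch below is hypothetical: the baseline, tolerance, function name, and sample data are invented for illustration.

```python
# A minimal sketch of performance monitoring, assuming case outcomes
# eventually become known so recent predictions can be scored.
# The baseline, tolerance, and data are hypothetical.
BASELINE_ACCURACY = 0.85   # accuracy measured when the model was deployed
DRIFT_TOLERANCE = 0.05     # how far accuracy may fall before we act

def check_for_drift(predictions: list[bool], outcomes: list[bool]) -> bool:
    """Return True if recent accuracy has dropped enough to suggest drift."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    recent_accuracy = correct / len(predictions)
    return recent_accuracy < BASELINE_ACCURACY - DRIFT_TOLERANCE

# Example: the model's recent predictions vs. actual case outcomes.
recent_predictions = [True, True, False, True, False, True, False, False]
actual_outcomes    = [True, False, False, False, False, True, True, False]

if check_for_drift(recent_predictions, actual_outcomes):
    print("Accuracy has degraded; schedule a review and retraining.")
else:
    print("Model performance is within the expected range.")
```

In a legal setting, a wrinkle is that the true outcomes arrive slowly, since cases take months or years to resolve, so a check like this measures drift with a delay rather than in real time.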
