Catastrophic forgetting is the phenomenon in which a machine learning model abruptly loses previously learned information when it is trained on new data. It is especially common in neural networks trained sequentially on a series of tasks, because the gradient updates for the new task overwrite the weights that encoded the old one; the effect is worst when the old and new data distributions differ substantially. This makes catastrophic forgetting a major challenge in continual learning and transfer learning, where knowledge must be retained while the model adapts to new tasks.
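The effect can be shown with a deliberately tiny example (an illustrative sketch, not from the source): a single linear model is trained with gradient descent on task A, then fine-tuned on a conflicting task B. Because both tasks share the same weight, learning B overwrites A, and the error on task A rises sharply. All names and data here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two conflicting tasks sharing one weight: task A wants y = 2x,
# task B wants y = -3x. A model that fits B must unfit A.
x = rng.normal(size=(100, 1))
y_a = 2.0 * x
y_b = -3.0 * x

def train(w, x, y, lr=0.1, steps=200):
    # Plain gradient descent on mean squared error.
    for _ in range(steps):
        grad = 2 * x.T @ (x @ w - y) / len(x)
        w = w - lr * grad
    return w

def mse(w, x, y):
    return float(np.mean((x @ w - y) ** 2))

w = np.zeros((1, 1))
w = train(w, x, y_a)           # learn task A first
err_a_before = mse(w, x, y_a)  # error on A: essentially zero

w = train(w, x, y_b)           # then train on task B only
err_a_after = mse(w, x, y_a)   # error on A is now large: A was "forgotten"

print(f"task A error before: {err_a_before:.6f}, after: {err_a_after:.2f}")
```

Interleaving samples from both tasks (or replaying stored task-A examples) during the second training phase is one standard way to mitigate this, since the gradients then balance both objectives instead of serving only the newest one.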