When AI eats itself
As tech companies search for more troves of human-generated text and images to train their AI models on, some have turned to using AI-generated content (or “synthetic data”) to help fill in the gaps.
Researchers have shown this can quickly lead to a phenomenon called “model collapse.” In addition to being a great band name, model collapse describes what happens when a model is trained, generation after generation, on its own output: after only a few rounds of eating its own content, it starts spitting out nonsense.
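For the curious, here’s a minimal toy sketch of the idea (not the researchers’ actual setup): a hypothetical one-parameter “model” (a Gaussian fit) is refit each generation on samples from the previous generation’s model instead of fresh human data, and its diversity steadily shrinks.

```python
import numpy as np

# Toy illustration of "model collapse": each generation's "model" is a
# Gaussian fit to data sampled from the previous generation's model.
# With no fresh human data, the spread (diversity) steadily collapses.
rng = np.random.default_rng(0)

n_samples = 100      # size of each generation's synthetic "training set"
generations = 1000

mean, std = 0.0, 1.0  # generation 0: fit to real, human-generated data
for gen in range(generations):
    synthetic = rng.normal(mean, std, size=n_samples)  # model-generated data
    mean, std = synthetic.mean(), synthetic.std()      # refit on its own output

print(f"std after {generations} generations: {std:.4f}")  # far below the original 1.0
```

The outputs don’t become wrong so much as they become the same, which is the blurriness and repetition the explainer shows.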
The Times’ excellent visual explainer shows how quickly generative AI degenerates into incoherently repeated phrases and blurry, near-identical images.