“The solution to continuous learning is literally a billion-dollar problem.”
Learning Gaze
New research has highlighted serious flaws in AI models' ability to learn new information. In fact, it suggests that in some settings they can't really learn anything new at all.
According to a study conducted by a team of scientists at the University of Alberta in Canada and published in the journal Nature this week, AI algorithms trained with deep learning (put simply, AI models such as large language models, built by finding patterns in large amounts of data) break down in a "continual learning" setting, that is, when new concepts are introduced on top of the model's existing training.
In other words, if you want to teach an existing deep learning model something new, you will probably have to retrain it from scratch. Otherwise, the research shows, the so-called artificial neurons in the model's network end up stuck at a value of zero, resulting in a total loss of "plasticity", or the ability to learn.
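That failure mode can be made concrete with a toy example. The sketch below (an illustration, not the study's code) uses a single layer of ReLU units, the common type of artificial neuron whose output is zero whenever its input is negative. Once a unit's parameters drive its output to zero for every input, its gradient is also zero everywhere, so ordinary training can never bring it back:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hidden layer: 8 inputs feeding 16 ReLU units.
W = rng.normal(size=(16, 8))
b = rng.normal(size=16)

# Simulate the failure mode: a few units' biases have drifted
# strongly negative, as can happen over long-running training.
b[:5] = -100.0

X = rng.normal(size=(1000, 8))      # a batch of inputs
pre = X @ W.T + b                    # pre-activations
act = np.maximum(pre, 0.0)           # ReLU outputs

# A unit is "dead" if it outputs zero on every input: its ReLU
# gradient is then zero everywhere, so gradient descent cannot
# revive it, and the network has permanently lost that capacity.
dead = np.all(act == 0.0, axis=0)
print("dead units:", int(dead.sum()), "of", len(b))
```

With the biases pushed that far negative, all five sabotaged units register as dead while the rest keep firing, mirroring the paper's picture of networks gradually losing usable neurons.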
“If you think of it like a brain, it's like 90 percent of your neurons are dead,” Shibhansh Dohare, a computer scientist at the University of Alberta and lead author of the study, told New Scientist. “There are just not enough neurons left to learn from.”
And as the researchers point out, training advanced AI models is a laborious and extremely expensive process, posing a major economic obstacle for cash-hungry AI companies.
“If the network is a large language model and the data represents a significant portion of the Internet, each retraining can cost millions of dollars in computational costs,” the study said.
Obstacle Course
This loss of plasticity also marks a major disconnect between current AI models and the imagined "artificial general intelligence," a theoretical AI that would be as intelligent as humans. After all, in human terms, it would be like having to reboot your brain from scratch every time you took a new college course, lest you kill off a large portion of your neurons.
If there's a silver lining for AI companies, it's this: the study's authors were able to create an algorithm that randomly revives damaged or "dead" AI neurons, achieving some success in addressing the plasticity issue.
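The core idea of such a revival scheme can be sketched in a few lines. This is a rough illustration under stated assumptions, not the study's actual algorithm (which, per the paper, tracks a per-unit utility score and reinitializes a small fraction of the least useful units as training runs); here we simply detect units that output zero on a whole batch and give them fresh random weights. The function name and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def revive_dead_units(W, b, X, init_scale=0.1):
    """Reinitialize ReLU units that output zero for every input.

    Sketch of the revival idea: a unit that is zero across the
    whole batch has zero gradient and cannot recover on its own,
    so we replace its weights with small fresh random values.
    Returns the number of units revived.
    """
    act = np.maximum(X @ W.T + b, 0.0)
    dead = np.all(act == 0.0, axis=0)
    n_dead = int(dead.sum())
    if n_dead:
        W[dead] = rng.normal(scale=init_scale, size=(n_dead, W.shape[1]))
        b[dead] = 0.0
    return n_dead

# Demo: a layer where 4 of 16 units have gone dead.
W = rng.normal(size=(16, 8))
b = rng.normal(size=16)
b[:4] = -100.0                      # force four units dead
X = rng.normal(size=(500, 8))

print(revive_dead_units(W, b, X))   # revives the dead units
print(revive_dead_units(W, b, X))   # nothing left to revive
```

Restored units start over with small random weights, so they can participate in gradient updates again rather than sitting permanently at zero.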
However, as things stand, no practical solution has yet been found.
“The solution to continuous learning is literally a billion-dollar problem,” Dohare told New Scientist. “If we had a true, comprehensive solution that allowed us to continuously update models, the cost of training these models would go down dramatically.”
More on AI training: When AI is trained on AI-generated data, it spits out gibberish