AI may learn better when it’s allowed to talk to itself. Researchers showed that internal “mumbling,” combined with short-term memory, helps AI adapt to new tasks, switch goals, and handle complex challenges more easily. The approach improves learning efficiency and requires far less training data. It could pave the way for more flexible, human-like AI systems.
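The idea of pairing internal "mumbling" with short-term memory can be illustrated with a toy sketch. Everything below is hypothetical, not the researchers' published architecture: the agent emits internal "thought" strings, keeps only the most recent ones in a bounded memory, and conditions its action on whichever stored thoughts match the current goal, so it can adapt when the goal switches.

```python
from collections import deque

class MumblingAgent:
    """Toy illustration of inner 'mumbling' plus short-term memory.
    All names and logic here are invented for demonstration."""

    def __init__(self, memory_size=3):
        # Short-term memory: only the most recent thoughts are retained.
        self.memory = deque(maxlen=memory_size)

    def mumble(self, observation):
        # Generate an internal token from the observation and store it.
        thought = f"noted:{observation}"
        self.memory.append(thought)
        return thought

    def act(self, goal):
        # The action depends on both the current goal and recent thoughts,
        # so switching goals reuses whatever relevant memory remains.
        relevant = [t for t in self.memory if goal in t]
        return f"pursue {goal} using {len(relevant)} relevant thoughts"

agent = MumblingAgent(memory_size=3)
for obs in ["red-door", "key", "red-door", "blue-door"]:
    agent.mumble(obs)
print(agent.act("red-door"))  # the earliest "red-door" thought was evicted
```

The bounded `deque` stands in for short-term memory: older thoughts fall away, and only recent internal chatter shapes the next action.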
Scientists at Skoltech developed a new mathematical model of memory that explores how information is encoded and stored. Their analysis suggests that memory works best in a seven-dimensional conceptual space, the equivalent of having seven senses. The finding implies that both humans and AI might benefit from broader sensory inputs to optimize learning and recall.
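One way to picture a seven-dimensional conceptual space is to encode each concept as a vector with one coordinate per "sense" and recall by nearest-neighbor matching. This is a loose illustration under assumed details, not the Skoltech model itself; the concept names and coordinate values are made up.

```python
import math

DIMENSIONS = 7  # one coordinate per hypothetical "sense"

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def recall(query, memory):
    # Retrieve the stored concept whose 7-D encoding best matches the query.
    return max(memory, key=lambda name: cosine(query, memory[name]))

# Invented 7-D encodings of a few concepts.
memory = {
    "apple":  [0.9, 0.1, 0.8, 0.2, 0.0, 0.1, 0.3],
    "siren":  [0.0, 0.9, 0.1, 0.0, 0.8, 0.2, 0.1],
    "velvet": [0.1, 0.0, 0.2, 0.9, 0.1, 0.8, 0.0],
}

# A noisy, partial impression still lands on the right concept.
noisy_apple = [0.8, 0.2, 0.7, 0.3, 0.1, 0.0, 0.2]
print(recall(noisy_apple, memory))  # → apple
```

The intuition matching the article: with more independent coordinates (senses), stored concepts sit farther apart, so even a degraded cue recalls the correct item.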