Productivity · 1 mention
#1992507686802624701
Google just dropped "Attention Is All You Need (V2)."

This paper could solve AI's biggest problem: catastrophic forgetting. When AI models learn something new, they tend to forget what they previously learned. Humans don't work this way, and now Google Research has a solution: Nested Learning.

Nested Learning is a new machine learning paradigm that treats a model as a system of interconnected optimization problems running at different speeds, much like how our brain processes information.

Here's why this matters: LLMs don't learn from experience; they remain limited to what they learned during training and can't improve over time without losing previous knowledge. Nested Learning changes this by viewing the model's architecture and its training algorithm as the same thing, just different "levels" of optimization.

The paper introduces Hope, a proof-of-concept architecture that demonstrates this approach:
↳ Hope outperforms modern recurrent models on language modeling tasks
↳ It handles long-context memory better than state-of-the-art models
↳ It achieves this through "continuum memory systems" that update at different frequencies (see the sketch below)

This is similar to how our brain manages short-term and long-term memory simultaneously. We might finally be closing the gap between AI and the human brain's ability to continually learn.

I've shared the link to the paper in the next tweet!
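To make the "update at different frequencies" idea more concrete, here is a minimal, hypothetical sketch: memory modules that only consolidate their state every k steps, with fast, medium, and slow levels combined at read time. The names (FrequencyMemory, update_every, observe) and the update rule are my own illustration, not code from the paper and not the Hope architecture itself.

```python
# Hypothetical sketch of memories that consolidate at different frequencies.
# Illustrative only; not Google's Nested Learning / Hope implementation.
import numpy as np

class FrequencyMemory:
    """A memory slot that only consolidates its state every `update_every` steps."""
    def __init__(self, dim, update_every, lr):
        self.state = np.zeros(dim)    # long-lived memory vector
        self.buffer = np.zeros(dim)   # accumulates inputs between consolidations
        self.update_every = update_every
        self.lr = lr
        self.step = 0

    def observe(self, x):
        self.buffer += x
        self.step += 1
        if self.step % self.update_every == 0:
            # Consolidate: the slow-moving state drifts toward the mean of recent inputs.
            self.state += self.lr * (self.buffer / self.update_every - self.state)
            self.buffer[:] = 0.0
        return self.state

# A "continuum" of memories: a fast level (every step), a medium one, and a slow one.
levels = [FrequencyMemory(dim=8, update_every=k, lr=0.5) for k in (1, 16, 256)]

for t in range(1024):
    x = np.random.randn(8)                         # stand-in for a token representation
    readout = sum(m.observe(x) for m in levels)    # combine all timescales at read time
```

The point of the sketch is only the nesting of timescales: the fast level tracks the current context, while slower levels change rarely and so retain older information, which is the intuition behind mitigating catastrophic forgetting.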