The State Space Model taking on Transformers

Right now, AI is eating the world. And by AI, I mean Transformers. Practically all the big breakthroughs in AI over the last few years are due to Transformers.

Mamba, however, belongs to an alternative class of models called State Space Models (SSMs). Importantly, for the first time, Mamba promises performance (and, crucially, scaling laws) similar to the Transformer's while remaining feasible at long sequence lengths (say, 1 million tokens). To achieve this long context, the Mamba authors remove the "quadratic bottleneck" of the Attention Mechanism. Mamba also runs fast – like "up to…