Representation learning

The Limits of Today’s AI Models

Karan Goel, CEO of Cartesia, discusses the fundamental limitations of Transformer architectures, arguing that they behave more like retrieval systems than learning systems. He explains how State Space Models (SSMs) enable compression and abstraction, and why Cartesia is tackling multimodal intelligence by first solving voice AI, aiming to develop a transferable "recipe" for end-to-end representation learning.

AI doesn't work the way you think it does

Today's AI, despite its impressive capabilities, may be an "impostor" with a messy, unstructured internal understanding: a "spaghetti" representation. This summary explores an alternative, open-ended approach to building AI that fosters a deep, modular, and truly intelligent foundation, moving beyond brute-force optimization to embrace serendipitous discovery and "evolvability."