Tech Week Singapore 2025
The Road to LLMs: From a Single Neuron to a Global Brain
This tech talk, "The Road to LLMs," charts the incredible evolution from a single computational concept to today's powerful AI. We begin with the fundamental building block, the artificial neuron, and explore how error feedback lets it adjust its weights and learn. We then scale up, layering these neurons into the deep networks that powered the initial AI boom. The narrative pivots to the unique challenge of language, showing the limitations of early sequential models such as recurrent networks before arriving at the game-changing 2017 invention: the Transformer architecture. You'll gain an intuitive grasp of its "self-attention" mechanism, the key that unlocked our current progress. Finally, we'll see how scaling these Transformers into massive models, trained on the simple task of predicting the next word, led to the surprising emergent abilities of reasoning and creativity that define modern LLMs.