Python, Deep Learning, and LLMs - Go from zero to training a pocket-sized language model.

by Yegor Tkachenko
This book is a coding and machine learning bootcamp in textbook form. You will get hands-on experience with the Python programming language, essential math concepts, neural nets – and, by the end, you will have coded and trained a pocket-sized language model. A high school math background is all you need to get started; no prior programming experience is required.

Large language models (LLMs) represent one of humanity's most exciting steps towards general artificial intelligence (AI). To outsiders, they can seem intimidating and opaque. Yet the core ideas behind LLMs are surprisingly accessible to anyone familiar with fairly basic high school math.

Unfortunately, most introductions to LLMs fall into one of two extremes:

1. They are too superficial to be useful.
2. They assume advanced knowledge of programming, linear algebra, and optimization, making them inaccessible to newcomers.

This book aims to bridge this gap. It offers a self-contained, rigorous, yet approachable introduction to LLMs for readers with minimal math background and no prior programming or AI experience. All algorithms and methods are implemented in the Python programming language - the de facto standard for machine learning. Complete, executable code is provided throughout.

You are encouraged to type the code out, explore what it does, experiment, and, when you encounter errors or confusion, consult Google or a capable LLM. Active engagement is the fastest path to deep understanding, and you might be surprised by how much you can learn in a short time.

This book is a language-immersion experience and a survival guide, designed to maximize your learning per hour invested. It requires work and will challenge you, but by the final chapter you won't just understand how LLMs work - you will have built a basic one yourself in Python. The goal is to equip you to meet the new era of AI with confidence and clarity.