Free, visual, interactive guide to AI - covers Model Internals, the Transformer, RAG, Vector Databases, and Agent Frameworks. By Rajul Babel.
Each chapter uses animations, interactive diagrams, and step-by-step breakdowns to build intuition before showing the math.
- Neural Network Foundations - neurons, weights, biases, activations, forward/backward pass, gradient descent, dropout, Adam, weight init.
- How LLMs Train - tokenization, self-supervised learning, cross-entropy, SFT, RLHF, DPO.
- Scaling & Modern Techniques - scaling laws, batch training, distillation, CLIP, the full pipeline.
- Road to Transformers - CNNs, RNNs, the limitations of RNNs, the Transformer.
- Transformer Input Pipeline - embeddings, positional encoding (sinusoidal & RoPE).
- Attention - Q, K, V - intuition behind queries, keys, values.
- Attention - Full Computation - dot products, scaling, softmax, multi-head, the complete formula.
- The Encoder - Add & Norm, FFN, residuals, pre-norm vs post-norm, batch norm vs layer norm.
- The Decoder - decoder-only LLMs, causal masking, cross-attention.
- Modern LLM Techniques - KV cache, grouped-query attention, mixture of experts, reasoning models.
- Vector Databases - HNSW, IVF, Vamana, scalar / product / binary quantization, IVF-PQ, HNSW+PQ, hybrid search, rerankers, FAISS, pgvector, Qdrant, Pinecone, Weaviate, Milvus, Chroma.
- RAG & Agent Frameworks - retrieval-augmented generation, LangGraph.
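The attention chapters build up to the full formula softmax(QKᵀ / √d_k)·V. As a taste of what they cover, here is a minimal dependency-free sketch of scaled dot-product attention (an illustration in plain Python, not code from the guide itself):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of vectors (lists of floats); each query attends
    over all keys and returns a weighted mix of the values.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Dot each query with every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

A query that aligns with the first key pulls the output toward the first value vector; the guide's animations step through exactly this weighting.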
- React 18 - hooks-only component tree
- Vite - build toolchain
- GitHub Actions - CI/CD
- GitHub Pages - hosting
```bash
npm install
npm run dev
```

Open http://localhost:5173/learn-ai/

```bash
npm run build
npm run preview
```

Pushes to `main` automatically build and deploy via GitHub Actions.
Rajul Babel - LinkedIn - GitHub
MIT