AI Engineering for Backend Developers
A structured series on building real AI systems — RAG pipelines, agents, LLM tooling, and production patterns. Written for engineers who already know how to build software.
Phase 1 in progress
5 articles planned
Updated weekly
1. Foundations of AI Engineering
In Progress
Core concepts every backend engineer needs before building AI systems: tokenization, transformer architecture, embeddings, and how LLMs actually work under the hood.
How LLM Tokenization Works
BPE, WordPiece, and why token count matters for your context window.
Transformer Architecture: From Attention to GPT
Self-attention, positional encoding, and why the architecture changed everything.
Embeddings & Vector Spaces: What They Actually Mean
How text becomes numbers, and how to build intuition for vector search.
New articles drop every week
Subscribe to get notified when the next article in this series goes live. One email a week — no noise, no marketing.