Notes
Notes on retrieval, evals, observability, and the engineering that begins once the demo turns out to be the easy part.
A practical guide to LLM product safety: prompt injection, excessive agency, unsafe outputs, evals, and sober boundaries.
How to reduce hallucinations in LLM systems with better retrieval, abstention, verification, evals, and guardrails.