Glossary

Embeddings

Embeddings are numerical representations of text—vectors of hundreds or thousands of floating-point numbers—that capture semantic meaning in a form machines can compare mathematically. Two sentences about the same concept will produce vectors that are close together in this high-dimensional space, even if they share no words. This is what makes semantic search work: instead of matching keywords, you match meaning. Embeddings power recommendation engines, clustering, anomaly detection, and the retrieval half of retrieval-augmented generation (RAG) architectures. The quality of your embeddings determines the quality of everything downstream. A bad embedding model will retrieve irrelevant documents, and no amount of prompt engineering on the generation side will fix that. Choosing the right embedding model for your domain—and evaluating it rigorously—is unglamorous work that separates systems that actually perform from ones that demo well.
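
To make "close together in this high-dimensional space" concrete, here is a minimal TypeScript sketch of cosine similarity, the standard way to compare embedding vectors. The four-dimensional vectors are hand-written, hypothetical stand-ins; real embeddings from a model would have hundreds or thousands of dimensions.

```typescript
// Cosine similarity: 1.0 means identical direction (same meaning),
// values near 0 mean unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical tiny "embeddings": the first two represent sentences
// about the same concept; the third is unrelated.
const catVec = [0.8, 0.1, 0.3, 0.05];
const felineVec = [0.75, 0.15, 0.35, 0.1];
const invoiceVec = [0.05, 0.9, 0.02, 0.6];

console.log(cosineSimilarity(catVec, felineVec)); // high — close in meaning
console.log(cosineSimilarity(catVec, invoiceVec)); // low — semantically distant
```

Semantic search is this comparison at scale: embed every document once, embed the query at search time, and return the documents whose vectors score highest against the query's.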

Referenced in these posts:

Noah on Bloomberg Odd Lots: Why the Tech World Is Going Crazy for Claude Code

On Bloomberg’s Odd Lots, Noah Brier highlights Claude Code as a “computer within your computer,” using file system access and Unix commands to bypass token-heavy workflows and enable direct file manipulation. He argues this reinvention of the computing workflow ushers in a new era of structured, human-in-the-loop software development akin to sophisticated pair programming.

Software for One

I built Aesthete—a local-only TypeScript/Next.js/Tailwind site powered by Claude Code and Anthropic’s frontend design skills—to curate nearly 200 aesthetically satisfying brands, showcasing how “software for one” can turn ideas into polished, interactive experiences almost instantly.

Related terms:

Context Window

A context window is the maximum amount of text a language model can process in a single call—input and output combined—measured in tokens. Larger windows (from about 4,000 tokens up to over a million) let you handle longer inputs but raise costs and can suffer from the “lost in the middle” attention issue.
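
As a rough illustration of budgeting against that shared input-plus-output limit, here is a TypeScript sketch. The window size and the ~4-characters-per-token heuristic are assumptions for illustration; real counts come from the model's own tokenizer and vary by model and language.

```typescript
// Assumed model limit, in tokens. Actual limits vary by model.
const CONTEXT_WINDOW = 200_000;

// Crude estimate: roughly 4 characters per token for English text.
// A real application would use the provider's tokenizer instead.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Input and output share one window, so reserve room for the response.
function fitsInWindow(prompt: string, maxOutputTokens: number): boolean {
  return estimateTokens(prompt) + maxOutputTokens <= CONTEXT_WINDOW;
}

const prompt = "Summarize the following report in three bullet points: ...";
console.log(fitsInWindow(prompt, 4_000)); // true — plenty of headroom
```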