---
title: Add search
description: >-
  Generate a static docs search index, query it at runtime, and optionally
  stream source-grounded answers.
group: docs-site
lastModified: '2026-05-11T20:02:32-07:00'
lastAuthor: 'github-actions[bot]'
---
# Add search

Leadtype search is static by default. The build step produces two JSON files; runtime code imports or fetches those files and queries them without a database.

## Generate the files

`leadtype generate` writes search files in site mode:

```bash
npx leadtype generate --src . --out public --base-url https://example.com
```

If you run the pipeline from scripts, call the search generator after conversion:

```ts
import { generateDocsSearchFiles } from "leadtype/search/node";

await generateDocsSearchFiles({
  outDir: "public",
  baseUrl: "https://example.com",
});
```

This writes:

```txt
public/docs/search-index.json
public/docs/search-content.json
```

The index contains compact ranking data. The content store contains the text used for excerpts and answer context.

## Query at runtime

```ts
import {
  searchDocs,
  type DocsSearchContentStore,
  type DocsSearchIndex,
} from "leadtype/search";
import contentJson from "../public/docs/search-content.json";
import indexJson from "../public/docs/search-index.json";

const index = indexJson as DocsSearchIndex;
const content = contentJson as DocsSearchContentStore;

const results = searchDocs(index, "run lint", { content });
```

Results include page URLs, heading paths, hash URLs, and snippets. Render them in your own search UI.
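The exact result type is not shown here, so treat the field names below as illustrative rather than the real `searchDocs` return shape. As a minimal sketch, assuming each result carries a heading path, a hash URL, and a snippet, a renderer for a custom UI might look like:

```typescript
// Illustrative result shape; check the actual leadtype types before use.
interface SearchResultLike {
  url: string;           // page URL
  headingPath: string[]; // e.g. ["Guides", "Run lint"]
  hashUrl: string;       // page URL plus section hash
  snippet: string;       // excerpt built from the content store
}

// Render results as HTML list items. Real code should HTML-escape
// the snippet and heading text before inserting them into the DOM.
function renderResults(results: SearchResultLike[]): string {
  return results
    .map(
      (r) =>
        `<li><a href="${r.hashUrl}">${r.headingPath.join(" › ")}</a>` +
        `<p>${r.snippet}</p></li>`,
    )
    .join("\n");
}
```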

## Add vocabulary aliases

Out of the box, search combines lexical matching, stemming, prefix matching, typo-tolerant fallbacks, and a small built-in synonym map. Add product-specific synonyms only when users search with words your docs do not use:

```ts
const results = searchDocs(index, "starter", {
  content,
  synonyms: {
    starter: ["quickstart", "getting started"],
  },
});
```

## Optional AI answers

Use source-grounded answers only after basic search works. Leadtype retrieves chunks from the static index, builds a constrained prompt, and leaves model choice to the provider entry point you import:

```ts
import { streamDocsAnswer } from "leadtype/search/vercel";

const { response, sources } = streamDocsAnswer({
  index,
  content,
  query: "How do I run docs lint in CI?",
  model: "openai/gpt-5.5",
  productName: "My Library",
});
```

Display `sources` next to the streamed response. Do not ask the model to answer from memory; the answer context is built from retrieved docs chunks.
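The shape of `sources` is not documented on this page; assuming each entry exposes a title and a URL (field names are an assumption, not confirmed API), a small helper for rendering a citation list under the streamed answer could look like:

```typescript
// Assumed source shape; real entries may carry more metadata
// (heading path, snippet, score). Verify against the leadtype types.
interface AnswerSource {
  title: string;
  url: string;
}

// Render a numbered plain-text citation list to display
// next to the streamed response.
function renderSources(sources: AnswerSource[]): string {
  return sources
    .map((s, i) => `[${i + 1}] ${s.title} (${s.url})`)
    .join("\n");
}
```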

## Guard the endpoint

For API routes that accept user queries, use the request helpers from `leadtype/search`:

* `validateDocsQuery` to trim and cap query text.
* `readJsonWithLimit` to reject oversized JSON bodies.
* `getClientIdentifier` to read common proxy IP headers.
* `createMemoryRateLimiter` for demos.

Production apps should adapt the rate limiter interface to a shared store such as Redis, Vercel KV, Cloudflare KV, or Durable Objects.

## Verify

* `public/docs/search-index.json` and `public/docs/search-content.json` exist and are non-empty.
* Searching for an exact API name returns the expected reference page.
* Searching for a guide phrase returns a result with a section hash.
* AI answers cite sources from the returned `sources` metadata.
