June 23, 2025

Introducing AI Assistant: Turning docs into your product expert

Han Wang
Co-founder

AI is the new interface for learning about your product.

With the rise of llms.txt and Generative Engine Optimization (GEO), it’s clear that LLMs are becoming the default way users discover and understand software.

But without control over how your docs are indexed, those answers can be incomplete—or flat out wrong.

The best way to meet users where they already are is to give them that AI experience rooted in your source of truth.

That’s why we’re launching Mintlify’s AI Assistant: a fully embedded, conversational experience that helps users get to the right answer faster with your documentation.

Here’s what’s new.

Unlocking a new level of accuracy through agentic retrieval

Historically, many AI chat implementations, including ours, used traditional RAG: the system keyword-matches the user’s question against your docs, selects relevant pages up front, and then feeds that information to the LLM along with the question.

It works, sometimes. But you’re force-feeding the model what you think it should use. If it needs different information, it can’t ask for it.
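In rough pseudocode, that flow looks something like this minimal sketch (the `searchDocsByKeyword` and `complete` helpers are hypothetical stand-ins, not our actual pipeline):

```typescript
// Minimal sketch of a traditional RAG flow (hypothetical helpers, not our
// actual pipeline): retrieval happens once, up front, and the model only
// ever sees the pages we chose for it.

type DocPage = { url: string; content: string };

// Assumed helpers: a keyword search over the docs and a plain LLM completion call.
declare function searchDocsByKeyword(query: string, topK: number): Promise<DocPage[]>;
declare function complete(prompt: string): Promise<string>;

export async function answerWithTraditionalRag(question: string): Promise<string> {
  // 1. Select "relevant" pages before the model is ever involved.
  const pages = await searchDocsByKeyword(question, 5);

  // 2. Stuff those pages into the prompt. If the answer lives elsewhere,
  //    the model has no way to ask for more context.
  const context = pages.map((p) => `Source: ${p.url}\n${p.content}`).join("\n\n");
  return complete(`Answer using only the context below.\n\n${context}\n\nQuestion: ${question}`);
}
```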

Now, Mintlify’s AI Assistant uses agentic retrieval, which gives the LLM access to tool calling instead of a pre-constructed context window.

The model is aware of the tools it has (like doc search), and when given a user question, it chooses how to search and what context to retrieve on its own.

This means better understanding of intent, smarter information access, and dramatically more accurate responses.
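A minimal sketch of that loop, built on Anthropic-style tool calling (the `search_docs` tool and the `searchDocs` handler are illustrative assumptions, not the assistant’s actual internals):

```typescript
import Anthropic from "@anthropic-ai/sdk";

// Minimal sketch of an agentic retrieval loop using tool calling.
// The `search_docs` tool and searchDocs() handler are illustrative
// assumptions, not the assistant's actual internals.
const client = new Anthropic();

const tools: Anthropic.Tool[] = [
  {
    name: "search_docs",
    description:
      "Search the product documentation and return matching excerpts with their URLs.",
    input_schema: {
      type: "object",
      properties: {
        query: { type: "string", description: "What to search the docs for" },
      },
      required: ["query"],
    },
  },
];

// Assumed docs-search backend that returns excerpts as plain text.
declare function searchDocs(query: string): Promise<string>;

export async function answerQuestion(question: string): Promise<string> {
  const messages: Anthropic.MessageParam[] = [{ role: "user", content: question }];

  // The model decides whether, when, and how often to search.
  while (true) {
    const response = await client.messages.create({
      model: "claude-sonnet-4-20250514", // "Claude 4" per this post; exact model ID is an assumption
      max_tokens: 1024,
      tools,
      messages,
    });

    if (response.stop_reason !== "tool_use") {
      // No further retrieval requested: return the final text answer.
      return response.content
        .map((block) => (block.type === "text" ? block.text : ""))
        .join("");
    }

    // Run each requested search and hand the results back to the model.
    messages.push({ role: "assistant", content: response.content });
    const toolResults = [];
    for (const block of response.content) {
      if (block.type === "tool_use" && block.name === "search_docs") {
        const { query } = block.input as { query: string };
        toolResults.push({
          type: "tool_result" as const,
          tool_use_id: block.id,
          content: await searchDocs(query),
        });
      }
    }
    messages.push({ role: "user", content: toolResults });
  }
}
```

The key difference from the RAG sketch above: retrieval is a tool the model can call zero, one, or many times, rather than a fixed step that happens before the model sees the question.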

And because AI Assistant now runs with an expanded context window on Claude 4, the best-in-class model, users get high-quality answers with minimal hallucination—even on complex, multi-step questions.

Embedded, conversational, & intent-aware

We’ve redesigned the experience to reflect how users actually interact with AI.

The new assistant is fully conversational. Whether it’s clarifying a vague question, refining a use case, or digging into a detail, the assistant keeps context and responds accordingly.

It also understands what users are trying to accomplish—not just what they ask. Based on intent, the assistant can suggest relevant pages and bring users directly to the source material for further reference.

And now, users can "Ask AI" directly from code blocks to get instant explanations of specific snippets.

This fluid, AI-native experience isn’t something you can replicate by copying & pasting between third-party tools and your docs.

Mintlify’s AI Assistant works with your docs, in your docs.

Trusted AI answers on your own terms

When users ask questions about your product, many now go straight to tools like ChatGPT or Perplexity. And while optimizations like llms.txt & llms-full.txt help improve how AI crawls your docs, you still have little control over how often they’re indexed or how accurate the generated answers are.
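For reference, llms.txt is a Markdown file served from your site root that gives crawlers a curated index of your docs, while llms-full.txt inlines the full page content in one file. A minimal sketch with placeholder URLs:

```
# Acme Docs

> Documentation for Acme, a hypothetical example product.

## Docs

- [Quickstart](https://docs.example.com/quickstart.md): Install and make your first API call
- [API Reference](https://docs.example.com/api.md): Endpoints, parameters, and error codes

## Optional

- [Changelog](https://docs.example.com/changelog.md): Release notes
```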

That’s why bringing the AI experience into your documentation matters.

The new assistant gives users fast, contextual answers grounded in your source of truth—with clear citations so they can verify or dig deeper.

And because it’s an owned experience, you get visibility into how questions are answered, which queries succeed or fail, and where your content can improve.

By giving users the AI experience they want in a first-party setting, you can take back control of your company narrative.

What’s next

Our previous AI chat routinely handled ad hoc queries, answering over 1 million of them.

With the new Assistant, we’re building a future where your docs are an always-on product expert—resolving complex threads instantly, before they ever reach your team.

But the assistant is only as good as the source material—so we’re focused on making it easier to keep that foundation strong.

Next, we’re working on automating how you keep your docs fresh: surfacing knowledge gaps from real user queries and, soon, helping you draft updates with AI.

You can check out the AI Assistant in action in our customers' docs, such as X, Perplexity, or Bolt.new. If you'd like to see how it could elevate your documentation, get in touch with our team today.