AI Search Assistant with Local LLMs

First of all, good job with the new version of DT4. I have a question regarding AI Search Assistance. When I use a local LLM (e.g., through Ollama or LM-Studio), the AI button for AI search assistance is greyed out. It only seems to work with server-based LLMs. All other AI features (e.g., summarization, chat, etc.) work just fine with the local LLM.

Is this a general limitation for local LLMs or specific LLMs?

It’s only supported by modern models that support tool calls and/or reasoning and have a context window of at least 8k tokens, e.g. Gemma 3 or Mistral Small 3.2.

Thanks for the quick response. I am using Gemma 3, and after increasing the context window to 8k, the AI button worked!

Depending on the processor, I would recommend a context window of 32k tokens or more.
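
For models run through Ollama, one way to raise the context window is a custom Modelfile with the `num_ctx` parameter. A minimal sketch, assuming the stock `gemma3` model is already pulled; the `gemma3-32k` tag is an arbitrary name:

```shell
# Sketch: register a 32k-context variant of Gemma 3 in Ollama.
# num_ctx is Ollama's context-window parameter; the base model name
# and the new tag are assumptions for illustration.
cat > Modelfile <<'EOF'
FROM gemma3
PARAMETER num_ctx 32768
EOF
ollama create gemma3-32k -f Modelfile
```

The new `gemma3-32k` model can then be selected in the app instead of the base model. (In LM Studio, the context length can be set directly in the model's load settings.)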