Recommendation to add custom model providers

Glad to see the ChatGPT feature in DT4, which makes me even more determined to keep using DT. I suggest adding custom model providers so that third-party APIs can be used alongside multiple AI providers such as OpenAI, DeepSeek, etc.

Welcome @panxq
There are more AI engines available than ChatGPT. Please read the Getting Started > AI Explained section of the help or manual.

Those require separate registration and purchase. If you supported third-party services (that is, allowed entering a custom URL instead of the official address), we could use those services to access more models. I hope you will add this simple feature; it is very important in regions where OpenAI's official service is unavailable. Thank you!

I’m not sure what you’re referring to here. A separate registration and purchase of what?

And DEVONthink 4 supports more than OpenAI, as noted in the Help and also accessible in the AI > Chat settings.

You can already use a lot of LLMs (cloud-based and also locally installed ones), so what do you miss in DT4?

I’m running Ollama on my MacBook. DEVONthink lets you specify Ollama as a provider and configure the models and URLs in use. This means that, as long as the provider has a ChatGPT-compatible HTTP API (as nearly all of them do), you can interface it with pretty much anything. I’ve had it talking to at least 2 different open systems running on my machine. It hasn’t cost me a penny, other than the existential trauma of using an LLM.
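To illustrate what “ChatGPT-compatible HTTP API” means in practice, here is a minimal sketch of the request shape such providers accept. The base URL is Ollama’s default local endpoint, and the model name `llama3` is just an assumption (use whatever model you have pulled locally):

```python
import json
import urllib.request

# Ollama serves an OpenAI-compatible endpoint at /v1/chat/completions
# by default; any provider exposing this format can be swapped in here.
BASE_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", "Summarize this note in one sentence.")
# urllib.request.urlopen(req)  # uncomment to actually query the local server
```

Because the request body is identical across compatible providers, pointing a client at a different service is usually just a matter of changing `BASE_URL` and the model name.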

There are third-party websites that provide one API for accessing multiple AIs. This is possible with software like Chatbox, which offers an OpenAI-compatible API mode. Such third-party sites are supposed to reach multiple API services through mirroring.

For example, AiHubMix’s AI model API routing service: based on the unified OpenAI API standard, it supports all official OpenAI models as well as other mainstream models on the market (such as Claude, Gemini, DeepSeek, Alibaba’s Qwen, etc.). Just change the model name in the request parameters and you can easily switch between the big models.
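The “just change the model name” point can be sketched as follows. The router base URL below is a placeholder, not AiHubMix’s actual endpoint (check your provider’s documentation for the real one); the model identifiers are common examples and may differ per provider:

```python
# Hypothetical router endpoint; the real base URL comes from your
# aggregator's documentation. The request body is the same OpenAI-style
# shape for every backend model, so only the "model" field changes.
BASE_URL = "https://router.example.com/v1/chat/completions"  # placeholder

def payload_for(model: str, prompt: str) -> dict:
    """Same OpenAI-style body for every backend; only `model` differs."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching from an OpenAI model to DeepSeek is a one-field change:
openai_req = payload_for("gpt-4o", "Hello")
deepseek_req = payload_for("deepseek-chat", "Hello")
```

This is why a single “custom base URL + model name” field in a client is enough to reach every model behind such a routing service.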

I also use Ollama, but my MacBook Max can only run models up to 32B, which is not very good for long documents. So I would like to use a third-party provider that offers an API for official models like OpenAI and DeepSeek as a supplement.