I usually use Amazon Bedrock to host and run LLMs, because it provides a unified interface for calling many different models, including those from Anthropic, Mistral, and others.
I would love it if DEVONthink 4 supported Bedrock as a provider in the future.
Until then, I was wondering if there’s a possible workaround using litellm, which provides an OpenAI-compatible endpoint.
Is there any way I can point DEVONthink 4 at this OpenAI-compatible endpoint? I don’t see a way to specify an endpoint in the ChatGPT options, but the Ollama, GPT4All, and LM Studio local options do allow specifying one. In the meantime, I’ll experiment with whether I can use litellm endpoints there.
I have built a workaround for this, and it works really nicely. I’m selecting GPT4All in DEVONthink and then pointing it at my local proxy endpoint, provided by litellm.
I’ve configured litellm with a range of models, including Anthropic Claude and DeepSeek, provided by Amazon Bedrock. Queries get passed through, and everything works really nicely.
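For anyone wanting to replicate this, a quick way to sanity-check the proxy before pointing DEVONthink at it is to call it with the standard OpenAI Python client. The port and model alias below are just placeholders; they depend on your own litellm config:

```python
# Sanity check for a litellm proxy exposing Bedrock models via an
# OpenAI-compatible endpoint. Port and model alias are examples only;
# match them to your litellm configuration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000/v1",  # default litellm proxy port
    api_key="sk-anything",  # litellm accepts any key unless you configure one
)

resp = client.chat.completions.create(
    model="claude-3-7-sonnet",  # alias mapped to a Bedrock model in litellm
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```

If this returns a completion, DEVONthink’s GPT4All option can be pointed at the same base URL.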
The only issue I’m having is that models set up via the local-models section of DEVONthink’s AI settings don’t seem to work quite the same way. Models accessed this way don’t appear to have access to the get_contents tool. Is this something that can be added?
Screencaps of your setup would be helpful.
Is this something that can be added?
Possibly, but there’s still much to do and it’s not all AI.
Tool support is handled internally and is only enabled for known, reliable provider-model combinations.
Thanks; that explains it!
It would be great if Tool Support could be enabled by the user per model in the preferences, in the same way that other features are.
The holy grail of this concept would be if DT4 could support custom tools.
Obviously this would need to be a use-at-your-own-risk feature, just like smart rules or scripting.
But it would open a universe of capability unique to each user’s goals.
Yes; it would be great if it were able to use the MCP framework. Tools such as Cline (a VS Code plugin) are worth a look for how they implement this. Anthropic Claude’s computer-use functionality works great for this.
I’ve been able to use Anthropic Claude 3.7 in DT by selecting the GPT4All option and pointing the URL at litellm, which then calls Bedrock and on to Claude 3.7. This works well, but tool use doesn’t seem to function, and the LLM doesn’t have access to the content of the selected documents (other than possibly a summary… it seemed to tell me it could see the contents page but not much else).
GPT4All doesn’t support tool calls.
MCP for DT4 is currently feasible via AppleScript.
I currently have a working version of a Python MCP server that will run any Keyboard Maestro macro, including passing parameters.
That means you can use MCP to execute anything you can write in AppleScript, which includes almost anything that DT4 can do.
If there is interest, I will post the MCP server code after I have tested it a bit more.
[Yes, native integration with DT4 would be even better, but this works almost as well. I don’t show any DT4-specific examples here, but it works with anything you can do as a Keyboard Maestro macro, so that includes JXA, AppleScript, and basic keystroke automation.]
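In the meantime, here’s a minimal sketch of the general shape of such a server (not my actual code), assuming the official `mcp` Python SDK (FastMCP) and the standard “Keyboard Maestro Engine” AppleScript interface:

```python
# Sketch of an MCP server that runs Keyboard Maestro macros via
# AppleScript/osascript. The macro name and parameter are interpolated
# naively; escape embedded quotes before using this with untrusted input.
import subprocess
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("keyboard-maestro")

@mcp.tool()
def run_macro(name: str, parameter: str = "") -> str:
    """Run a Keyboard Maestro macro by name, optionally passing a parameter."""
    script = (
        f'tell application "Keyboard Maestro Engine" to '
        f'do script "{name}" with parameter "{parameter}"'
    )
    result = subprocess.run(
        ["osascript", "-e", script], capture_output=True, text=True
    )
    if result.returncode != 0:
        return f"Error: {result.stderr.strip()}"
    return result.stdout.strip() or "Macro executed."

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```

Anything the macro does, including running AppleScript or JXA against DT4, is then reachable from any MCP-capable client.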
That’s excellent - thank you. I’ll definitely give that a go… The ability to invoke a KM macro via MCP is super handy in any case!