Could you add support for OpenRouter to DEVONthink's AI functionality? OpenRouter would provide several benefits:
Access to multiple AI models: OpenRouter serves as a gateway to various AI models from different providers (including Anthropic, OpenAI, Google, and others).
Cost efficiency: It allows users to choose models based on performance and pricing that best suit their specific needs.
Flexibility: Users could switch between different models without changing their workflow or setup within DEVONthink.
Future-proofing: As new AI models emerge, OpenRouter integration would make it easier to access these innovations without requiring major updates to DEVONthink itself.
Just wanted to chime in and strongly support this feature request.
OpenRouter is an incredibly useful service in that it provides increased (though obviously not perfect) anonymity for cloud-based LLMs. It also gives access to a wide range of LLMs for different use cases and at different price points.
Applications that provide access to cloud AI providers via APIs commonly include the OpenRouter endpoint, since it’s a popular option and an easy way to give users access to a large variety of AI models.
But is it really anonymous? If you upload data containing your private business plans, does routing it through an intermediary and calling it "anonymous" actually fix anything?
If confidentiality is essential, Anthropic appears to have made the strongest commitment to privacy of all the LLM vendors - especially when using an API, as DT4 does. They are willing to assure compliance with HIPAA, SOC 2, and other standards.
However, OpenRouter retains no logs, and all requests from OpenRouter users are sent to a given LLM provider through the same shared API key; the requests are pooled, so to speak.
So you can use Anthropic models via OpenRouter and benefit from the positive policies you mentioned above, plus the additional anonymity of not having a personal Anthropic account tied to your identity, payment data, phone number, etc.
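Just to make that concrete: OpenRouter exposes an OpenAI-compatible chat completions endpoint, so calling an Anthropic model through it would look roughly like the sketch below. The endpoint path, header names, and model ID are assumptions based on my reading of OpenRouter's public docs, so treat the details as illustrative rather than definitive.

```swift
import Foundation

// Rough sketch only: calling an Anthropic model through OpenRouter's
// OpenAI-compatible chat completions endpoint. The endpoint path, header
// names, and model ID are assumptions based on OpenRouter's public docs.
struct ChatMessage: Codable { let role: String; let content: String }
struct ChatRequest: Codable { let model: String; let messages: [ChatMessage] }

func sendChat(apiKey: String, prompt: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "https://openrouter.ai/api/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body = ChatRequest(
        model: "anthropic/claude-3.5-sonnet",   // routed to Anthropic by OpenRouter
        messages: [ChatMessage(role: "user", content: prompt)]
    )
    request.httpBody = try JSONEncoder().encode(body)
    let (data, _) = try await URLSession.shared.data(for: request)
    return data   // raw JSON response in the OpenAI chat-completion format
}
```

The billing relationship is with OpenRouter, and the provider only ever sees OpenRouter's key, which is where the pooling argument comes from.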
Maybe… and maybe not. I've heard that story many times in other circles.
PS: The suggestion may require more UI change than you'd expect. There is some precedent for an aggregate provider, as several models in the Image Generation support require an API key from a Replicate.com account. However, the dropdown menu options are statically added by development, not generated on the fly, and it wouldn't be a good use of resources to keep manually updating the dropdown whenever OpenRouter's offerings change. Just something to consider.
Well, OpenRouter has an endpoint for fetching the list of available models… Those could then be displayed for selection in the usual dropdown, so potentially no UI change would be needed?
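Something along these lines, as a rough sketch. I'm assuming a GET on /api/v1/models returning JSON with a "data" array of objects carrying "id" and "name" fields, which is how I understand OpenRouter's documentation; the field names may differ in practice.

```swift
import Foundation

// Sketch: fetch OpenRouter's model list and reduce it to (id, label) pairs
// suitable for populating a dropdown. The endpoint path and JSON field
// names ("data", "id", "name") are assumptions from OpenRouter's docs.
struct OpenRouterModel: Codable { let id: String; let name: String? }
struct ModelList: Codable { let data: [OpenRouterModel] }

func fetchOpenRouterModels(apiKey: String) async throws -> [(id: String, label: String)] {
    var request = URLRequest(url: URL(string: "https://openrouter.ai/api/v1/models")!)
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)
    let list = try JSONDecoder().decode(ModelList.self, from: data)
    // Prefer the human-readable name; fall back to the raw model ID.
    return list.data.map { (id: $0.id, label: $0.name ?? $0.id) }
}
```

Refreshing that list when the preferences pane opens would keep the dropdown current without anyone having to hand-maintain it.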
OpenRouter could be the provider where you just allow users to access any model, without constraints. There could be an info box that points out that experimental models are included and support for them will be limited.
Again, DEVONthink isn't "an AI application" intended to run as a backend to OpenRouter, etc. There are already apps for that.
where you guys just allow users to access any model, without constraints.
Think about this from a development and support standpoint. Who do you think will receive the complaints and support requests? OpenRouter, the LLM provider itself… or us? And this notion of "any model, without constraints" is not something that should be allowed. It has to work within DEVONthink safely, privately, and effectively.
You're welcome, and apologies if I came across as blunt. Development (and support) on the inside is very different from the outside, especially when it comes to commercial software. There's a lot more to be responsible for when people's data and privacy are involved.
Everyone is entitled to their opinions, including developers and bluefrogs.
I actually think it's a positive that the approach is a bit cautious. I'm just hoping you will look into it further. If you do, you'll probably find that the concerns are not warranted in this case, and that many commercial applications already rely on OpenRouter. Then you might recognize the sizable opportunity to address power users' needs by implementing a single additional endpoint.