
Have you come across the Elephas approach to integrating DT into its LLM-interaction architecture, called ‘SuperBrain’?

It not only blends into macOS (and now iOS) well enough to feel ‘native’, it also lets you set up very dynamic and versatile RAG-style constellations, with direct access to any (set of) DEVONthink databases, but also to other parallel(!) integrations (Obsidian, Notion, LogSeq, …), any curated set of folders, web search, YouTube, custom snippets and notes, etc.
– So, basically you can throw together any kind of ad-hoc PKMS with it, with DT as part of the constellation…

One of the big pluses, aside from this layered approach and the ease of integration with the macOS workspace, is that its results come with references to all the relevant source ‘contexts’ (think ‘paragraphs’) used, alongside the LLM output – kind of making the vectorization user-readable and traceable, I guess… and similar to Perplexity’s much-valued approach.
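Just to make concrete what I mean by ‘references to the source paragraphs’ – I have no idea how Elephas implements this internally, so the following is only a generic sketch of the retrieve-then-cite pattern, with made-up source names and a toy similarity measure:

```python
# Minimal, self-contained sketch of "answer with source-paragraph references".
# Not Elephas's actual implementation -- just the general RAG pattern:
# retrieve the most relevant paragraphs, pass them to the LLM as context,
# and keep the retrieved paragraphs around so they can be shown with the answer.
from math import sqrt
from collections import Counter

paragraphs = {
    "DT-note-42 §3": "DEVONthink stores each database as a package on disk.",
    "Obsidian/daily §1": "Obsidian keeps notes as plain Markdown files in a vault.",
    "Web: example.org": "RAG retrieves supporting text before asking the model.",
}

def bow(text):
    # bag-of-words term counts (a real app would use embeddings instead)
    return Counter(text.lower().split())

def cosine(a, b):
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(question, k=2):
    q = bow(question)
    ranked = sorted(paragraphs.items(), key=lambda kv: cosine(q, bow(kv[1])), reverse=True)
    return ranked[:k]

question = "How does RAG find supporting text before asking the model?"
sources = retrieve(question)
context = "\n".join(text for ref, text in sources)
# ...here you would send `question` plus `context` to whichever LLM you use...
print("Sources shown alongside the answer:")
for ref, text in sources:
    print(f"  [{ref}] {text}")
```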
Also on the plus side: you can BYOK to it, almost without limits. This is particularly nice, as you can include general AI wholesalers like OpenRouter (which I would love to see possible/available in DT as well).
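And BYOK via OpenRouter is straightforward in principle, since OpenRouter exposes an OpenAI-compatible endpoint; roughly like this (the model id is only an example, and every app wraps this differently):

```python
# Rough sketch of a BYOK call through OpenRouter's OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's endpoint
    api_key="sk-or-...",                      # your own OpenRouter key (BYOK)
)

reply = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",      # example model id; OpenRouter routes to many providers
    messages=[{"role": "user", "content": "Summarise the attached paragraph: ..."}],
)
print(reply.choices[0].message.content)
```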

I wonder how you would compare this to the Raycast use case, given your experience – with a particular view to DT interactions?

– BTW: I think your case for interaction design being part of the ‘ergonomics’, and thus the ‘intelligence’, of any app architecture is 100% convincing, and it seems very clear to me when looking at all the different approaches in different note-takers, PKM apps, etc. The exemplary case of using Siri (i.e. natural language and voice) for any of this vs. writing an AppleScript is quite compelling and clear to me as an argument, especially with a view to real user bases… Of course, I am only speaking for the peripheral group of non-coders and non-scripters here :smiling_face: