I’m using ChatGPT with DEVONthink 4, and it seems to work quite well. I have two specific questions:
There are two icons for opening a chat window — one in the inspector bar and one on the toolbar at the top. Are they different, or do they do the same thing? I can’t see any functional difference, apart from the fact that the one on the toolbar can be detached and made to float.
I’m mainly using the AI for summarising PDFs or large notes. I know Apple offers a summarisation feature, but I don’t want to use that. The problem is that when I select a file and then open the chat window, the focus is removed from the file. So when I ask the AI to summarise the PDF, it asks me for the file name, and I have to either copy and paste it or type it manually. It would be great if I could just select a file and then ask the AI to summarise the selected item directly.
They serve the same purpose but are independent of each other.
Which “chat window” are you opening, the Chat popover?
There are multiple methods of summarization: the Chat assistant, Edit > Summarize via chat, Tools > Summarize Documents via Chat > …, the AppleScript command summarize contents of, and the Chat Suggestions > Summary placeholder.
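If you'd rather not type a prompt each time, the scripting route can also act on the current selection. A minimal, untested sketch follows; the command name is taken from the list above, but the output format and destination parameters are my assumptions, so check the script dictionary before relying on it:

```applescript
-- Minimal sketch, untested: summarize whatever is selected in the frontmost window.
-- "summarize contents of" is the command mentioned above; the "to"/"in" parameters
-- are assumptions on my part; verify them in DEVONthink's script dictionary.
tell application id "DNtp"
	set theRecords to selected records
	if theRecords is {} then error "Please select a document first."
	summarize contents of theRecords to markdown in current group
end tell
```

Saved as a script, it can be run from DEVONthink's Scripts menu.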
Thank you for your response. Am I right in thinking that all of those options summarise via Apple Intelligence? I believe ChatGPT provides better summarisation, and I’d prefer that LLM to handle summarising by default.
For example, if I open a PDF in DEVONthink’s editing window and select the Chat icon in the inspector toolbar, then enter the prompt “Summarise this”, it replies, “Which document do you want summarising?” I then have to copy and paste the name of the PDF for it to work.
It seems to me there should be an implied intent to summarise the document currently selected and visible in the window.
No. Only the standard macOS features (e.g. Edit > Writing Tools) use Apple Intelligence; everything else uses the chat engine selected in Settings > AI.
I would recommend you don’t assume things like this. Just as communication between people is greatly improved with specificity, you should follow suit with AI. “Summarize this” is not a useful prompt. At a minimum, “Summarize the selected document” is better.
Also, AI models don’t all share the same capabilities, so you’ll need to learn how to talk effectively to the one you’re using.
This got me, too. And while I don’t mind specifying in my prompt which document the AI should use, I still occasionally get a response that it doesn’t know which document I’m referring to, even though it’s highlighted/selected. It seems there needs to be a better way to know whether the AI can see the documents you want it to look at. Perhaps a contextual menu item for “Chat about these document(s)” or similar?
I believe it’s usually Claude, as that’s my default provider. As for which model, I’m not exactly sure, as I have been trying several for different prompts. However, the one I have set as default is Claude 4 Sonnet, so that’s probably the most likely.
I’ll have to pay closer attention to which model I’m using when this happens. Though, I’ll admit, that’s not my main focus when I go to use the tool, LOL. I do know I’ve used Haiku for some things.
On a side note, I really have a poor understanding of which models to use for which functions, especially with integration in DT4. I’ve read over Anthropic’s use-case descriptions, but it’s still a bit nebulous to me.