AI - not working?

Hi!
I’m using DT4 beta 1. I think I have my AI settings correct, but I must be missing something. If I click Chat in the Tools menu, the Ollama text bar is not accessible. I have Ollama installed on another computer on the network at macminim2:8080. From the screenshot you can see I can access Ollama from my browser. I have also attached a screenshot of my DT4 configuration.


The URL is not correct; by default it’s http://localhost:11434/api/chat. Usually only the host & port need to be changed if the server is running on another machine in the same network.

Hi! Thanks for the quick reply. http://localhost:11434/api/chat won’t work either (and it doesn’t — I tried it even though I knew it wouldn’t work). As noted above, I have Ollama installed on another computer on the network. I also have Open WebUI installed on that computer (macminim2); hence the reason http://macminim2:8080 works. I did try http://macminim2:11434/api/chat, but that didn’t work.

What OS?

Are you using your intranet ip for the host computer?

Or do you have port forwarding set up?

This might be a firewall issue. Does it work if DEVONthink & Ollama run on the same computer?
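One way to narrow this down is to test the Ollama API directly from the DEVONthink machine in Terminal, bypassing DEVONthink entirely. This is a sketch: the hostname macminim2 and the model name are taken as examples from this thread, so substitute your own.

```shell
# Quick reachability check from the DEVONthink Mac.
# If Ollama is listening on the network, this prints its version as JSON;
# if it hangs or is refused, the problem is network/firewall, not DEVONthink.
curl -s --max-time 5 http://macminim2:11434/api/version

# A minimal request against the same chat endpoint DEVONthink uses
# (replace "llama3.2" with a model you have pulled on the server):
curl -s http://macminim2:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{"role": "user", "content": "hello"}],
  "stream": false
}'
```

If the version check already fails, no URL in DEVONthink will work until the server is reachable.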

You must tell Ollama to be available across the local network. I had the same issue; you have to set an environment variable on the local Ollama server:

OLLAMA_HOST=0.0.0.0:11434

Or similar. Then keep the standard URI, but change http://localhost:11434/api/chat to http://&lt;server_ip&gt;:11434/api/chat.
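For reference, here is one way to set that variable on the server (a sketch; the exact method depends on how Ollama is launched on that Mac):

```shell
# If you start the server manually, export the variable first so Ollama
# binds to all interfaces instead of localhost only:
export OLLAMA_HOST=0.0.0.0:11434
ollama serve

# If Ollama runs as the macOS menu-bar app, set the variable for GUI apps
# via launchctl, then quit and relaunch Ollama:
launchctl setenv OLLAMA_HOST "0.0.0.0:11434"
```

Binding to 0.0.0.0 exposes the server to every machine on the network, so only do this on a trusted LAN.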

That said, I prefer LM Studio because I can see what is going on.


See also Ollama Local Network Setup | Restackio