DT4 b2: Local LLM/Ollama can't access/read files

So, I’ve upgraded to beta 2 and got the Ollama-compatible Pico Chat running. The AI assistant responds nicely, but it can’t access the selected PDF.

No matter which file I select and which question I ask, the LLM can’t see it. When I ask how to fix this, it starts hallucinating advice like “go to the settings” and that “only the opening of properties or contents of selected elements” needs to be activated.

I have no idea how to fix this. With beta 1, it could see the file I clicked; now it doesn’t…

And how does it work with a commercial model?

Good thing I got a Claude key… testing…

I selected a 10 KB Markdown file and asked for a summary. After a long time, Haiku replied “overloaded”. After switching to 3.7 Sonnet, it first showed “accessing …” or something like that, then actually gave me a result.
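If anyone wants to rule out the app itself, the same test works against the Anthropic API directly with the same key. A minimal sketch, assuming the official `anthropic` Python package is installed and the key is in the environment; the file name and model alias are just my placeholders:

```python
# Rule out the app: call the Anthropic API directly with the same key.
# Assumes `pip install anthropic` and ANTHROPIC_API_KEY in the environment.
from pathlib import Path

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY

# "notes.md" is a placeholder; use any small Markdown document.
text = Path("notes.md").read_text(encoding="utf-8")

response = client.messages.create(
    model="claude-3-7-sonnet-latest",  # swap in whichever model you tested
    max_tokens=512,
    messages=[{"role": "user", "content": f"Summarize this document:\n\n{text}"}],
)
print(response.content[0].text)
```

If this returns a summary but the app doesn’t, the key and model are fine and the problem sits in the integration.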

It’s weird that the local LLM can’t access the file. It can’t be a network issue, though: I get the same access error on the laptop and on the Mac mini (where the Pico Chat server is running). The provider is set to “Ollama” in the AI settings, but I’m running the Pico Chat server instead (it has a nicer UI).

Maybe I need to switch back to the OG Ollama and check whether that’s the issue (but between Ollama and Pico, the URL and local network port are the same…).
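Before switching back, a quick sanity check is to hit the chat endpoint directly and confirm both servers answer it the same way. A rough sketch, assuming the default Ollama port 11434; the host name and model name are placeholders for my setup:

```python
# Sanity check: does the server answer the standard Ollama chat endpoint?
# Host, port, and model are placeholders; adjust for your own setup.
import json
import urllib.request

def ask(base_url: str, prompt: str) -> str:
    payload = {
        "model": "llama3.2",  # whatever model is loaded on the server
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

# Same URL and port in both cases, so only the server software differs.
print(ask("http://mac-mini.local:11434", "Say hi"))
```

If Pico answers this just like Ollama but the file still doesn’t come through in the app, the plain chat endpoint isn’t the problem.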

I would not assume a local LLM has the same abilities as a commercial model, especially when it’s made available via different applications. Ollama and Pico aren’t the same thing, so implementations can and do vary.

Did you read the Getting Started > AI Explained section in the help?

Sure, I read the topics in the help section.

But yeah, I found that Ollama and Pico do vary just enough that the Pico-hosted LLM can’t access the file, while Ollama can.
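My guess (unconfirmed) at why: if the app hands the selection to the model through a tool call, a server that ignores the `tools` field of the Ollama chat API would never see the file. A rough probe, with the tool name, host, and model all hypothetical:

```python
# Hypothesis (my assumption, not confirmed): the app exposes the selected
# file via a tool call, and Pico may not implement the `tools` part of
# the Ollama chat API. This probe sends a dummy tool definition and
# checks whether the server returns a tool call.
import json
import urllib.request

payload = {
    "model": "llama3.2",  # placeholder; must be a tool-capable model
    "messages": [{"role": "user", "content": "Read the selected file."}],
    "stream": False,
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_selected_file",  # hypothetical tool name
            "description": "Return the contents of the selected file.",
            "parameters": {"type": "object", "properties": {}},
        },
    }],
}
req = urllib.request.Request(
    "http://mac-mini.local:11434/api/chat",  # same URL for both servers
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    message = json.load(resp)["message"]

# Ollama includes tool_calls in the reply when the model invokes the
# tool; a server that silently drops `tools` just answers in plain text.
print(message.get("tool_calls", "no tool call, plain text only"))
```

Note that a small model may decline to call the tool even on a server that supports it, so only a consistent difference between the two servers tells you anything.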

Which is a shame, since Ollama only offers some Terminal commands while Pico has a nice UI.

Alright, back to Ollama it is… sigh… Thanks for the help!

You’re welcome and thanks for reading the help section.