I want to use Ollama and IRIS for my AI use case. How do I connect to Ollama?
I combined @Rodolfo Pscheidt's https://github.com/RodolfoPscheidtJr/ollama-ai-iris with some files from @Guillaume Rongier's https://openexchange.intersystems.com/package/iris-rag-demo.
My own project is https://github.com/oliverwilms/ollama-ai-iris
I can run load_data.py and it connects to IRIS (same container).
When I try to run query_data.py (https://github.com/oliverwilms/ollama-ai-iris/blob/main/query_data.py), it cannot connect to Ollama:
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible.
Do I need to add something to query_data.py so it can "find" the ollama service that is defined in docker-compose.yml?
https://github.com/oliverwilms/ollama-ai-iris/blob/main/docker-compose…
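For context: inside a docker-compose network, containers reach each other by service name, so code running in the IRIS container should target http://ollama:11434 rather than localhost. A minimal sketch of one way to resolve the endpoint, assuming the compose service is named `ollama` and that an optional `OLLAMA_BASE_URL` environment variable (my own naming, not part of the project) may override it:

```python
import os

def ollama_base_url(default: str = "http://ollama:11434") -> str:
    """Resolve the Ollama endpoint.

    Uses OLLAMA_BASE_URL if set (e.g. when running on the host),
    otherwise falls back to the docker-compose service name.
    """
    return os.environ.get("OLLAMA_BASE_URL", default)

# Inside the compose network this yields http://ollama:11434;
# on the host you could export OLLAMA_BASE_URL=http://localhost:11434.
print(ollama_base_url())
```

The returned URL can then be passed as `base_url` when constructing the Ollama client in query_data.py.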
Comments
I see that @Guillaume Rongier has this in a Business Operation:
```python
def on_init(self):
    self.model = Ollama(base_url="http://ollama:11434", model="orca-mini")
```
Maybe my example can help you:
https://community.intersystems.com/post/diagnosis-searching-similaritie…
I was able to connect to Ollama with this:

```python
Settings.llm = Ollama(
    base_url="http://ollama:11434",  # tell it to connect to the Ollama container
    model="llama3.2",
    request_timeout=360.0,
)
```
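As a quick sanity check before constructing the client, you can probe the endpoint directly; a plain HTTP GET to the Ollama root URL succeeds when the server is up. A stdlib-only sketch, where the default URL is an assumption matching the compose service name used above:

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str = "http://ollama:11434",
                     timeout: float = 3.0) -> bool:
    """Return True if an HTTP GET to the Ollama base URL succeeds."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: Ollama not reachable.
        return False
```

Calling this at the top of query_data.py and printing the URL on failure makes the "Failed to connect to Ollama" error much easier to diagnose.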