Ollama Provider
The Ollama Provider integrates locally running Ollama language models into Keep.
The Ollama Provider supports querying local Ollama models for prompt-based interactions. Make sure you have Ollama installed and running locally with your desired models.
Cloud Limitation
This provider is disabled for cloud environments and can only be used in local or self-hosted environments.
Authentication
This provider requires authentication.
- host: Ollama API Host URL (required: True, sensitive: False)
In workflows
This provider can be used in workflows.
As a "step" to query data.
If you need workflow examples with this provider, please raise a GitHub issue.
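While official workflow examples are not yet available, the sketch below illustrates the kind of prompt-based interaction the provider performs: a single non-streaming request to Ollama's `/api/generate` endpoint. This is a minimal sketch of the underlying API call, not Keep's actual implementation; the host URL and the `llama3` model name are assumptions.

```python
import json
import urllib.request


def build_generate_request(host, model, prompt):
    """Build the URL and JSON payload for Ollama's /api/generate endpoint."""
    url = host.rstrip("/") + "/api/generate"
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, json.dumps(payload).encode("utf-8")


def query_ollama(host, model, prompt):
    """Send a single prompt to a local Ollama server and return its response text."""
    url, body = build_generate_request(host, model, prompt)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Assumes Ollama is running locally and the llama3 model has been pulled.
    print(query_ollama("http://localhost:11434", "llama3", "Summarize this alert."))
```

The `host` value here corresponds to the `host` parameter in the provider's authentication configuration.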
Connecting with the Provider
To use the Ollama Provider:
- Install Ollama on your system from Ollama’s website.
- Start the Ollama service.
- Pull your desired model(s) using `ollama pull model-name`.
- Configure the host URL in your Keep configuration.
Prerequisites
- Ollama must be installed and running on your system.
- The desired models must be pulled and available in your Ollama installation.
- The Ollama API must be accessible from the host where Keep is running.
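To check the last prerequisite, you can probe the Ollama API before wiring it into Keep. The sketch below queries Ollama's `/api/tags` endpoint (which lists pulled models) and reports whether the host is reachable; the default host URL is an assumption and should match your configured `host` value.

```python
import urllib.error
import urllib.request


def normalize_host(host):
    """Strip a trailing slash so endpoint paths join predictably."""
    return host.rstrip("/")


def ollama_reachable(host, timeout=3):
    """Return True if the Ollama API answers at the given host URL."""
    try:
        url = normalize_host(host) + "/api/tags"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    # Assumed default Ollama host; adjust to your configuration.
    print(ollama_reachable("http://localhost:11434"))
```

If this returns `False` from the host where Keep runs, check that the Ollama service is started and that no firewall blocks the port.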