The Ollama Provider supports querying local Ollama models for prompt-based interactions. Make sure you have Ollama installed and running locally with your desired models.

Cloud Limitation

This provider is disabled in cloud environments and can only be used in local or self-hosted deployments.

Authentication

This provider requires authentication.

  • host: Ollama API Host URL (required: True, sensitive: False)
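As a sketch of what the configuration might look like (the exact schema depends on how you configure providers in your Keep setup; `http://localhost:11434` is Ollama's default API address):

```yaml
# Hypothetical provider configuration — field layout is illustrative.
authentication:
  host: "http://localhost:11434"  # Ollama's default API address
```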

In workflows

This provider can be used in workflows.

As a "step" to query data, for example:

steps:
    - name: Query ollama
      provider: ollama
      config: "{{ provider.my_provider_name }}"
      with:
        prompt: {value}  
        model: {value}  
        max_tokens: {value}  
        structured_output_format: {value}  
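Filled in with concrete values, a step might look like the sketch below. The model name, prompt text, and `{{ alert.name }}` templating are illustrative assumptions; the model must already be pulled in your Ollama installation, and `structured_output_format` is optional:

```yaml
steps:
    - name: Summarize alert
      provider: ollama
      config: "{{ provider.my_provider_name }}"
      with:
        prompt: "Summarize this alert in one sentence: {{ alert.name }}"
        model: "llama3"
        max_tokens: 256
```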

If you need workflow examples with this provider, please raise a GitHub issue.

Connecting with the Provider

To use the Ollama Provider:

  1. Install Ollama on your system from Ollama's website (https://ollama.com).
  2. Start the Ollama service.
  3. Pull your desired model(s) using ollama pull <model-name>.
  4. Configure the host URL in your Keep configuration.
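The steps above can be sketched as a short command sequence. This assumes Ollama's default port and uses "llama3" as an example model name:

```shell
# Start the Ollama service (skip if it is already running, e.g. as a system service).
ollama serve &

# Pull the model you intend to reference in the workflow's `model` field.
ollama pull llama3

# Verify the API is reachable and lists the pulled model.
curl http://localhost:11434/api/tags
```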

Prerequisites

  • Ollama must be installed and running on your system.
  • The desired models must be pulled and available in your Ollama installation.
  • The Ollama API must be accessible from the host where Keep is running.
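To make the parameters concrete, here is a minimal Python sketch of the kind of request such a provider could send to Ollama's HTTP API. This is not Keep's implementation — it assumes the standard Ollama /api/generate endpoint, a running local instance, and that `max_tokens` maps to Ollama's `num_predict` generation option:

```python
import json
import urllib.request

# Default Ollama API address; matches the `host` authentication parameter.
OLLAMA_HOST = "http://localhost:11434"


def build_generate_request(prompt, model, max_tokens=None,
                           structured_output_format=None):
    """Build the JSON body for a non-streaming /api/generate call.

    Parameter names mirror the provider's `with` fields; `max_tokens` is
    assumed to map to Ollama's `num_predict` option, and
    `structured_output_format` to Ollama's `format` field.
    """
    body = {"model": model, "prompt": prompt, "stream": False}
    if max_tokens is not None:
        body["options"] = {"num_predict": max_tokens}
    if structured_output_format is not None:
        body["format"] = structured_output_format  # e.g. "json" or a JSON schema
    return body


def query_ollama(prompt, model="llama3", **kwargs):
    """Send the request. Requires a running Ollama instance with the model pulled."""
    data = json.dumps(build_generate_request(prompt, model, **kwargs)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The request-building logic is separated from the network call so the payload shape can be inspected (or tested) without a running server.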