Supported Providers
vLLM Provider
The vLLM Provider integrates vLLM-deployed language models into Keep.
It supports querying models deployed with vLLM for prompt-based interactions.
Authentication
This provider requires an API endpoint; an API key is needed only if your vLLM deployment enforces authentication.
- api_url: vLLM API endpoint URL (required: True, sensitive: False)
- api_key: Optional API key if your vLLM deployment requires authentication (required: False, sensitive: True)
In workflows
This provider can be used in workflows.
Use it as a “step” to query data, as in the following workflow example:
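A minimal sketch of such a step, assuming a provider configured under the name `my_vllm`; the parameter names (`prompt`, `model`, `max_tokens`) and the model identifier are illustrative assumptions, not a definitive schema:

```yaml
workflow:
  id: vllm-summarize-alert
  description: Query a vLLM-deployed model from a workflow step (illustrative sketch)
  steps:
    - name: ask-vllm
      provider:
        type: vllm
        config: "{{ providers.my_vllm }}"
        with:
          prompt: "Summarize the following alert: {{ alert.description }}"
          model: "mistralai/Mistral-7B-Instruct-v0.2"  # placeholder model name
          max_tokens: 256
```

The step's output can then be consumed by later actions in the workflow, e.g. to post the model's summary to a notification channel.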
Connecting with the Provider
To connect to a vLLM deployment:
- Deploy your vLLM instance or obtain the API endpoint of an existing deployment
- Configure the API URL in your provider configuration
- If your deployment requires authentication, configure the API key
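The steps above can be captured in a provider configuration; the snippet below is a sketch assuming a local vLLM deployment on port 8000 (the URL, provider name `my_vllm`, and key value are placeholders):

```yaml
providers:
  my_vllm:
    authentication:
      api_url: "http://localhost:8000"  # vLLM API endpoint URL (required)
      api_key: "sk-placeholder"         # omit if your deployment has no auth
```

Since `api_key` is marked sensitive, prefer injecting it from a secret store or environment variable rather than committing it to the configuration file.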