The vLLM Provider enables integration of vLLM-deployed language models into Keep.
The vLLM Provider supports querying language models deployed with vLLM for prompt-based interactions.
This provider requires authentication.
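The exact configuration fields depend on the Keep provider definition; as a rough guide, a vLLM deployment is typically reached through two values (the names below are assumptions, not the authoritative list):

- api_url: the base URL of the vLLM HTTP endpoint, for example http://localhost:8000/v1.
- api_key: an API key, if the vLLM server was started with one; otherwise it can be left empty.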
This provider can be used in workflows.
As a "step" to query data, for example:
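Below is a minimal step sketch. The parameter names under `with` (prompt, model, max_tokens, temperature) and the provider reference name `my-vllm-deployment` are illustrative assumptions; check the provider's parameter list for the authoritative set.

```yaml
steps:
  - name: query-vllm
    provider:
      type: vllm
      # Reference to a previously configured vLLM provider (name is an example)
      config: "{{ providers.my-vllm-deployment }}"
      with:
        # Prompt sent to the model served by vLLM
        prompt: "Summarize the following alert: {{ alert.description }}"
        # Model name as exposed by the vLLM deployment (example value)
        model: "mistralai/Mistral-7B-Instruct-v0.2"
        max_tokens: 256
        temperature: 0.2
```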
Check the following workflow example:
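The following is a sketch of a complete workflow, assuming a vLLM provider named `my-vllm-deployment` has already been configured in Keep; the trigger, alert fields, and model name are illustrative.

```yaml
workflow:
  id: vllm-alert-summary
  description: Summarize incoming alerts with a vLLM-served model
  triggers:
    # Run the workflow whenever an alert is received (illustrative trigger)
    - type: alert
  steps:
    - name: summarize-alert
      provider:
        type: vllm
        config: "{{ providers.my-vllm-deployment }}"
        with:
          prompt: "Summarize this alert in one sentence: {{ alert.name }}"
          model: "mistralai/Mistral-7B-Instruct-v0.2"
          max_tokens: 128
```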
To connect to a vLLM deployment:
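The steps below are a general outline; the exact field names in Keep's provider form may differ:

1. Deploy your model with vLLM and expose its HTTP API (vLLM serves an OpenAI-compatible endpoint by default).
2. Note the API URL of the deployment and, if the server was started with an API key, the key itself.
3. Add a vLLM provider in Keep and fill in the API URL (and the API key, if required).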