The Airflow provider integration allows you to send alerts (e.g. DAG failures) from Airflow to Keep via webhooks.
Apache Airflow is an open-source tool for programmatically authoring, scheduling, and monitoring data pipelines. Airflow’s extensible Python framework enables you to build workflows that connect with virtually any technology. When working with Airflow, it’s essential to monitor the health of your DAGs and tasks to ensure that your data pipelines run smoothly. The Airflow Provider integration allows seamless communication between Airflow and Keep, so you can forward alerts, such as task failures, directly to Keep via webhook configurations.
To connect Airflow to Keep, configure Airflow to send alerts using Keep's webhook. You must provide:

- Keep webhook URL (e.g. `https://api.keephq.dev/alerts/event/airflow`)
- Keep API key

A common method to integrate Airflow with Keep is by configuring alerts through Airflow Callbacks. For instance, when an Airflow task fails, a callback can send an alert to Keep via the webhook.
There are several steps to implement this:
Structure your alert payload with the following information:
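A payload might look like the sketch below; the field names are illustrative assumptions rather than a fixed schema, so adapt them to the attributes you want to surface in Keep.

```python
# Illustrative alert payload -- field names are assumptions, adapt as needed.
alert_payload = {
    "name": "Airflow task failure",                            # short, human-readable title
    "message": "Task 'transform' failed in DAG 'daily_etl'",   # failure details
    "status": "firing",                                        # alert state
    "service": "airflow",                                      # originating service
    "severity": "critical",                                    # urgency of the failure
    "lastReceived": "2024-01-01T00:00:00Z",                    # ISO-8601 event timestamp
}
```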
To send alerts to Keep, configure the webhook URL and API key. Below is an example of how to send an alert using Python:
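This is a minimal sketch, assuming the `requests` library is installed and that Keep accepts the API key in an `x-api-key` header; verify the header name against your Keep deployment.

```python
import os

import requests

KEEP_WEBHOOK_URL = "https://api.keephq.dev/alerts/event/airflow"


def send_alert_to_keep(payload: dict) -> None:
    """POST an alert payload to Keep's Airflow webhook endpoint."""
    api_key = os.environ["KEEP_API_KEY"]  # fails fast if the key is not configured
    response = requests.post(
        KEEP_WEBHOOK_URL,
        json=payload,
        headers={
            "Content-Type": "application/json",
            "x-api-key": api_key,  # header name is an assumption -- check your Keep setup
        },
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors in the Airflow task log
```

Calling `send_alert_to_keep(alert_payload)` with the payload from the previous step forwards a single alert to Keep.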
Note: You need to set up the `KEEP_API_KEY` environment variable with your Keep API key.
Now, configure the callback so that an alert is sent to Keep when a task fails. You can attach this callback to one or more tasks in your DAG as shown below:
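The sketch below reuses the hypothetical `send_alert_to_keep` helper and payload fields from the previous examples; the DAG and task names are placeholders, and the `schedule` argument assumes Airflow 2.4 or later.

```python
from datetime import datetime, timezone

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_keep_on_failure(context):
    """Failure callback: build an alert from the task context and forward it to Keep."""
    task_instance = context["task_instance"]
    send_alert_to_keep(
        {
            "name": f"Airflow task failed: {task_instance.dag_id}.{task_instance.task_id}",
            "message": str(context.get("exception", "Task failed")),
            "status": "firing",
            "service": "airflow",
            "severity": "critical",
            "lastReceived": datetime.now(timezone.utc).isoformat(),
        }
    )


def transform():
    raise ValueError("Simulated failure")  # force the callback to fire for testing


with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule=None,  # manual trigger only; older Airflow versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(
        task_id="transform",
        python_callable=transform,
        on_failure_callback=notify_keep_on_failure,  # invoked when this task fails
    )
```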
After setting up the above configuration, any failure in your Airflow tasks will trigger an alert that is sent to Keep via the configured webhook. You can then view, manage, and respond to these alerts using the Keep dashboard.
This provider can’t be used as a “step” or “action” in workflows. If you want to use it, please let us know by creating an issue in the GitHub repository.