
Trigger DAG runs with Airflow REST API

Published June 24, 2019 in data

This article and its code are applicable to Airflow 1.10.13. Hopefully the REST API will mature as Airflow is developed further, and the authentication methods will become easier.

The experimental REST API does not use the Airflow role-based users. Instead, it currently requires a SQLAlchemy models.User object whose data is saved in the database.

The code shown below was the easiest way I found to set up this kind of user, although it feels like a hacky solution to me. I set up Airflow with the password_auth authentication backend enabled, so I needed to set a password when I created the user.

Obviously don’t run the code before running airflow initdb. If you are running Airflow with the KubernetesExecutor, this code can be run in one of the Airflow containers using kubectl exec.

from airflow import models, settings
from airflow.contrib.auth.backends.password_auth import PasswordUser

user = PasswordUser(models.User())
user.username = 'test_user'
user.email = 'test_user@mydomain'
user.password = 'tH1sIsAP@ssw0rd'
session = settings.Session()

user_exists = session.query(models.User).filter_by(username=user.username).first() is not None
if not user_exists:
    session.add(user)
    session.commit()
session.close()

This is how to trigger a DAG run for a DAG with id my_dag. If running Airflow with the KubernetesExecutor, remember to forward the webserver port to localhost using kubectl port-forward. I’m using the default port 8080 in this example.

import requests
import json
from pprint import pprint

# Configuration key/value pairs go in the "conf" key of the request body
result = requests.post(
    "http://localhost:8080/api/experimental/dags/my_dag/dag_runs",
    data=json.dumps({"conf": {"key": "value"}}),
    auth=("test_user", "tH1sIsAP@ssw0rd"))
pprint(result.json())

According to the tests I performed, the data parameter is required. Its "conf" key is how you pass configuration key/value pairs to a DAG run, and the command line interface (CLI) version looks similar. Passing the credentials via the auth parameter worked best for me.
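On the DAG side, the configuration sent in the POST body surfaces on the DagRun object. As a minimal sketch (the task callable and the "key" name are illustrative, not from the article): in Airflow 1.10, a PythonOperator with provide_context=True receives the DagRun in kwargs['dag_run'], and its .conf attribute holds the dictionary you posted.

```python
# Hypothetical task callable for a PythonOperator(provide_context=True).
# kwargs['dag_run'].conf contains the key/value pairs from the POST body;
# it can be None when the run was triggered without any configuration.
def print_conf(**kwargs):
    dag_run = kwargs.get('dag_run')
    conf = dag_run.conf if dag_run is not None and dag_run.conf else {}
    # Fall back to a default when the key was not supplied.
    return conf.get('key', 'default')
```

The same conf dictionary is what you would pass on the command line with airflow trigger_dag's --conf flag.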

Tags: airflow, python
