Accessing configuration parameters passed to Airflow through CLI

If you are trying to access the Airflow system-wide config (instead of a DAG run config), the following might help:

First, import the config object:

from airflow.configuration import conf

Then read a value wherever you need it:

conf.get("core", "my_key")

Optionally, set a value with:

conf.set("core", "my_key", "my_val")
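Putting the pieces together, a minimal sketch (conf.get and conf.set are the methods shown above; "my_key" and "my_val" are placeholders, not real Airflow settings):

from airflow.configuration import conf

# Read a real system-wide setting from the [core] section of airflow.cfg
dags_folder = conf.get("core", "dags_folder")

# Override a value in memory for the current process; "my_key" and
# "my_val" are placeholders, not actual Airflow settings
conf.set("core", "my_key", "my_val")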

This builds on the answer provided by devj.

  1. In airflow.cfg, the following property should be set to True: dag_run_conf_overrides_params=True

  2. While defining the PythonOperator, pass the argument provide_context=True. For example:

get_row_count_operator = PythonOperator(task_id='get_row_count', python_callable=do_work, dag=dag, provide_context=True)
  3. Define the Python callable (note the use of **kwargs):
def do_work(**kwargs):
    table_name = kwargs['dag_run'].conf.get('table_name')
    # Rest of the code
  4. Invoke the DAG from the command line:
airflow trigger_dag read_hive --conf '{"table_name":"my_table_name"}'
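Putting steps 2-4 together, a minimal Airflow 1.x-style sketch (the dag_id read_hive and the table_name key come from the example above; the start_date and the print statement are illustrative):

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

def do_work(**kwargs):
    # dag_run.conf holds the JSON document passed via --conf on the CLI
    table_name = kwargs['dag_run'].conf.get('table_name')
    print("Counting rows for table: {}".format(table_name))

dag = DAG(
    dag_id='read_hive',
    start_date=datetime(2018, 1, 1),
    schedule_interval=None,  # externally triggered only
)

get_row_count_operator = PythonOperator(
    task_id='get_row_count',
    python_callable=do_work,
    provide_context=True,  # required on Airflow 1.x to receive **kwargs
    dag=dag,
)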

I have found this discussion to be helpful.


There are two ways to access the params passed in the airflow trigger_dag command.

  1. In the callable passed to the PythonOperator, access the params as kwargs['dag_run'].conf.get('account_list').

  2. If the field where you are using the value is a templated field, you can use {{ dag_run.conf['account_list'] }} (see the sketch after this list).

Note that schedule_interval for the externally triggerable DAG must be set to None for the above approaches to work.
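For the templated-field approach, a minimal sketch using a BashOperator, whose bash_command is a templated field (the dag_id, task_id, and echo command are illustrative):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id='process_accounts',
    start_date=datetime(2018, 1, 1),
    schedule_interval=None,  # must be None, per the note above
)

# bash_command is templated, so the Jinja expression is rendered at
# runtime with access to dag_run.conf
print_accounts = BashOperator(
    task_id='print_accounts',
    bash_command="echo {{ dag_run.conf['account_list'] }}",
    dag=dag,
)

Triggered, for example, with:

airflow trigger_dag process_accounts --conf '{"account_list":"acc1,acc2"}'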

Tags: Python, Airflow