concept task instance context in category apache airflow

appears as: task instance context

This is an excerpt from Manning's book Data Pipelines with Apache Airflow MEAP V05.

Instead, the task context variables can be passed as arguments to the given function. One side note: we must set the argument provide_context=True in order to provide the task instance context. Running the PythonOperator without provide_context=True will still execute the callable, but no task context variables will be passed to it.
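As a minimal sketch of this (assuming Airflow 1.x, where provide_context is still required; the DAG id, task id, and function name here are hypothetical):

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def _print_context(**context):
    # With provide_context=True, Airflow passes the task instance context
    # as keyword arguments, e.g. execution_date, ds, task_instance.
    print(context["execution_date"])


dag = DAG(
    dag_id="print_context",
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
)

print_context = PythonOperator(
    task_id="print_context",
    python_callable=_print_context,
    provide_context=True,  # without this, no context variables are passed
    dag=dag,
)

Leaving out provide_context=True would still run _print_context, but the context keyword arguments would be empty.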

In this example we instantiate the BashOperator and call its execute() function with an empty context (an empty dict). When Airflow runs your operator in a live setting, a number of things happen before and after, such as rendering templated variables, setting up the task instance context, and providing it to the operator. In this test we are not running in a live setting but calling the execute() method directly. This is the “lowest-level” function you can call to run an operator; it is the method every operator implements and that contains the operator’s actual work, as shown in Chapter 7. The BashOperator above does not need any task instance context, so we provide it with an empty context. If the test did depend on values from the task instance context, we could fill the dict with the required keys and values[39].
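A minimal test sketch along these lines (assuming pytest and the Airflow 1.x BashOperator; the test name and bash command are hypothetical):

from airflow.operators.bash_operator import BashOperator


def test_bash_operator_execute():
    # Instantiate the operator and call execute() directly, passing an
    # empty dict as the task instance context. Nothing in this command
    # depends on context values, so an empty dict is enough.
    operator = BashOperator(task_id="test", bash_command="echo hello")
    operator.execute(context={})

If the command under test used templated or context-dependent values, the dict passed to execute() would need the corresponding keys filled in.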
