Passing data using the TaskFlow API - Apache Airflow Tutorial
From the course: Apache Airflow Essential Training
- [Instructor] In this demo, we'll build the same DAG as in the previous demo, but this time we'll use the TaskFlow API. You can see I have the passing data with TaskFlow API function defined on line 19, and it's annotated with the @dag decorator. I also have the individual tasks defined as Python functions that are local to the DAG function. The first Python function is get_order_prices, where I set up my order price dictionary. Notice the @task annotation; all I do is return the order price data. Now, you might say that this is just an ordinary Python function that returns a value, and yes, indeed it is. This return value will be added to the XCom backend and will be available to other tasks. But all of the XCom usage is abstracted away from you when you use the TaskFlow API; you just use these Python functions as ordinary Python functions. Next we have the compute_sum and compute_average Python…
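To make the structure concrete, here is a minimal sketch of the kind of DAG the narration describes, written against Airflow 2.x (2.4 or later for the schedule parameter). The DAG id, task names, and sample order prices are assumptions based on the narration, not the instructor's exercise files.

# Minimal TaskFlow API sketch; names and sample values are illustrative.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="passing_data_with_taskflow_api",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
)
def passing_data_with_taskflow_api():

    @task
    def get_order_prices() -> dict:
        # Returning a value pushes it into the XCom backend automatically.
        order_price_data = {
            "order_1": 234.45,
            "order_2": 10.00,
            "order_3": 34.77,
        }
        return order_price_data

    @task
    def compute_sum(order_price_data: dict) -> float:
        # The dictionary arrives via XCom; TaskFlow handles the pull for us.
        return sum(order_price_data.values())

    @task
    def compute_average(order_price_data: dict) -> float:
        return sum(order_price_data.values()) / len(order_price_data)

    prices = get_order_prices()
    compute_sum(prices)
    compute_average(prices)


passing_data_with_taskflow_api()

Note that prices is not an actual dictionary at parse time; it is a placeholder (an XComArg) that Airflow resolves to the upstream task's return value at run time, which is exactly the XCom plumbing the instructor says is abstracted away.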
Contents
- Prerequisites (39s)
- Quick Airflow setup overview (3m 27s)
- DAG using PythonOperators (3m 33s)
- DAG using TaskFlow (3m 55s)
- Passing data using XCom with operators (5m 37s)
- Passing data using the TaskFlow API (4m 41s)
- Tasks with multiple outputs (5m 40s)
- Passing multiple outputs in TaskFlow (1m 47s)