From the course: Databricks Certified Data Engineer Associate Cert Prep
Automated pipelines with Delta Live Tables
- [Instructor] Delta Live Tables are a new way of scheduling workflows because you're able to use all of the advanced features of Delta Live Tables, and you can create a new pipeline by simply selecting Create. Notice you'll have to provide a pipeline name. You can choose which product edition to use, Core, Pro, or Advanced, and there's documentation right there to help you decide which one to pick. Then you have the option of a continuous or a triggered pipeline. The difference is that a continuous pipeline, with streaming data, is always ingesting new data as it arrives, whereas a triggered pipeline runs on an event. You would also pass in the source code path. For example, a repo that's connected to your workspace could be a good place to store your source code. Then, in terms of the destination, potentially the Unity Catalog. And then, in terms of compute, you would select what kind of compute…
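To make the source code step more concrete, here is a minimal sketch of what the file at the pipeline's source code path might contain, assuming a Python notebook. The table names and the input path are illustrative placeholders, not values from the video.

```python
# Minimal Delta Live Tables source sketch (Python notebook referenced by the
# pipeline's source code path). Table names and the input path are placeholders.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested incrementally from cloud storage")
def raw_orders():
    # Auto Loader (cloudFiles) picks up new files as they arrive, which is
    # what a continuous pipeline keeps processing; `spark` is provided by the
    # pipeline runtime.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/demo/raw/orders/")  # placeholder input path
    )

@dlt.table(comment="Orders with a basic quality filter applied")
def clean_orders():
    # Reads the streaming output of raw_orders within the same pipeline.
    return dlt.read_stream("raw_orders").where(col("order_id").isNotNull())
```

When the pipeline runs, it resolves the dependency between raw_orders and clean_orders and materializes them in order, either continuously or on each triggered run, depending on the mode you selected when creating the pipeline.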