Why you want Databricks Auto Loader
Auto Loader is one of Databricks' standout features, and this post will introduce you to why you'd want to use it to address common data ingestion challenges.
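As a taste of what that looks like, here is a minimal sketch of Auto Loader. It is exposed as the "cloudFiles" streaming source, which is only available on Databricks Runtime, so this is illustrative rather than something you can run on open-source Spark; the helper name and paths are hypothetical.

```python
def build_auto_loader_stream(spark, input_path, schema_location):
    """Return a streaming DataFrame that incrementally picks up new JSON
    files as they land in input_path, tracking inferred schema state under
    schema_location."""
    return (spark.readStream
            .format("cloudFiles")                            # the Auto Loader source
            .option("cloudFiles.format", "json")             # format of the incoming files
            .option("cloudFiles.schemaLocation", schema_location)
            .load(input_path))
```

You would typically write the resulting stream out with `writeStream` to a Delta table, supplying a checkpoint location so ingestion resumes where it left off.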
This post is for anyone unaware that interactive Databricks clusters can be deleted 30 days after termination unless the cluster is "pinned".
In part 4, the final part of this beginner's mini-series on how to handle bad data, we will look at how we can retain the flexibility to capture bad data while proceeding uninterrupted. Specifically, we'll use the "badRecordsPath" option in Azure Databricks, which has been available since Databricks Runtime 3.0.
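In outline, using the option looks like the sketch below. "badRecordsPath" is a Databricks-specific option (Databricks Runtime 3.0+), so this is something you'd run on Databricks rather than open-source Spark; the helper name and paths are hypothetical.

```python
def read_json_capturing_bad_records(spark, source_path, bad_records_path, schema):
    """Read JSON with the given schema; records that fail to parse are
    written out under bad_records_path instead of failing the job."""
    return (spark.read
            .schema(schema)
            .option("badRecordsPath", bad_records_path)  # Databricks-only option
            .json(source_path))
```

On Databricks, the offending records are written as JSON (along with the parsing exception) under a timestamped folder inside `bad_records_path`, and the main read carries on with the good rows.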
In the third instalment of this 4-part mini-series, we will look at how we can handle bad data using PERMISSIVE mode. It is the default mode when reading data with the DataFrameReader, but there's a bit more to it than simply replacing bad data with NULLs.
In the second part, we'll continue to focus on the DataFrameReader class and look at the DROPMALFORMED option, which removes bad data.