Series

Part 3 - Pre-commit hooks - SQL Linting

It can be a challenge to keep code formatted consistently, and where consistency is lacking, errors soon follow.

In part 3 of this pre-commit hooks series, we’ll focus on how we can use pre-commit hooks in Azure Repos to automatically check SQL scripts for stylistic and programmatic errors.
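
To give a flavour of the approach, a SQL linter such as SQLFluff publishes ready-made pre-commit hooks. A minimal `.pre-commit-config.yaml` sketch (the rev shown is an assumption; pin whichever release you actually use):

```yaml
repos:
  - repo: https://github.com/sqlfluff/sqlfluff
    rev: 3.0.7  # assumed version; pin the release you actually use
    hooks:
      - id: sqlfluff-lint  # flag stylistic and programmatic errors in staged SQL
      - id: sqlfluff-fix   # optionally auto-fix the violations it safely can
```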

Part 2 - Detect secrets in Azure repos

Even with the advent of cloud computing and all manner of technology enhancements, exposing secrets seems to be a problem that won’t go away.

Without the right controls in place, developers can leak secrets that can cause financial and reputational damage to an organisation.

In part 2, we’ll look at how we can use a pre-commit hook to try to detect secrets in our code.
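
By way of illustration, one widely used option is Yelp’s detect-secrets, which ships its own pre-commit hook. A minimal sketch (the rev and baseline filename are assumptions):

```yaml
repos:
  - repo: https://github.com/Yelp/detect-secrets
    rev: v1.5.0  # assumed version; pin the release you actually use
    hooks:
      - id: detect-secrets
        args: ['--baseline', '.secrets.baseline']  # skip findings already audited into the baseline
```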

Part 1 - Pre-commit hooks in Azure repos

Having standards for code development is a necessity, but making sure those standards are followed can be a challenge.

As human beings, we make mistakes and can overlook standards at the very moment we need to apply them.

Central to that challenge is making sure standards are applied before changes are committed.

In this series, we’ll look at taking on that challenge with pre-commit hooks. We’ll explore what pre-commit hooks are, why we might want to use them and how they work.
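
To give an early taste, pre-commit is configured through a `.pre-commit-config.yaml` file at the root of the repo. A minimal sketch using the project’s own starter hooks (the rev shown is an assumption):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0  # assumed version; pin the release you actually use
    hooks:
      - id: trailing-whitespace  # strip trailing whitespace from staged files
      - id: end-of-file-fixer    # ensure files end with exactly one newline
```

Running `pre-commit install` then wires the configured hooks into `.git/hooks/pre-commit`, so they run automatically on every `git commit`.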

Part 4 - Bad records path

In part 4, the final part of this beginner’s mini-series on how to handle bad data, we will look at how we can retain the flexibility to capture bad data and proceed uninterrupted.

Specifically, we’ll use the “badRecordsPath” option in Azure Databricks, which has been available since Databricks Runtime 3.0.
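
As a preview, here is a minimal PySpark sketch of the option (the schema and paths are assumptions; `spark` is provided by the Databricks notebook). Rows that fail to parse are written as JSON under the badRecordsPath location, and the returned DataFrame contains only the rows that parsed cleanly:

```python
# Runs in an Azure Databricks notebook, where `spark` is already defined.
df = (spark.read
      .schema("id INT, name STRING")                    # assumed schema
      .option("badRecordsPath", "/tmp/badRecordsPath")  # assumed path for the bad rows
      .csv("/tmp/input.csv"))                           # assumed input file

df.show()  # only the records that parsed successfully
```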

Part 3 - Permissive

In the 3rd instalment of this 4-part mini-series, we will look at how we can handle bad data using PERMISSIVE mode. It is the default mode when reading data with the DataFrameReader, but there’s a bit more to it than simply replacing bad data with NULLs.
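
To hint at that extra behaviour: in PERMISSIVE mode the reader can also keep the raw text of each malformed row in a dedicated corrupt-record column, rather than silently dropping it. A minimal sketch (the schema, paths and column name are assumptions):

```python
from pyspark.sql.types import IntegerType, StringType, StructField, StructType

# The extra string column receives the raw text of any row that fails to parse;
# the row's other columns are set to NULL.
schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("name", StringType(), True),
    StructField("_corrupt_record", StringType(), True),
])

df = (spark.read
      .schema(schema)
      .option("mode", "PERMISSIVE")                            # the default, shown explicitly
      .option("columnNameOfCorruptRecord", "_corrupt_record")  # assumed column name
      .csv("/tmp/input.csv"))                                  # assumed input file
```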