The Data Governance Problem

Data governance is important for organisations of all types and sizes. Cheaper and more accessible storage does not remove the need to think critically about how data is created and consumed, or to implement solutions that keep that data accessible and trustworthy. Good data governance policies bring improved operational efficiency, fewer data-related bottlenecks and better security across the data environment. Yet "good data governance" is a nebulous term, and it can be difficult to gauge whether it has been thoroughly and successfully implemented, not least because data governance is not a one-off, fire-and-forget deployment. Continuing to reap the benefits requires ongoing, organisation-wide commitment.

Data Governance Implementation

Most of the steps towards better organisational data governance are non-technical: clarifying processes, formalising areas of responsibility and educating staff on best practices. One of the most impactful ways to get an organisation on track, data-wise, is to establish a Data Governance Committee, whose role is to make the top-level decisions about vision, strategy and implementation for all things data. Another important step is to create and empower the role of the Data Steward. These individuals are the on-the-ground shock troops: identifying data quality issues, ensuring analytics are fit for purpose and putting into practice the changes mandated by the committee.

Although there is no single technical solution to improving data governance, a number of tools can help the process along, providing clarity, support and quality-of-life improvements for the people implementing it.

Data Ingestion Transparency

Data ingestion and integration pulls data together from various proprietary formats and consolidates it into a structured enterprise data warehouse. The more teams or departments running their own management systems, the more transformation is needed before the data fits a single consolidated schema. This work happens on every data refresh, so having reliable, up-to-date data depends on being confident in your Extract, Load and Transform (ELT) process.
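
To make the ELT idea concrete, here is a minimal Python sketch. It assumes CSV exports from two hypothetical source systems (crm_export.csv and erp_export.csv) with different column names, loads them into staging tables as-is, then transforms them into a single consolidated customer table. Every file name, column and table in it is invented for illustration.

```python
import csv
import sqlite3

# Minimal ELT sketch: load raw exports from two hypothetical source systems
# into staging tables, then transform them into one consolidated schema.
# File names, columns and tables are assumptions for the example.

def load_raw(conn, table, path, columns):
    """Load a CSV export into a staging table unchanged (the 'L' in ELT)."""
    conn.execute(f"DROP TABLE IF EXISTS {table}")
    conn.execute(f"CREATE TABLE {table} ({', '.join(c + ' TEXT' for c in columns)})")
    with open(path, newline="") as f:
        rows = [tuple(r[c] for c in columns) for r in csv.DictReader(f)]
    conn.executemany(
        f"INSERT INTO {table} VALUES ({', '.join('?' for _ in columns)})", rows
    )

def transform(conn):
    """Consolidate both staging tables into one customer table with a shared schema."""
    conn.execute("DROP TABLE IF EXISTS dim_customer")
    conn.execute(
        """
        CREATE TABLE dim_customer AS
        SELECT cust_id AS customer_id, cust_name AS customer_name, 'crm' AS source_system
        FROM staging_crm
        UNION ALL
        SELECT customer_no, full_name, 'erp' FROM staging_erp
        """
    )

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load_raw(conn, "staging_crm", "crm_export.csv", ["cust_id", "cust_name"])
    load_raw(conn, "staging_erp", "erp_export.csv", ["customer_no", "full_name"])
    transform(conn)
    conn.commit()
```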

An important responsibility of a Data Steward is implementing the structural changes needed to improve data quality. An all-purpose data ingestion and integration tool such as Loome Integrate gives them full control over the data pipelines throughout the Extract, Load and Transform process: they can see which tasks are being performed, in what order and at what time, keeping the vital data refresh process firmly in hand.
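
As a generic illustration of that kind of transparency (not Loome Integrate's actual interface), the sketch below defines a refresh as an explicit, ordered list of named tasks and logs when each one starts and finishes. The task names are invented placeholders.

```python
import logging
from datetime import datetime

# Generic sketch: a refresh job defined as an ordered list of named tasks,
# with each step logged so it is always clear what ran, in what order, and when.

logging.basicConfig(level=logging.INFO, format="%(message)s")

def run_refresh(tasks):
    for name, task in tasks:  # tasks run in the declared order
        logging.info("%s  starting task: %s", datetime.now().isoformat(), name)
        task()
        logging.info("%s  finished task: %s", datetime.now().isoformat(), name)

refresh_tasks = [
    ("extract CRM export", lambda: print("extracting...")),        # placeholder steps
    ("load staging tables", lambda: print("loading...")),
    ("transform to warehouse schema", lambda: print("transforming...")),
]

run_refresh(refresh_tasks)
```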

Data Catalogue + Glossary

One principle of good data governance is context. Knowing the origin of any given data point, which source system it came from and what transformations or formulas have been applied ensures that any data-based decision making is well informed.
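
As a rough illustration, the following Python sketch records that kind of context for a single warehouse column. The field names and the example entry are assumptions made up for this example, not any particular catalogue's schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the context a data catalogue might hold for one column:
# where it came from and what was done to it. All values are invented examples.

@dataclass
class ColumnLineage:
    column: str                  # name of the column in the warehouse
    source_system: str           # system the data originated from
    source_field: str            # original field name in that system
    transformations: list = field(default_factory=list)  # formulas/steps applied

annual_revenue = ColumnLineage(
    column="annual_revenue",
    source_system="ERP",
    source_field="rev_total",
    transformations=["converted cents to dollars", "summed by financial year"],
)

print(f"{annual_revenue.column} comes from "
      f"{annual_revenue.source_system}.{annual_revenue.source_field}")
```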

Consistent naming conventions are an important part of standardising an organisation and implementing good data governance. They can be enforced in a non-technical manner, but it is much easier when a business glossary is readily accessible within the reporting and visualisation tool. This prevents the organisational headache of the same entity or concept being referred to by several different names and existing in multiple forms within the database.
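
A hypothetical sketch of how such a glossary might look as data: a handful of agreed terms with definitions, canonical column names and known synonyms, plus a lookup that resolves any synonym back to the agreed term. All entries here are invented examples.

```python
# Invented business glossary entries used to resolve the many names a concept
# accumulates across teams back to one agreed term.

GLOSSARY = {
    "customer": {
        "definition": "A person or organisation with at least one paid invoice.",
        "canonical_column": "customer_id",
        "synonyms": {"client", "account", "cust"},
    },
}

def resolve_term(name: str) -> str:
    """Map any known synonym back to the agreed business term."""
    name = name.lower()
    for term, entry in GLOSSARY.items():
        if name == term or name in entry["synonyms"]:
            return term
    raise KeyError(f"'{name}' is not in the business glossary")

print(resolve_term("Client"))  # -> customer
```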

Real Time Data Quality Alerts

When there is an issue with your data quality or integrity, you want to be notified and able to act on it as soon as possible. For instance, data can start arriving that does not conform to the expected format, or that contains fields the transformation process does not account for. The longer an issue with a data source remains unresolved, the longer the reports derived from it may be presenting incorrect data.
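
One simple way to catch this class of problem at ingest time is to compare each incoming record against the expected field set and flag anything the transformation step would not handle. The sketch below does that; the expected schema and the sample record are assumptions for illustration.

```python
# Compare an incoming record's fields against an assumed expected schema and
# report anything missing or unexpected. Field names are invented examples.

EXPECTED_FIELDS = {"order_id", "customer_id", "order_date", "amount"}

def check_record(record: dict) -> list[str]:
    problems = []
    missing = EXPECTED_FIELDS - record.keys()
    unexpected = record.keys() - EXPECTED_FIELDS
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if unexpected:
        problems.append(f"unexpected fields: {sorted(unexpected)}")
    return problems

incoming = {"order_id": "1001", "customer_id": "42", "order_date": "2024-07-01",
            "amount": "19.95", "discount_code": "WINTER"}  # extra, unmapped field

for issue in check_record(incoming):
    print("data quality issue:", issue)  # raise an alert as soon as this appears
```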

How Loome Can Help

A real-time data alert system such as Loome Monitor lets you stay on top of irregularities as they happen. With the ability to customise which conditions trigger alerts, who gets notified and how, data quality issues can be resolved almost immediately. Using such a tool is one way to sustain an ongoing commitment to good data governance within an organisation.
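
As a generic illustration only (not Loome Monitor's configuration format), an alert rule boils down to a condition over some metric, a list of recipients and a delivery channel. All names and numbers below are invented.

```python
# Generic sketch of an alert rule: condition, recipients, channel. Invented values.

alert_rules = [
    {
        "name": "nightly refresh row count drop",
        "condition": lambda m: m["rows_loaded"] < 0.9 * m["rows_expected"],
        "notify": ["data-steward@example.org"],
        "channel": "email",
    },
]

def evaluate(rules, metrics):
    for rule in rules:
        if rule["condition"](metrics):
            # a real system would send an email or chat message here
            print(f"ALERT [{rule['channel']}] -> {rule['notify']}: {rule['name']}")

evaluate(alert_rules, {"rows_loaded": 8200, "rows_expected": 10000})
```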

A data catalogue, such as the one found in Loome Publish, offers a view of data lineage, reducing the potential for decontextualised data points to appear in reports. Being able to quickly investigate which sources feed into any particular element helps maintain accuracy and clarity within a data ecosystem.
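
As a simple illustration of that kind of investigation, the sketch below answers "what feeds into this report element?" by walking a small lineage graph upstream. The elements and edges are invented for the example.

```python
# Walk an invented lineage graph upstream to find everything that feeds an element.

LINEAGE = {  # element -> things it is directly derived from
    "revenue_dashboard": ["dim_customer", "fact_sales"],
    "fact_sales": ["staging_erp"],
    "dim_customer": ["staging_crm", "staging_erp"],
    "staging_crm": [],
    "staging_erp": [],
}

def upstream_sources(element: str) -> set[str]:
    """Return every element that directly or indirectly feeds the given one."""
    sources = set()
    for parent in LINEAGE.get(element, []):
        sources.add(parent)
        sources |= upstream_sources(parent)
    return sources

print(sorted(upstream_sources("revenue_dashboard")))
```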