"I don't have any data quality issues". "We manage our data through an ERP system, so it's good quality. ". Who hasn't heard that? However, things have changed over the last few years, with the arrival of cloud computing and especially big data. Today, data quality is a key issue, especially in the banking and insurance sector where Solvency II regulations require data quality management. Éric Gacia, Consulting Director for Banking and Insurance at Micropole, reviews the recent changes in this area.
According to a 2011 PricewaterhouseCoopers (PwC) survey, "90% of companies believe it is essential to have a data quality strategy, but only 15% actually address it." Why? Companies often lack internal sponsors to drive this critical topic. The goal, however, is crucial: to know the data, and to define an internal data dictionary and a common vocabulary so that the whole company shares a consistent view of it.
A regulatory requirement
Solvency II places the emphasis on data quality and data governance through very concrete regulatory requirements. This has now become a real business issue in the banking and insurance sector, whereas just two or three years ago the profitability and relevance of these projects had not been proven. "At one of my customers, a major group in the banking and insurance sector, everyone is now on board with this initiative, because no one has any data quality indicators, which makes the data difficult to use and leads to discrepancies, sometimes significant, between the various reports produced. Another, smaller organization told me that it runs its marketing campaigns 'blindly' because it does not trust the data it uses." Management has become very sensitive to this issue. There is real maturity here, especially as ROI is achieved quickly. The key is to gain confidence in the data, to know it well enough to exploit it more effectively, to save time, and to focus on higher value-added tasks tied to the core business.
What governance for what data?
Governance encompasses all the rules and procedures to be followed. It is necessary to set up a data quality steering committee involving the many stakeholders concerned: executive management, the IT department, the risk department, management control… The next step is to set up an operational working group that will analyze the quality indicators, propose and follow up action plans to improve them, and proactively assess the impact of data-related changes (new products, new management applications, etc.). Data quality management actually leads to the "creation" of a new role in companies: the Data Manager. A true conductor, the Data Manager monitors existing and newly defined indicators and the integrity of the reference repositories, and manages improvement actions while measuring their effectiveness.
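To make the idea of quality indicators concrete, here is a minimal sketch in Python of the kind of metrics such a working group or Data Manager might track, such as completeness and format-validity rates over customer records. The field names, sample records, and rules are hypothetical illustrations, not taken from the article.

```python
import re

# Hypothetical customer records, as they might arrive from an operational application.
records = [
    {"customer_id": "C001", "birth_date": "1980-04-12", "postal_code": "75001"},
    {"customer_id": "C002", "birth_date": None,         "postal_code": "7500"},
    {"customer_id": None,   "birth_date": "1975-11-30", "postal_code": "69003"},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def validity(records, field, pattern):
    """Share of non-empty values matching an expected format."""
    values = [r[field] for r in records if r.get(field)]
    if not values:
        return 0.0
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

# Indicators a Data Manager could review at each steering committee meeting.
indicators = {
    "customer_id completeness": completeness(records, "customer_id"),
    "birth_date completeness": completeness(records, "birth_date"),
    "postal_code validity": validity(records, "postal_code", r"\d{5}"),
}

for name, value in indicators.items():
    print(f"{name}: {value:.0%}")
```

Tracking a handful of such rates over time is usually enough to show whether improvement actions are actually working.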
While all of the company's data is concerned, external data is the most difficult to control. To integrate it, a data quality firewall must be put in place to act as a gate before the data enters internal operational applications: checking compliance with quality standards, managing rejections, triggering manual corrections, and so on. A data stewardship portal can also be made available to data provider partners, both to make them accountable and to give them the tools they need to guarantee the quality of their data.
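A minimal sketch of such a firewall, again in Python and with hypothetical validation rules: incoming partner records are checked against quality standards, compliant records are passed on for integration, and rejects are quarantined with their error reasons for manual correction, for example through the stewardship portal mentioned above.

```python
from datetime import datetime

# Hypothetical quality rules the firewall applies before integration.
def check_record(record):
    errors = []
    if not record.get("policy_id"):
        errors.append("missing policy_id")
    try:
        datetime.strptime(record.get("effective_date", ""), "%Y-%m-%d")
    except ValueError:
        errors.append("effective_date is not a valid ISO date")
    if not isinstance(record.get("premium"), (int, float)) or record["premium"] < 0:
        errors.append("premium must be a non-negative number")
    return errors

def quality_firewall(incoming):
    """Split incoming partner data into accepted records and rejects with reasons."""
    accepted, rejected = [], []
    for record in incoming:
        errors = check_record(record)
        if errors:
            rejected.append({"record": record, "errors": errors})
        else:
            accepted.append(record)
    return accepted, rejected

# Example: records supplied by an external partner.
incoming = [
    {"policy_id": "P-123", "effective_date": "2013-01-01", "premium": 850.0},
    {"policy_id": "",      "effective_date": "01/02/2013", "premium": -10},
]

accepted, rejected = quality_firewall(incoming)
print(f"{len(accepted)} record(s) integrated, {len(rejected)} sent for manual correction")
for reject in rejected:
    print(reject["errors"])
```

The rejected list, with its explicit error reasons, is what a stewardship portal would expose to the partner so that corrections happen at the source rather than inside the internal applications.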
The need to involve business managers
To lead a data quality governance project, various internal sponsors are essential. All departments must be mobilized: executive management, finance, marketing, risk management, internal control, accounting, etc. Very often, each department manages its own data, which is therefore neither shared nor centralized, and this inevitably creates significant discrepancies in the figures. A cross-functional understanding of the approach makes it possible to build a reliable data repository shared by all.

