Data quality

“3% incorrect data leads to 30% higher operational costs.” (Harvard)

The average organization has 38 different databases. In many organizations these are tied together in a spaghetti landscape, often because the IT department took the lead based on processes rather than on the information needs of users.

The result is an interweaving of large numbers of systems, making management costs enormous. The flexibility to make adjustments quickly is gone and the overview is completely lost. It is no longer clear whether data in one system is also taken over by another system. The result is data contamination in systems, and a good chance that incorrect decisions will be made based on wrong information.

In addition, data changes constantly. Take customer data alone: thousands of people move every day, and in the Netherlands a person moves on average 7 times in a lifetime. When such a change is processed in one place in the organization, it is essential that it is also automatically updated everywhere else.

Costs of poor data quality

Various studies show that poor data quality has a major impact on costs within most organizations. These costs fall into two categories:

  • Cleaning and correction costs
  • Cost of incorrect decisions

Examples of poor data quality and data processes within organizations:

  • The same (article) codes with different descriptions
  • Different (article) codes with the same descriptions
  • Missing mappings, for example a sales order without a customer
  • Different data definitions in different source systems
  • Different (master) data from different source systems, but impossible to connect easily due to missing logic
  • Processes for managing data (creating, editing, deleting) are not standardized and management takes place on an ad hoc basis
  • There are no clear agreements about entering data
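The first two examples above, conflicting codes and descriptions, can often be detected automatically. The following is a minimal sketch of such a check; the article records and code format are illustrative assumptions, not data from any specific system.

```python
from collections import defaultdict

# Hypothetical article master data, e.g. merged from two source systems.
articles = [
    ("A-100", "Steel bolt M8"),
    ("A-100", "Bolt, steel, 8 mm"),   # same code, different descriptions
    ("A-200", "Hex nut M8"),
    ("A-300", "Hex nut M8"),          # different codes, same description
]

def find_conflicts(records):
    """Return codes with multiple descriptions, and descriptions with multiple codes."""
    by_code = defaultdict(set)
    by_desc = defaultdict(set)
    for code, desc in records:
        by_code[code].add(desc)
        by_desc[desc].add(code)
    same_code = {c: descs for c, descs in by_code.items() if len(descs) > 1}
    same_desc = {d: codes for d, codes in by_desc.items() if len(codes) > 1}
    return same_code, same_desc

same_code, same_desc = find_conflicts(articles)
print("Codes with conflicting descriptions:", same_code)
print("Descriptions shared by multiple codes:", same_desc)
```

Checks like this can run periodically so conflicts are reported as they appear, instead of being cleaned up ad hoc.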

Correcting this data is a labor-intensive task. The associated costs are partly visible, but partly hidden. Think of periodic clean-up activities, or users cleaning data in Excel files: time that would be better spent on daily operations.

The second cost category is the cost of incorrect decisions. When management makes important decisions, it must be able to rely blindly on the quality of the data. Wrong data leads to incorrect insights and therefore to incorrect decisions. Examples are structurally overestimating customer demand, so that too much stock is held, or a wrong investment decision that never pays for itself. Having the right data increases the quality of decisions and reduces costs.

How do you ensure data quality?

Is your data quality good? Then you get much more out of your organization, and out of your investments in, for example, your CRM, analytics, KPIs or a data warehouse. Everything stands or falls with the quality of your data. But how do you increase it?

  • Let the business take the lead: what information is actually needed?
  • Improving data quality is not a one-time activity. There must be ongoing attention.
  • Link different software packages within the organization. Integrate desired information in the right package. For example ERP, CRM, time registration tool, sickness absence tool etc.
  • Make sure you know more than just the customer history from your own system; also gather information from official sources. Consider open data: freely available data from, for example, governments and knowledge institutes, such as Statistics Netherlands, the Chamber of Commerce, GIS systems or the RDW. Open data is meant for reuse; by linking it with company data you can enrich your data enormously.
  • Also don’t forget zero data: data that is not in the database. This data indicates, for example, what customers did not buy (such as abandoned shopping carts).
  • Automatically compare data from your partners with the data in your own system and receive a notification when there are deviations.
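The last point, automatically comparing partner data with your own, can be sketched as a simple field-by-field comparison that reports deviations. The record layout and customer keys below are illustrative assumptions.

```python
# Hypothetical customer records in our own system and at a partner.
our_customers = {
    "C001": {"name": "Jansen BV", "city": "Utrecht"},
    "C002": {"name": "De Vries", "city": "Amsterdam"},
}
partner_customers = {
    "C001": {"name": "Jansen BV", "city": "Amersfoort"},  # customer moved
    "C003": {"name": "Bakker", "city": "Rotterdam"},      # unknown to us
}

def compare(ours, theirs):
    """Report (key, field, our_value, partner_value) for every deviation."""
    deviations = []
    for key, partner_rec in theirs.items():
        if key not in ours:
            deviations.append((key, "missing in our system", None, None))
            continue
        for field, partner_value in partner_rec.items():
            if ours[key].get(field) != partner_value:
                deviations.append((key, field, ours[key].get(field), partner_value))
    return deviations

for deviation in compare(our_customers, partner_customers):
    print(deviation)
```

In practice such a comparison would run on a schedule and trigger a notification, so that a move registered by a partner is noticed instead of silently diverging.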

Data protection

Also take the protection of your data into account, including the GDPR (in the Netherlands: the AVG).

Privacy is an important aspect. Your customers and partners entrust their data to your organization. Ensure that only the data that is required is visible in the various applications and accessible to the people who are entitled to it.

Security is also a hot topic. Data in an online database must be protected against hacking and unauthorized access.

Transparency is becoming increasingly important. Which customer data do you store and which data do you share with third parties?

Quickscan

A quick scan can be performed to gain insight into the current and desired level of data quality. This can be done internally within the organization or with the help of a third party. All data-related processes and systems are examined in this assessment. The first step is to identify all data sources and the corresponding data producers (which system generates the data) and data consumers (which system needs the data). This provides more insight into the processes surrounding data and how they are handled within the organization, and encourages IT and users to work together more closely.

When it is clear which information is needed where, the information can be linked via logical flows.
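The producer/consumer inventory from the first step can be kept as a simple map, which immediately surfaces sources nobody consumes or flows without a producer. The system and source names below are illustrative assumptions.

```python
# Hypothetical quick-scan inventory: per data source, which system
# produces it and which systems consume it.
sources = {
    "customer": {"producer": "CRM", "consumers": ["ERP", "DataWarehouse"]},
    "article":  {"producer": "ERP", "consumers": ["Webshop"]},
    "absence":  {"producer": "HRTool", "consumers": []},  # nobody uses this
}

# Sources that are produced but never consumed are candidates for review.
unused = [name for name, info in sources.items() if not info["consumers"]]
for name in unused:
    print(f"{name}: produced by {sources[name]['producer']} but consumed by nobody")
```

Even a rough inventory like this gives the business and IT a shared picture of the logical flows that need to be linked.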

Read more about how Dovetail works in this process and can support you in this.

Brief summary

  • Poor quality data costs an organization a lot of money
  • Monitoring the quality of the data is a continuous process
  • Enrichment of data with information from other sources is essential

What can Dovetail mean in monitoring data quality?

Dovetail can realize links between systems and ensure that seamless data integration takes place.

Data integration is the connection of existing and new business applications, within a network environment and with the outside world. Data starts at a starting point and is transported and transformed so that it arrives as usable data at the end point.

  • Integrating data improves the quality of the data
  • The quality of the data is continuously guaranteed
  • This allows employees to work more efficiently and gives them more time to take the organization to a higher level
  • System data can easily be checked against data from other sources to keep it as clean as possible
  • Error routing can be set to quickly detect and correct errors

Dovetail works on the basis of an ESB (Enterprise Service Bus). When multiple systems are linked, it is wise not to connect applications directly to each other, but to use an intermediate software layer: direct links often lead to spaghetti and a hard-to-maintain IT landscape. To prevent direct links and create a clearly manageable landscape, an ESB is therefore often used. Think of an ESB as a universal socket to which various systems can be connected automatically, and where logic can be applied so that the messages going back and forth can be edited or enriched. The ESB also ensures that messages are delivered to the correct end station.
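The socket metaphor boils down to three steps: a message enters the bus, is optionally enriched, and is routed to the correct endpoint (or to an error queue). The following is a minimal sketch of that idea only; the message shape, enrichment rule and endpoint names are illustrative assumptions, not Dovetail's actual API.

```python
def enrich(message):
    """Example enrichment: derive a country field from the postal code."""
    msg = dict(message)
    msg["country"] = "NL" if msg.get("postcode", "").strip() else "unknown"
    return msg

# Route each message type to the system that needs it.
ROUTES = {
    "order":    lambda m: f"ERP <- {m}",
    "customer": lambda m: f"CRM <- {m}",
}

def bus(message):
    """Enrich the message, then deliver it to the correct end station."""
    message = enrich(message)
    endpoint = ROUTES.get(message.get("type"))
    if endpoint is None:
        return f"error-queue <- {message}"  # error routing for unknown types
    return endpoint(message)

print(bus({"type": "customer", "postcode": "3511 AB"}))
print(bus({"type": "unknown-thing"}))
```

The error-queue branch corresponds to the error routing mentioned above: instead of a message silently disappearing, deviations are parked where they can be detected and corrected quickly.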

Dovetail is the solution for all your integration challenges. The platform has a user-friendly, graphical user interface, so that no-code integrations can easily be built by non-programmers as well.
