Bad Big Data and How to Avoid it

Fri, 09/20/2013 - 13:31 -- admin

From the very beginning, we should be aware that even the most powerful analytical tools are only as good as the data they crunch.

This means that intelligence built on bad data can be worse than no analysis at all. That may be surprising, but it is a fact.

It is not uncommon for companies to make important decisions or formulate strategic objectives while fully aware that their data is flawed or incomplete.

Now, the most important question is: what can organizations do to avoid making bad business decisions or exposing themselves to increased risk?

Well, in order to get useful information from their data, organizations should focus on implementing both data quality and data governance measures. BI executives should also consider actionable steps to prevent poor data from entering the system in the first place. This might look like an old-fashioned approach, but it is definitely worth trying.

One of the first steps of preventing bad data is to examine the three most common problem areas. These usually are:

Business applications - customer relationship management programs, enterprise resource planning, customer information systems, etc.

Movement processes - extract, transform, load

Storage - enterprise data warehouse, integration and analysis

Because each business application has different rules about how it captures, formats, identifies and sorts information, data can become flawed at several levels.

As organizations begin the work of integrating multiple applications, or connecting a business application to its data warehouse, problems immediately surface if the data is not properly normalized to fit the target system.
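To illustrate the normalization problem, here is a minimal sketch in Python. The record layouts and field names are hypothetical assumptions, not taken from the article; the point is that two source applications storing the same customer differently must be mapped to one target schema before integration.

```python
# Hypothetical source layouts: a CRM storing a combined name field,
# and an ERP storing names separately with different whitespace habits.

def normalize_crm_record(rec):
    """Split the CRM's combined name and uppercase the country code."""
    first, _, last = rec["full_name"].partition(" ")
    return {"first_name": first, "last_name": last,
            "country": rec["country"].strip().upper()}

def normalize_erp_record(rec):
    """Trim the ERP's padded fields and uppercase the country code."""
    return {"first_name": rec["fname"].strip(),
            "last_name": rec["lname"].strip(),
            "country": rec["country_code"].strip().upper()}

crm = {"full_name": "Ada Lovelace", "country": "gb "}
erp = {"fname": " Ada", "lname": "Lovelace", "country_code": "gb"}

# After normalization, both applications yield the same target record.
assert normalize_crm_record(crm) == normalize_erp_record(erp)
```

Without this mapping step, the warehouse would treat the two source rows as two different customers.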

It is often difficult to grasp that data can be technically correct and incorrect at the same time. For example, differences in date formatting between the U.S. and Europe, or in recording time across different time zones, can make a big difference.
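The date and time-zone ambiguity can be demonstrated with Python's standard `datetime` module; the specific dates and offsets below are chosen only for illustration:

```python
from datetime import datetime, timezone, timedelta

# "03/04/2013" is a technically valid date under both conventions,
# but it names different days: March 4 (U.S.) vs. 3 April (Europe).
us = datetime.strptime("03/04/2013", "%m/%d/%Y")
eu = datetime.strptime("03/04/2013", "%d/%m/%Y")
assert us != eu  # same string, two different dates

# Likewise, "17:00" recorded in Toronto (UTC-5) and in London (UTC+0)
# are different moments unless the offset is stored with the value.
toronto = datetime(2013, 9, 20, 17, 0, tzinfo=timezone(timedelta(hours=-5)))
london = datetime(2013, 9, 20, 17, 0, tzinfo=timezone.utc)
assert toronto != london

# Converting everything to UTC on the way in removes the ambiguity.
print(toronto.astimezone(timezone.utc).isoformat())
# prints "2013-09-20T22:00:00+00:00"
```

In both cases the stored value is "correct" in its source system and wrong in the target, which is exactly why a governance standard for formats and time zones is needed.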

Small details can have broad implications, which is why organizations need to create data governance committees that cooperatively set the standards for how data is used and consumed. Of course, organizations should worry about data quality as well.

Often organizations will implement data quality solutions to improve the data within their business applications. But a “fix” for a single application that is run at the end of a month or quarter is a temporary solution and does very little to prevent the flood of future errors.

This is why business enterprises that handle big data should first consider forming a data governance committee within their organizations, responsible for implementing business processes to measure and track data quality.
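One simple quality measure such a committee might track is completeness. The sketch below assumes records arrive as Python dictionaries and the field names are illustrative; a real process would compute such scores per source system on a recurring schedule:

```python
def completeness(records, required_fields):
    """Share of records with every required field present and non-empty."""
    if not records:
        return 1.0
    ok = sum(1 for r in records
             if all(r.get(f) not in (None, "") for f in required_fields))
    return ok / len(records)

# Hypothetical customer extract with two defective rows.
customers = [
    {"id": 1, "email": "a@example.com", "country": "CA"},
    {"id": 2, "email": "", "country": "CA"},               # missing email
    {"id": 3, "email": "c@example.com", "country": None},  # missing country
]

score = completeness(customers, ["email", "country"])
print(f"completeness: {score:.2f}")  # prints "completeness: 0.33"
```

Tracking a number like this over time turns "our data is bad" from a vague complaint into a trend the committee can act on.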

All in all, with proactive action, ongoing maintenance and continuous improvement, data governance evolves from being a one-time fix to a front-and-center BI priority that routinely checks, corrects and augments data. Then, BI managers will be able to make predictions or support critical business decisions with confidence that their insights are built on accurate and high quality data.

Source: information-management.com
