Nalco masters data governance with quality management tool

By Frank O Smith (fosmith@thewritinggroup.com) May 1, 2008

With operations around the globe and more than 70,000 customers in 130 countries, data quality ended up being the hot button in Nalco Company's migration to a single global instance of SAP.

“Data governance is a critical issue for the company. Quality data is a key asset,” says Charlon Franklin, senior data steward analyst for the Naperville, Ill.-based water treatment and processing systems manufacturer.

Since bringing its outsourced ERP back inside the company, Nalco has been migrating all global operations to one platform. A year and a half ago it formed the data governance group Franklin heads to ensure that data consolidated from multiple sources is consistent, free of errors and duplication, and properly formatted to home-country and corporate standards.

Nalco selected a best-of-breed data quality management system from DataFlux, an operating unit of SAS, to shore up the data.

“We’re bringing up different countries and other companies we’ve acquired over time that haven’t been on SAP,” Franklin says. “We use DataFlux to look at the non-SAP information. Inconsistent data can create all kinds of problems.”

For one, there’s a business problem on the demand side, called “hanging orders.”

“Demand planning never gets the order, so it doesn’t know that it’s supposed to plan for filling an order,” says Franklin. “You leave a customer out there expecting an order, only it doesn’t come.”

It’s problematic on the supply side, too. “If you have duplicate records for a vendor,” explains Franklin, “you may end up double billing them. DataFlux eliminates duplicate records.”
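Franklin doesn't describe the matching logic itself, and DataFlux's own algorithms are proprietary, but the idea behind eliminating duplicate vendor records can be sketched in a few lines of Python. The key-building rules and field names below (vendor_name, postal_code) are hypothetical, chosen only for illustration:

```python
# Minimal sketch of duplicate-vendor detection (illustrative, not the DataFlux implementation).
# Field names such as "vendor_name" and "postal_code" are hypothetical.
import re

def match_key(record):
    """Build a crude match key: lowercase the name, strip punctuation and
    common legal suffixes, and pair it with the postal code."""
    name = re.sub(r"[^a-z0-9 ]", "", record["vendor_name"].lower())
    name = re.sub(r"\b(inc|llc|corp|co|ltd)\b", "", name).strip()
    return (" ".join(name.split()), record.get("postal_code", ""))

def dedupe(records):
    """Keep the first record seen for each match key; return survivors and duplicates."""
    seen, survivors, duplicates = {}, [], []
    for rec in records:
        key = match_key(rec)
        if key in seen:
            duplicates.append(rec)
        else:
            seen[key] = rec
            survivors.append(rec)
    return survivors, duplicates

vendors = [
    {"vendor_name": "Acme Industrial, Inc.", "postal_code": "60563"},
    {"vendor_name": "ACME INDUSTRIAL INC",   "postal_code": "60563"},
]
survivors, duplicates = dedupe(vendors)
print(len(survivors), "unique,", len(duplicates), "duplicate")
```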

Prior to implementing DataFlux, Franklin’s group used Microsoft Access and Excel as tools to analyze and validate the data migration. “But with CRM, one group had more than 600,000 records,” she says. “As you move from parent-to-child relationships, the database doubles every time.”

Franklin’s team uses DataFlux to analyze record sets it receives from multiple sources, determining how large the table or database is and how it’s structured.

“It can look at the entire database, or just the fields you select, and can tell you what the values are and if there’s something wrong,” says Franklin. “You just have to point the tool to the table you want to analyze, and it runs.”
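The article doesn't show DataFlux's actual output, but the kind of profile Franklin describes, row counts, blank fields, distinct values, looks roughly like this plain-Python sketch (the column names are invented for the example):

```python
# Rough sketch of column profiling in plain Python (illustrative, not DataFlux output).
from collections import Counter

def profile(rows, columns):
    """For each column, report row count, blank count, distinct count,
    and the most common values."""
    report = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        blanks = sum(1 for v in values if v in (None, ""))
        freq = Counter(v for v in values if v not in (None, ""))
        report[col] = {
            "rows": len(values),
            "blank": blanks,
            "distinct": len(freq),
            "top_values": freq.most_common(3),
        }
    return report

rows = [
    {"country": "US", "phone": "630-305-1000"},
    {"country": "us", "phone": ""},
    {"country": "DE", "phone": None},
]
for col, stats in profile(rows, ["country", "phone"]).items():
    print(col, stats)
```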

An Architect tool within DataFlux dfPower Studio builds consistent business rules for normalizing and validating data brought in from multiple systems, ensuring it conforms to standards.
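dfPower Studio's rule syntax isn't shown in the article; the sketch below only approximates what a normalization and validation rule set might do, written in ordinary Python with a made-up country map and a deliberately rough postal-code check:

```python
# Illustrative normalization/validation rules (hypothetical, not dfPower Studio syntax).
import re

COUNTRY_MAP = {"us": "US", "usa": "US", "united states": "US", "deutschland": "DE"}

def normalize_country(value):
    """Map free-text country names to a standard two-letter code."""
    return COUNTRY_MAP.get(value.strip().lower(), value.strip().upper())

def validate_postal(value, country):
    """Very rough postal-code check; real rules vary by home country."""
    if country == "US":
        return bool(re.fullmatch(r"\d{5}(-\d{4})?", value))
    return bool(value.strip())

def apply_rules(record):
    """Normalize the record in place and flag whether it passes validation."""
    record["country"] = normalize_country(record["country"])
    record["postal_ok"] = validate_postal(record["postal_code"], record["country"])
    return record

print(apply_rules({"country": "usa", "postal_code": "60563"}))
```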

Nalco is following a path typical of many companies migrating data to a new system. "There is no direct link currently between SAP and DataFlux," Franklin says, adding that data is extracted in batches and loaded onto a server, where it is cleansed and normalized according to the business rules Nalco has created to govern its data. Once the entire migration is complete, Nalco plans to link the two systems directly so that analysis happens in real time.
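In concept, the interim batch flow resembles the sketch below: pull an extract file, run it through cleansing rules, and write a cleansed file for loading. The file names and fields are placeholders, not Nalco's actual layout:

```python
# Sketch of a batch cleanse step: extract file in, cleansed file out.
# File names and field layout are hypothetical placeholders.
import csv

def cleanse(row):
    """Apply simple cleansing: trim whitespace and uppercase the country code."""
    row = {k: (v or "").strip() for k, v in row.items()}
    row["country"] = row.get("country", "").upper()
    return row

def run_batch(in_path="customer_extract.csv", out_path="customer_clean.csv"):
    """Read the raw extract, cleanse each row, and write the load-ready file."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            writer.writerow(cleanse(row))

if __name__ == "__main__":
    run_batch()
```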

“Full rules monitoring of the database on the front end will trigger notification whenever there’s any problem,” Franklin explains.
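What such front-end monitoring might look like, reduced to its simplest form, is sketched here; the rules and the notification step (a log warning standing in for a real alert) are illustrative only:

```python
# Sketch of front-end rule monitoring (illustrative only): each incoming record
# is checked before it posts, and any violation triggers a notification.
import logging

logging.basicConfig(level=logging.WARNING)

RULES = {
    "customer_name": lambda v: bool(v and v.strip()),   # name must not be blank
    "country":       lambda v: len(v or "") == 2,       # expect a two-letter code
}

def check(record):
    """Return the list of rule names the record violates."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

def monitor(record):
    """Flag violations; in a live setup this would notify the data steward."""
    violations = check(record)
    if violations:
        logging.warning("Record %s failed rules: %s", record.get("id"), violations)
    return not violations

monitor({"id": 42, "customer_name": "  ", "country": "USA"})
```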

Next up, Nalco will evaluate the DataFlux Master Data Management component for implementation, in an effort to synchronize all corporate data from a single master data reference file.
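Master data management is still a product evaluation ahead of Nalco, but the underlying idea, collapsing several source records into one golden record under survivorship rules, can be illustrated with a short sketch. The "newest non-blank value wins" rule here is an assumption for the example, not DataFlux's method:

```python
# Conceptual sketch of master data consolidation (not the DataFlux MDM product):
# several source records for the same customer are merged into one "golden" record,
# preferring the most recently updated non-blank value for each field (assumed rule).
from datetime import date

def golden_record(source_records):
    """Merge source records field by field; newer non-blank values overwrite older ones."""
    merged = {}
    for rec in sorted(source_records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value not in (None, ""):
                merged[field] = value
    return merged

sources = [
    {"name": "Acme Industrial",      "city": "",           "updated": date(2007, 3, 1)},
    {"name": "Acme Industrial Inc.", "city": "Naperville", "updated": date(2008, 1, 15)},
]
print(golden_record(sources))
```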