Guys, I am back. I was out for a while, travelling to the east coast of the US. So where were we? We talked about data profiling, the need for it and the approach towards it.
Over the last few days I have gone through a very good, detailed process of supplier normalization, classification and enrichment using a global compendium. I worked with people in the industry who have been doing this business for the last 25 years and are associated with big names in finance, banking, entertainment and packaging.
So once you do the data profiling, you come to know the richness (or dirtiness) of the data, based on which you can estimate your effort. But what if the customer already has a good structure for the data (not good data, though)? The initial work is easy: talk to the customer about the data format, input columns, totalling, and everything the customer wants to see. Once the requirements are frozen, you can go to the next step of setting up an environment in your system for the customer.
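Just to make "richness" concrete, a quick profiling query like the one below is usually enough to gauge how dirty a key field is before you estimate the effort (the table and column names here are invented for illustration, not from a real customer file):

    -- Hypothetical raw file table; gauge blanks and duplicates in the supplier name field
    SELECT
        COUNT(*)                                   AS total_rows,
        COUNT(DISTINCT VendorName)                 AS distinct_suppliers,
        SUM(CASE WHEN VendorName IS NULL
                  OR TRIM(VendorName) = '' THEN 1 ELSE 0 END) AS blank_names
    FROM raw_supplier_file;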
Test the whole setup using the sample data format from the customer. The input format mapping looks something like this -
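As a rough sketch (again, the column names are invented for illustration, not an actual customer format), the mapping essentially boils down to renaming the customer's raw columns to your standard input columns:

    -- Hypothetical source-to-target column mapping for the customer's raw file
    SELECT
        VendorNo    AS supplier_id,
        VendorName  AS supplier_name,
        Addr1       AS address_line1,
        City        AS city,
        Ctry        AS country,
        InvAmt      AS spend_amount,
        GLAcct      AS gl_code
    FROM raw_supplier_file;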
So now you have the inputs and you are also done with the input mapping analysis of the file you received. What's next? The next thing is to preprocess the data, meaning whatever obvious flaws - known issues - exist in the data, correct those. These are normally known issues communicated by the customer, or maybe something that you found and wanted to tell the customer about. Once you are on the same page, get ready to go ahead with the first step of processing: data transformation.
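For example, the preprocessing pass can be a handful of small, agreed-upon corrections like these (same hypothetical table as above; the actual rules always come from the customer):

    -- Illustrative known-issue fixes agreed with the customer
    UPDATE raw_supplier_file
    SET VendorName = TRIM(VendorName);                    -- stray spaces around names

    UPDATE raw_supplier_file
    SET Ctry = 'US'
    WHERE Ctry IN ('USA', 'U.S.A.', 'UNITED STATES');     -- inconsistent country values

    DELETE FROM raw_supplier_file
    WHERE VendorName LIKE 'TEST%';                        -- test records the customer wants dropped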
Will talk about it in detail later. Transformation can be done in two ways. Using a tool like Informatica (www.informatica.com), you can map source to target data, set up cleansing rules and get the processed, cleansed data before going ahead with normalization. If you are not yet ready for that kind of automation, just go ahead and write your own rules. You need to be really good at writing SQL queries and procedures to do this. Also, this works only when you know the rules along with their priorities and details.
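To give a feel for the hand-written-rules route, here is a minimal sketch in plain SQL, assuming a hypothetical staging table (supplier_staging) that already holds the mapped columns from the earlier sketch. The rules and aliases are invented; a real rule set would be far longer. The order of the CASE branches acts as the rule priority - the first matching rule wins:

    -- Illustrative cleansing rules applied in priority order
    SELECT
        supplier_name,
        CASE
            WHEN supplier_name LIKE '%INTL BUS MACH%'            -- priority 1: known alias
                THEN 'IBM'
            WHEN supplier_name LIKE '% INC.'                     -- priority 2: suffix cleanup
                THEN REPLACE(supplier_name, ' INC.', ' INC')
            ELSE UPPER(TRIM(supplier_name))                      -- default: basic standardization
        END AS supplier_name_clean
    FROM supplier_staging;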