When two companies merge, one of the biggest headaches is integrating the two companies' systems and processes, including all of their data. Doing this well is critical to the success of the merger.
The process of merging and de-duplicating business functions must also ensure the quality of data in the consolidated environment. Rigorous enforcement of data cleansing policies and procedures when consolidating data instances is essential; otherwise, the integrity of the applications that access the merged data can be compromised, putting the success of the M&A engagement at risk. Data quality therefore needs to be at the core of the M&A process, and choosing the right tools to ensure that all data quality issues are detected, flagged, and fixed before going live is critical.
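To make the idea concrete, here is a minimal sketch of the kind of detect-flag-fix step described above: consolidating customer records from two merged companies, de-duplicating on a normalized key, and flagging conflicting field values for review rather than silently overwriting them. The record fields, company names, and sample data are all illustrative assumptions, not taken from the whitepaper.

```python
def normalize_email(email):
    # Use a normalized email address as the de-duplication key.
    return email.strip().lower()

def consolidate(records_a, records_b):
    """Merge two record sets; return (merged records, flagged conflicts)."""
    merged = {}
    conflicts = []
    for source, records in (("company_a", records_a), ("company_b", records_b)):
        for rec in records:
            key = normalize_email(rec["email"])
            if key not in merged:
                merged[key] = dict(rec, email=key, source=source)
                continue
            existing = merged[key]
            for field, value in rec.items():
                if field == "email":
                    continue
                if existing.get(field) is None:
                    # Fill gaps from the other company's data.
                    existing[field] = value
                elif existing[field] != value:
                    # Conflicting data: flag for data-quality review,
                    # do not overwrite silently.
                    conflicts.append((key, field, existing[field], value))
    return list(merged.values()), conflicts

# Illustrative sample data for the two merging companies.
a = [{"email": "Jane@Example.com", "name": "Jane Doe", "phone": None}]
b = [{"email": "jane@example.com", "name": "Jane D.", "phone": "555-0101"},
     {"email": "raj@example.com", "name": "Raj Patel", "phone": "555-0102"}]

merged, conflicts = consolidate(a, b)
# Two unique customers remain; the name mismatch for jane@example.com
# is flagged for review instead of being resolved arbitrarily.
```

In practice this step would run inside a dedicated data-quality tool rather than hand-rolled code, but the principle is the same: every merge decision that cannot be made automatically should surface as a flagged issue before go-live.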
Persistent has published a whitepaper that lays out best practices in this domain. Download “Critical Data Issues in Mergers and Acquisitions: A Blueprint for Success” here.