Standardization of data and data quality measurement

 

Objective

 

A large US-based bank wanted to standardize its data processes by implementing and monitoring control checkpoints that measure data quality during the large-scale migration of legacy risk management systems into a single strategic system.

 

Challenges
 

  • Low trust in data, driven by poor data quality and the absence of data quality reporting.
  • Siloed risk management across outdated legacy systems.

Approach
 

  • Created an end-to-end control framework for data quality management.
  • Established a scalable, standardized reporting framework and automated it across three stages using Python.
  • Incorporated machine-learning matching algorithms (fuzzy matching) based on edit-distance and similarity-ratio methods to automate data quality checks.
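The distance-and-ratio matching described above can be sketched in plain Python. This is a minimal illustration, not the bank's actual implementation: it assumes a Levenshtein edit distance for character-level differences and the standard-library SequenceMatcher ratio for similarity scoring, with a hypothetical threshold of 0.85 for flagging mismatches between legacy and strategic-system values.

```python
from difflib import SequenceMatcher


def levenshtein(a: str, b: str) -> int:
    """Edit distance between two strings via dynamic programming."""
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]


def match_score(legacy: str, strategic: str) -> float:
    """Similarity ratio in [0, 1] after basic normalization."""
    a, b = legacy.strip().lower(), strategic.strip().lower()
    return SequenceMatcher(None, a, b).ratio()


def dq_check(legacy_values, strategic_values, threshold=0.85):
    """Flag legacy values with no sufficiently close strategic counterpart.

    Returns a list of (legacy_value, best_candidate) pairs whose best
    similarity ratio falls below the (assumed) threshold.
    """
    mismatches = []
    for value in legacy_values:
        best = max(strategic_values, key=lambda s: match_score(value, s))
        if match_score(value, best) < threshold:
            mismatches.append((value, best))
    return mismatches
```

In practice such checks would run against record keys or attribute values from both systems at each control checkpoint, with the threshold tuned per field; the edit distance helps explain *why* two values differ, while the ratio gives a normalized score for pass/fail reporting.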

Impact
 

  • Improved data quality and delivered a reproducible framework for measuring and reporting data quality (DQ) in future migrations.
  • End-to-end automation reduced turnaround time by approximately 85%.


Looking for high-end research and risk services? Reach out to us at:

 

United States
+1 855-595-2100 /
+1 646-292-3520

United Kingdom
+44 (0) 870 333 6336

India
+91 22 3342 3000 /
+91 22 6172 3000