
Proactively Avoiding Risk with Accurate Data - from Tony Fisher's "The Data Asset" (Wiley, 2009)


Chapter 2, pages 26-27: The Data Asset: How Smart Companies Govern Their Data for Business Success

By Tony Fisher, president and CEO of DataFlux


Risk mitigation is not just about keeping clear of the long arm of the law. As companies that traded in subprime mortgages have discovered, you don’t have to be facing government fines to find yourself publicly embarrassed and in a great deal of financial trouble. The subprime mortgage fiasco is a hard-learned lesson in how businesses could have proactively used data quality to reduce risk.


Let’s think back to how you acquired a home mortgage 20 years ago. You met in person with a banker or broker, toting a thick file folder of “proof” that you were a good risk: W-2 statements, paycheck stubs, bank statements, and a detailed list of your previous addresses. During the credit check, the broker might even have called your HR department to double-check your employment status and salary history.


Fast forward to two years ago. Customers could acquire a mortgage online with nothing more than a quick credit check. In some cases, subprime lenders barely bothered to double-check any of the information gathered. Housing prices were rising, after all, and the mortgages were bundled and sold as quickly as they were issued. The loan originator had washed its hands of any risk long before the homeowner missed the first payment.


However, there were some checks in place meant to assess risk. For example, loan packages were “rated” using some of the same criteria from the days of the thick file folder. Unfortunately, the rating agency typically had 24 hours or less to rate a package and therefore did so with the skimpiest of details. Likewise, mortgage package buyers or those creating mortgage-backed securities knew very little about the underlying risk associated with the mortgages. Even though the checks were in place, without the appropriate data they were virtually useless. With sound data management in place, at least some of the resulting problems could have been avoided.


More important than examining the what-ifs of a recent catastrophe is preventing a new one. Few would argue that subprime lenders were fraudsters of the highest order, though more detailed attention to data quality might have mitigated certain risky transactions. Data quality can also be used to combat more traditional types of fraud.


Accurate and trusted data is a fraudster’s worst nightmare. Unlike paper files that sit in desks, well-managed data can be run through matching processes that unearth all sorts of fraudulent behavior. A classic example is a Medicaid/Medicare payer that uses data matching and resolution technology to find patients or physicians who submit a claim multiple times – each with just a little change – hoping the claims won’t attract attention and, instead, will be paid multiple times. This solution employs time-tested, sophisticated matching technology that flags similar claims and alerts officials to the possibility of fraud.
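To make the idea concrete, here is a minimal Python sketch of that kind of near-duplicate matching. It is purely illustrative: the claim records, field names, similarity measure (Python’s difflib), and threshold are assumptions made for this example, not DataFlux’s actual matching technology.

```python
from difflib import SequenceMatcher

# Hypothetical claim records; the fields and values are illustrative only.
claims = [
    {"id": "C100", "provider": "Dr. A. Smith", "patient": "JANE DOE",
     "procedure": "99213", "amount": 145.00},
    {"id": "C101", "provider": "Dr. A Smith", "patient": "JANE  DOE",
     "procedure": "99213", "amount": 145.00},  # slightly altered resubmission
    {"id": "C102", "provider": "Dr. B. Lee", "patient": "JOHN ROE",
     "procedure": "99214", "amount": 210.00},
]

def normalize(claim):
    """Collapse case and whitespace so trivial edits don't hide a match."""
    parts = (" ".join(str(claim[f]).upper().split())
             for f in ("provider", "patient", "procedure"))
    return " | ".join(parts)

def similarity(a, b):
    """Simple string similarity; commercial matching engines use richer models."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

SUSPECT_THRESHOLD = 0.9  # illustrative cutoff, not a recommended setting

# Compare each pair of claims and flag near-duplicates for human review.
for i, a in enumerate(claims):
    for b in claims[i + 1:]:
        score = similarity(a, b)
        if score >= SUSPECT_THRESHOLD and a["amount"] == b["amount"]:
            print(f"Possible duplicate claim: {a['id']} vs {b['id']} (score {score:.2f})")
```

Run as written, this flags C100 and C101 as a likely duplicate pair while leaving the unrelated C102 alone; a real system would add phonetic name matching, standardized provider identifiers, and a review workflow for investigators.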


Another example of how data matching technology can mitigate fraud risk involves a client in the wireless telecommunications sector. The company paid independent agents to resell mobile phone services, and the agents earned a larger commission for adding a new customer than for re-signing an existing one. Some agents tried to take advantage of the system by re-signing existing customers under slightly different names. For example, existing mobile phone customer Robert R. Jones became “new customer” Bob Jones. By using matching and identity management technology, and by enriching the process with address and phone number data, the mobile phone provider was quickly able to eliminate this fraud and reduce its commission outlay by millions of dollars per year.
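The same matching idea can be sketched in a few lines of Python. The nickname table, record fields, and matching rules below are simplified assumptions for illustration; a production identity-management system would use far more thorough name, address, and phone standardization than this.

```python
# Hypothetical nickname map; a real system would use a much larger reference table.
NICKNAMES = {"BOB": "ROBERT", "ROB": "ROBERT", "BILL": "WILLIAM", "LIZ": "ELIZABETH"}

def normalize_name(name):
    """Uppercase, drop initials, and map common nicknames to full given names."""
    tokens = [t.strip(".") for t in name.upper().split()]
    tokens = [t for t in tokens if len(t) > 1]  # drop middle initials like "R."
    return " ".join(NICKNAMES.get(t, t) for t in tokens)

def normalize_phone(phone):
    """Keep digits only so formatting differences don't block a match."""
    return "".join(ch for ch in phone if ch.isdigit())

def same_identity(existing, new):
    """Treat a 'new' signup as a re-sign if the name matches plus address or phone."""
    name_match = normalize_name(existing["name"]) == normalize_name(new["name"])
    addr_match = existing["address"].upper().split() == new["address"].upper().split()
    phone_match = normalize_phone(existing["phone"]) == normalize_phone(new["phone"])
    return name_match and (addr_match or phone_match)

# Illustrative records only.
existing_customer = {"name": "Robert R. Jones", "address": "12 Oak St",
                     "phone": "(919) 555-0142"}
new_signup = {"name": "Bob Jones", "address": "12 Oak St", "phone": "919-555-0142"}

if same_identity(existing_customer, new_signup):
    print("Flag commission: this 'new customer' matches an existing account")
```

In this toy version, “Bob Jones” resolves to the same identity as existing customer Robert R. Jones once the nickname is expanded and the address and phone are standardized, so the commission would be flagged for review rather than paid at the new-customer rate.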


Tony Fisher joined DataFlux as president and CEO in 2000. In his years at DataFlux, he has guided the company through tremendous growth as DataFlux became a market-leading provider of data quality and data integration solutions. Tony is also a featured speaker and author on emerging trends in data quality, data integration and master data management, and on how better management of data leads to business optimization. Prior to DataFlux, he was the technology director at SAS. His new book, The Data Asset: How Smart Companies Govern Their Data for Business Success, is available now at http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470462264,descCd-description.html