Data Collection the Core of Systemic Risk Regulator

NEW YORK-Final legislation to create a new systemic risk regulator in the U.S. has yet to be inked. But there was plenty of talk - and concern - at the Securities Industry and Financial Markets Association 2010 Financial Services Technology Expo here Wednesday.

Top of mind: Just what type of information any monitoring body might ask for and how firms would need to comply.

Data collection is at the core of the proposal within financial reform legislation to create an Office of Financial Research within the Treasury Department. That means each financial institution - or at least those considered critical - would need to submit some form of data on a daily basis.

At issue is whether firms have the ability to do so when they are still grappling with creating best practices for data collection, governance and upkeep on their own. Also unclear is just how much clout a systemic risk regulator will have in preventing a market crisis.

"There won't be a single approach on the type of data and it will be a multi-year effort," said Edward Hida, a partner with Deloitte & Touche, which helped SIFMA create a recent white paper outlining the eight approaches a systemic risk regulator could take.

He classified the systemic regulator's possible approaches into three categories: the stress-based approach, the summarized information approach and the granular data approach. Of the three, the approach involving granular details is the one that will likely draw the most controversy - and cost the most to implement, said Hida. He declined to speculate on the cost of adopting a combination of the three approaches, other than to say it would be a "substantially expensive" endeavor.

"The systemic risk regulator will likely start off with requesting stress testing, followed by aggregate data and more granular data," Hida predicted. The stress testing approach involves requiring firms to submit periodic reports on enterprise-wide stress tests and reverse stress tests. The summarized information approach calls for reporting of aggregated risk, risk sensitivity and concentration exposure - that is, exposure by market, counterparty and type of financial instrument. In the granular detail approach, the regulator would likely tap either industry utilities or trade repositories to mine transaction- and position-level data for its analysis.
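The summarized information approach described above can be illustrated with a brief sketch: aggregating position-level records into concentration exposures by market, counterparty and instrument type. The field names, categories and figures below are assumptions for demonstration only, not part of any SIFMA white paper or OFR specification.

```python
from collections import defaultdict

# Hypothetical position records a firm might hold internally.
# All names and amounts are invented for illustration.
positions = [
    {"market": "US equities", "counterparty": "Dealer A",
     "instrument": "common stock", "exposure": 1_200_000.0},
    {"market": "US equities", "counterparty": "Dealer B",
     "instrument": "common stock", "exposure": 800_000.0},
    {"market": "Credit", "counterparty": "Dealer A",
     "instrument": "CDS", "exposure": 500_000.0},
]

def summarize(positions, dimension):
    """Aggregate total exposure along one dimension
    (market, counterparty, or instrument)."""
    totals = defaultdict(float)
    for p in positions:
        totals[p[dimension]] += p["exposure"]
    return dict(totals)

# One summarized report per dimension - the sort of aggregate view
# a regulator could request instead of raw transaction data.
report = {dim: summarize(positions, dim)
          for dim in ("market", "counterparty", "instrument")}
print(report["counterparty"])  # {'Dealer A': 1700000.0, 'Dealer B': 800000.0}
```

The point of the sketch is that summarized reporting only requires firms to roll up exposures they already track, whereas the granular approach would mean handing over the underlying position records themselves.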

Difficulty/Upside Analysis

"Despite the difficulties of complying with the new systemic risk regulator there will be substantial benefits," insisted Keith Saxton, global director of financial markets for IBM. Those benefits include creating data standards, consistency in communication and better stress testing.

Costs and benefits of the OFR aside, there is still too much unknown about how it will operate, what data will be required and what firms' responsibilities will be. And that means firms cannot allocate the appropriate resources to prepare.

"We're having internal meetings with IT, operations and data specialists but cannot make any definitive plans without final details so we are left spinning our wheels," said a data management expert at one New York brokerage firm.

Yet another brokerage executive was concerned about the lack of defined data formats and whether the government would impose standards the industry could not meet. "Will the regulator want the data to be tagged in the XBRL protocol, and who will create the data taxonomy that will be used?" he asked. "And we don't even have uniform identifiers for reference data, counterparty data or corporate actions data. That means that someone will have to make the decision, and it shouldn't be the government, which won't necessarily understand whether its decisions are practical or cost effective."

Another attendee, an operations executive at a clearing firm, recommended that Depository Trust & Clearing Corp. and a group of operations and data management executives from a cross-section of Wall Street firms have the final say on what data they can provide the regulator and in what formats. "I'm hoping the regulators won't work in a vacuum and impose their own standards when it's the securities industry that should be leading the charge," this clearing executive said.

DTCC, the umbrella organization for the clearance and settlement of U.S. securities transactions, isn't sitting idly by.

Speaking at a SunGard Data Systems event on Monday, Susan Cosgrove, managing director for equities clearance and settlement services for DTCC, said that it has its own plans to mitigate systemic risk.

Those include accelerating its trade guarantee for equity transactions to near real-time, instead of midnight of the day after a trade is executed, and developing central counterparty services for mortgage-backed securities.