The Next Frontier in Data Management Responses to Regulation

Blog entry

Not a conference on technology for capital markets goes by without many mentions of regulation. With that in mind, we at A-Team Group have put together a distinguished panel of speakers for this week’s Data Management Summit in London who will review the regulatory landscape and suggest how firms can develop data strategies that are compatible with a volatile environment.

The panel will be moderated by regulatory financial specialist Selwyn Blair-Ford and joined by Francis Gross, senior advisor to the directorate of general statistics at the European Central Bank; James Phillips, global head of regulatory strategy at Lombard Risk; Peter O'Keefe, an independent data management expert; Brian Sentance, CEO at Xenomorph; and Alessandro Sanos, market development manager, risk and enterprise, at Thomson Reuters.

Register Today: Join Data Management Summit to Discuss Driving Business Value out of Data

To give you a little insight into how the discussion might unfold, we caught up with some of the speakers ahead of the event. Brexit in Europe and the Trump administration in the US are expected to cause significant change and will add to regulation already coming down the track, while Markets in Financial Instruments Directive II (MiFID II), the Fundamental Review of the Trading Book (FRTB) and AnaCredit are cited as the toughest regulations in sight.

Xenomorph’s Sentance suggests that in a volatile market one key action for the industry is to put data in the hands of the business and encourage a strategic and flexible, rather than tactical, approach to data management that will make it easy for business users to access and manipulate data as events occur. In terms of regulation, he adds: “The challenge is to put a data architecture in place that can cope with multiple regulations. It needs to recognise datasets that overlap across regulations and be flexible enough to accommodate future regulation.”

Lombard Risk’s Phillips notes that more granular regulatory reporting requirements are driving the need to improve data quality and suggests the benefit of this will be the ability to send accurate data to regulators that will then look for hot spots. Like Sentance, he promotes a strategic approach to data management, saying: “The approach has to identify destinations for granular reporting, implement agility for the future and delete inefficiencies. Firms need granular data and data quality to be always on so that they are ready for reporting at any time and can gain competitive advantage.”

Looking at the financial landscape from a regulatory perspective, Francis Gross at the European Central Bank notes that regulators are building ever larger data systems and feeding them with more granular data in near real time. The reasons for this are lessons learnt from the 2008 financial crisis, when data could not be aggregated fast enough to present a clear picture of risk.

While this is an improvement on the regulatory front, Gross points to two persistent constraints: the profile of regulators has not changed, as they are typically economists and lawyers rather than technologists, and regulators habitually set out policy and leave the market to find solutions. The result, he says, is that firms are throwing money at solving data problems individually when the industry should be taking collective action.

He says: “We need to see regulators being more courageous and leading in fields such as standardising the digital representation of entities. Contracts also need to make progress. As reporting requirements increase, the only way to reduce the burden on industry is to build infrastructure that will allow organisations to automate the delivery of more data, faster and at a lower cost. The Legal Entity Identifier (LEI) supports this in providing a standard entity identifier. We need to do the same for contracts, then mobilise legislation and reduce the cost of regulatory compliance to a minimum.”
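The LEI that Gross refers to is a concrete, machine-checkable standard: ISO 17442 defines it as a 20-character alphanumeric code whose last two characters are check digits computed under the ISO 7064 MOD 97-10 scheme. As a minimal sketch of why a standard identifier supports automation, the Python below computes and verifies LEI check digits; the function names are illustrative, not part of any official library.

```python
def _to_number(s: str) -> int:
    # ISO 7064 MOD 97-10: digits stay as-is, letters map to 10..35 (A=10 ... Z=35),
    # and the resulting digit string is read as one big integer.
    return int("".join(str(int(c, 36)) for c in s.upper()))

def lei_check_digits(base18: str) -> str:
    # Append "00" as placeholder check digits, then compute 98 - (n mod 97),
    # zero-padded to two digits, per ISO 7064 MOD 97-10.
    return f"{98 - _to_number(base18 + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    # A well-formed LEI is 20 alphanumeric characters and the full
    # converted number is congruent to 1 modulo 97.
    return len(lei) == 20 and lei.isalnum() and _to_number(lei) % 97 == 1

# Hypothetical 18-character base (LOU prefix + entity part):
base = "5493001KJTIIGC8Y1R"
lei = base + lei_check_digits(base)
print(lei, is_valid_lei(lei))
```

Because validity reduces to a single modular check, any counterparty can verify an entity identifier without consulting a central service, which is exactly the kind of low-cost automation Gross argues should be extended to contracts.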