Regulatory change is driving a fundamental shift in how banks and brokers organise their data management processes and supporting infrastructures. The writing has been on the wall for pure-play, in-house approaches for some time, but now the demands of the post-crisis regulatory framework make the evidence in favour of hybrid strategies impossible to ignore.
Following the risk data aggregation and reporting requirements of BCBS 239, stress tests set by central banks, and the capital and liquidity constraints of Basel III, banks and brokers face further data management challenges over the next 18 months: the reporting and algorithm testing demands of MiFID II and the exponential increase in market risk calculations required by the Fundamental Review of the Trading Book (FRTB).
With regulators demanding more detailed data, more frequently, the data storage and compute capacity needed to comply efficiently with the evolving post-crisis regulatory framework goes well beyond the existing capabilities of most banks and brokers.
Under FRTB's mandated standardised approach to market risk, banks must compute at least 79 calculation inputs for each sensitivity class, requiring as many as 12,000 calculations per trade, compared with 250-500 at present. Meanwhile, MiFID II not only dramatically extends pre- and post-trade reporting requirements for banks and brokers (especially those that fall under its ‘systematic internaliser’ regime), it also obliges them to undertake a new round of data-hungry, compute-intensive market disorder testing and certification before their algorithms are permitted to trade on EU trading venues.
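To make the scale of that jump concrete, a back-of-the-envelope sizing exercise helps. A minimal sketch follows: the per-trade figures (12,000 under FRTB, 250-500 today) come from the discussion above, while the book size of one million trades is purely a hypothetical assumption for illustration.

```python
# Back-of-the-envelope sizing of the FRTB market risk calculation load.
# Per-trade figures are taken from the article; the book size is a
# hypothetical assumption for illustration only.

CALCS_PER_TRADE_FRTB = 12_000   # standardised approach, per the article
CALCS_PER_TRADE_TODAY = 500     # upper end of today's 250-500 range

trades = 1_000_000              # hypothetical trading book size

today_total = trades * CALCS_PER_TRADE_TODAY
frtb_total = trades * CALCS_PER_TRADE_FRTB

print(f"Calculations today:      {today_total:,}")
print(f"Calculations under FRTB: {frtb_total:,}")
print(f"Increase factor:         {frtb_total / today_total:.0f}x")
```

Even at the generous end of today's range, the standardised approach implies a roughly 24-fold increase in calculation volume for this hypothetical book, which is the kind of step change in compute demand that strains in-house capacity planning.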
However, it is easier than ever to combine existing in-house resources and capabilities with the increasing range of powerful, flexible and secure offerings of specialist third-party data infrastructure providers.
Many banks and brokers no longer see running data centres as a core competence, partly due to the cost of managing existing capabilities, but also because of the capex involved in acquiring new, higher-density compute capacity to handle foreseeable, let alone future, needs. Even those committed to in-house solutions are looking carefully at their cost base, reviewing whether their facilities are optimally located from the perspective of energy costs, enterprise-wide coordination, and data privacy and governance.
Moreover, such firms do not need to build for peak capacity when they can access dedicated facilities on a secure but flexible basis, supplementing existing resources as and when regulatory or business-driven demand dictates. As such, the future is likely to be hybrid, with banks and brokers partnering with specialist infrastructure providers that offer a range of managed, hosted capabilities, as well as cloud-based offerings.
Increasingly, banks will become aggregators of data management, compute and networking resources across a federated infrastructure of world-class facilities, adding value in the particular way they harness and combine capabilities to support customer-facing services.
Although the evidence is increasingly compelling, effecting change is never easy. Transformation can be daunting, and many banks and brokers favour more gradual, modular approaches. To help senior technology executives across the sell side make the right calls on data management strategy, Verne Global is hosting a Roundtable discussion led by a panel of industry experts in London on September 19, starting at 5.30pm. Please register by clicking on the link above or contact me (firstname.lastname@example.org) for more information.
The Roundtable will include participation from The Realization Group, BT Radianz, Nomura, SAS, Scotiabank and TraderServe, as well as Verne Global. By sharing our experiences of tackling today’s data management challenges, the industry can build consensus on best practice, meet regulators’ requirements and drive value for customers. I look forward to seeing you there!