Exploiting deep learning opportunities in finance could turn out to be very costly, but firms could get better results by tapping specialist data centers.
Sound and effective training lies at the heart of any successful business venture. To be sustainable and cost-efficient in the long run, companies need the right systems and processes in place to recruit the best staff and train them up so that they meet their objectives and remain motivated.
The successful evolution of deep learning in the financial services sector rests on a similar logic - machines must be connected to the right infrastructure and properly trained if they are to bring tangible benefits to businesses.
While this is not a technology that will work for every process, there is no shortage of potential use cases for deep learning. On the trading desk, for example, a machine could be trained to determine the best way of slicing an order to achieve optimal execution, resulting in greater preservation of alpha. Or in dealing with complex regulations, deep learning might be deployed to sift through vast volumes of data and spot anomalies or discrepancies that require investigation, pre-empting costly penalties for non-compliance.
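To make the anomaly-spotting idea concrete, the sketch below trains a one-unit linear autoencoder on synthetic two-feature records and flags the rows it reconstructs worst. Everything here is illustrative: the data, the model and the flagging rule are assumptions for the sake of the example, not a description of any production compliance system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "records" with two correlated features, plus three
# deliberately inconsistent rows standing in for anomalies.
z = rng.normal(0.0, 1.0, size=(200, 1))
normal = z @ np.array([[1.0, 0.5]]) + rng.normal(0.0, 0.05, size=(200, 2))
anomalies = np.array([[2.0, -2.0], [-2.0, 2.0], [1.5, -1.5]])
X = np.vstack([normal, anomalies])

# One-unit linear autoencoder: encode to 1-D, decode with the transpose.
# Trained by plain gradient descent on squared reconstruction error.
W = rng.normal(0.0, 0.1, size=(2, 1))
lr = 1e-4
for _ in range(500):
    err_mat = X @ W @ W.T - X
    grad = 2 * (X.T @ err_mat @ W + err_mat.T @ X @ W)
    W -= lr * grad

# Rows the trained model cannot reconstruct are flagged for investigation.
err = np.sum((X @ W @ W.T - X) ** 2, axis=1)
flagged = np.argsort(err)[-3:].tolist()
print(sorted(flagged))  # the three injected anomalies: [200, 201, 202]
```

The model learns the dominant correlation in the "normal" data, so records that break that correlation stand out through large reconstruction error; in practice a deep, non-linear network and a statistically chosen threshold would replace this toy setup.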
In some cases, work has already begun on developing such tools. However, these are still early days: although deep learning in finance is becoming more widespread, particularly in quantitative finance, it remains nascent. The jump from a hypothetical idea to a living reality is a big one, and will require market participants to be imaginative, innovative and resourceful in training their machines.
The toughest challenge is that training a machine to carry out resource-intensive tasks requires a volume of data many times greater than the industry is accustomed to dealing with. It also demands enormous compute power, rarely to be found in a typical bank or investment firm, and prohibitively expensive in traditional financial hubs like New York, London and Frankfurt, where power costs are high, and only getting higher.
Housing this volume of data and compute power internally is unlikely to be practical for the majority of financial institutions, so external storage facilities would need to be considered. Many firms already use data centers to colocate their trading engines close to exchange matching engines and thereby reduce latency, but space in such data centers tends to be both limited and costly.
Crucially, the data and infrastructure needed to support deep learning applications do not need to be stored in close proximity to the firm using them – as long as access is easy and the basic site-selection criteria are ticked, they can be located pretty much anywhere. In this context, there is a strong case to be made for low-cost data centers in locations such as Iceland - where there is an abundance of green power - as a means of bringing these ambitious new concepts to market. Data centers in Iceland are able to pass on the considerable savings (approximately 70%) that come from using renewable energy, free cooling of servers and long-term price contracts. This is why Iceland and other Nordic countries are seeing an increasing amount of compute-intensive applications moving northwards.
Much of this does, inevitably, come down to cost. In the current environment, when all market participants are battling the rising costs of doing business, few firms are lucky enough to have unlimited IT budgets with which to test and develop new ideas. But if firms can find the most cost-effective infrastructure on which to begin training systems, the evolution of deep learning may accelerate and help to solve some of the most pressing market and regulatory challenges with which firms are currently contending.