Large technology firms like Facebook, Amazon and Google are a bigger threat to banks than fintech startups, according to a study by the World Economic Forum that dissected the evolving landscape for financial services. Anthony Jenkins, ex-CEO of Barclays, agreed when he stood up at a recent fintech conference in Copenhagen and gave banks a stark warning: “We will see the possibility – not necessarily the probability – of what we call a ‘Kodak moment’, where increasingly banks become irrelevant to their customers.”
The post-recession years within the financial services industry were defined by regulatory constraints and squeezed margins. Lately, however, banks have become focussed on potentially threatening developments from outside the finance industry. The combination of technology developments, reduced barriers to entry, and consumer demand has allowed large technology firms to position themselves as challengers to incumbent banks.
“Banks can avoid that,” continued Jenkins, “but they have to act now, and what they really need to do is think about innovation, but also transformation, doing something radically different.” In order to survive, financial organisations need to focus on leveraging technology to underpin this transformational growth. The SWIFT Institute, in line with one of its core objectives of extending an understanding of future needs in global financial services, examines some of the leading examples of these technologies that are expected to bring about fundamental changes to the industry.
DLT – a technology enabler
It seems like not a day goes by in the financial press without a new blockchain, or distributed ledger technology (DLT), solution popping up. Most banks are currently working internally with DLT, at times teaming up with other banks and cross-trade organisations; even credit card service providers are looking to provide corporate cross-border payment services based on the new technology. Ruth Wandhöfer, Global Head of Regulatory & Market Strategy at Citi, is currently co-writing a paper for the SWIFT Institute on the Future of Transaction Banking 2030 and Beyond, examining how technology innovation such as DLT could play a part in this future. “Old infrastructures will take a long time to adapt, and central banks will not be quick in issuing digital currencies,” explained Wandhöfer in an interview with the SWIFT Institute. “So the industry needs to think, first, about how this revolutionary technology can be leveraged to create a much smoother cross-border payment experience and, second, examine some of the different business models that are beginning to appear.” Wandhöfer’s paper will cover these subjects in addition to suggesting some potential designs of DLT and alternative systems that could be used in the future.
In terms of the organisation and eventual setup of DLT systems, it is important to note that the technology, while still in the midst of development and configuration, has been designed with interoperability capabilities from the outset. Different domestic blockchains will therefore have the ability to interconnect. “In a way you could replicate the idea of real-time payment systems today, with domestic systems running on blockchain in the future with that interoperability layer built in,” described Wandhöfer. The industry still has work to do to make this a reality. Developments are needed in the areas of technology, operations, business models, governance policies and potentially regulations in order to achieve any substantial progress.
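The interoperability idea Wandhöfer describes can be illustrated with a minimal sketch: two independent hash-linked ledgers, where one chain anchors a reference to a block hash on the other. This is a toy model in Python, not any production DLT design; the ledger structure and the cross-chain reference scheme are illustrative assumptions.

```python
import hashlib
import json

def block_hash(contents: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    """A minimal append-only, hash-linked ledger (one 'domestic' chain)."""

    def __init__(self, name: str):
        self.name = name
        self.blocks = []

    def append(self, tx: dict) -> str:
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block = {"prev": prev, "tx": tx, "hash": block_hash({"prev": prev, "tx": tx})}
        self.blocks.append(block)
        return block["hash"]

    def verify(self) -> bool:
        """Check every block links to its predecessor and its hash is intact."""
        prev = "0" * 64
        for b in self.blocks:
            if b["prev"] != prev or b["hash"] != block_hash({"prev": b["prev"], "tx": b["tx"]}):
                return False
            prev = b["hash"]
        return True

# Two domestic ledgers; the "interoperability layer" here is simply each
# chain anchoring a reference to the other's block hash for the same payment.
uk, eu = Ledger("UK"), Ledger("EU")
h_uk = uk.append({"payer": "A", "payee": "B", "amount": 100})
h_eu = eu.append({"ref": h_uk, "settled": True})  # cross-chain reference
assert uk.verify() and eu.verify()
```

The point of the sketch is simply that interconnection does not require a single global chain: each domestic system can stay independent while anchoring proofs of the other's state.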
AI: the art of data capitalisation
Some bankers predict that Artificial Intelligence (AI) could provide the biggest technological shift in our history, bigger than the computer, Internet or smartphone revolutions. The financial services industry not only needs to prepare for it, but also to recognise that consumers are rapidly coming to expect the same level of service that is being provided by the technology giants of today. When it comes to AI (or machine learning) in the financial industry, the key point for financial organisations is the need to identify and work with more data – not only in customer payments, but in relation to every aspect of the organisation. Banks, for example, are sitting on large amounts of unstructured data across many different business segments. The consolidation of this information in a meaningful way, through data analytics firms and self-learning algorithms, would allow a bank to see immediately how much credit risk is on the books. In order to achieve this, however, a firm would first need to undertake the daunting task of digitising past data held in legacy systems. Machine learning is expected to assist bankers with customer experience, risk management, operational efficiency, and the invention of new business models. Nevertheless, despite promises of efficiency, firms need to be wary of the risk of endorsing and replicating existing biases. “The idea is that AI will deliver better decisions because there will be more access, visibility and understanding of the data. But this needs to happen with less bias going forward, hopefully not repeating mistakes of the past,” warned Wandhöfer.
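As a rough illustration of the “self-learning algorithms” mentioned above, the sketch below trains a toy logistic regression on fully synthetic loan records and then aggregates the predicted default probabilities into a single view of credit risk on the books. The data, features and model here are all invented for illustration; a real credit-risk model would look very different.

```python
import math
import random

# Fully synthetic loan records -- in reality a bank would first have to
# digitise and consolidate unstructured data scattered across legacy systems.
random.seed(0)

def make_record():
    income = random.uniform(20, 120)  # annual income, in thousands
    debt = random.uniform(0, 60)      # outstanding debt, in thousands
    # Synthetic ground truth: default is likely when debt is high vs income.
    defaulted = 1 if (debt / income > 0.5 and random.random() < 0.8) else 0
    return (income / 100, debt / 100, defaulted)  # scale features for stability

data = [make_record() for _ in range(500)]

# Logistic regression trained by plain batch gradient descent -- a stand-in
# for the "self-learning algorithms" above, not a production risk model.
w_income = w_debt = bias = 0.0
for _ in range(2000):
    gi = gd = gb = 0.0
    for x1, x2, y in data:
        p = 1 / (1 + math.exp(-(w_income * x1 + w_debt * x2 + bias)))
        gi += (p - y) * x1
        gd += (p - y) * x2
        gb += (p - y)
    n = len(data)
    w_income -= 0.5 * gi / n
    w_debt -= 0.5 * gd / n
    bias -= 0.5 * gb / n

def default_probability(income_k: float, debt_k: float) -> float:
    """Predicted probability of default for one borrower (inputs in thousands)."""
    z = w_income * income_k / 100 + w_debt * debt_k / 100 + bias
    return 1 / (1 + math.exp(-z))

# An aggregate view of credit risk on the books: expected number of defaults.
expected_defaults = sum(default_probability(x1 * 100, x2 * 100) for x1, x2, _ in data)
```

The bias warning in the quote applies directly here: whatever patterns (or historical prejudices) sit in the training records, the fitted model will faithfully reproduce them.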
Open APIs: making data work
Public application programming interfaces (Open APIs) have been used by technology companies for many years now, making real-time data available and creating value out of that process. As described above, banks have been sitting on the key asset of data for far too long without doing much about it. One big push is about to come through in Europe in the form of the Revised Payment Services Directive (PSD2), which will allow other parties to use banks’ customer data for different purposes. What this means in practice is that technology companies, including the likes of Facebook, will be able to access bank account information and provide payment services through online messaging applications such as Messenger and WhatsApp. “The banks need to act soon. These tech companies will be able to use Open APIs like any other fintech startup if they get regulated; moreover, these tech platforms will be able to become banks much faster than the banks will be able to become platforms,” explained Dr. Markos Zachariadis of Warwick University, co-author of the SWIFT Institute Working Paper entitled The API Economy and Digital Transformation in Financial Services: The Case of Open Banking. Dr. Zachariadis expects that Open APIs will at first focus on account information and payment initiation, eventually moving into lending services, such as mortgages, and investment applications.
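To make the account-information idea concrete, here is a small sketch of what a third-party provider might do with a bank’s Open API response. The JSON field names below are loosely modelled on an Open Banking style schema, but they are assumptions for illustration, not any bank’s actual API.

```python
import json

# Illustrative payload: the structure and field names are assumptions
# loosely modelled on an Open Banking style account-information response.
raw = """
{
  "Data": {
    "Account": [
      {"AccountId": "acc-001", "Currency": "GBP", "Nickname": "Main"}
    ],
    "Balance": [
      {"AccountId": "acc-001",
       "Amount": {"Amount": "1250.40", "Currency": "GBP"},
       "CreditDebitIndicator": "Credit"}
    ]
  }
}
"""

payload = json.loads(raw)

def balances_by_account(payload: dict) -> dict:
    """Map AccountId -> signed balance, as a third-party provider might do."""
    out = {}
    for bal in payload["Data"]["Balance"]:
        amount = float(bal["Amount"]["Amount"])
        if bal["CreditDebitIndicator"] == "Debit":
            amount = -amount  # debit balances are owed, so record them as negative
        out[bal["AccountId"]] = amount
    return out

print(balances_by_account(payload))
```

The value PSD2 unlocks sits in exactly this kind of aggregation: once balances from several banks can be pulled into one view, the third party, not the bank, owns the customer-facing experience.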
In their paper, Zachariadis and his co-author, Dr. Pinar Ozcan, emphasise the concept of “Banking-as-a-Platform”. This moves away from the traditional banking model, focussing rather on the creation of an ecosystem on top of a bank’s infrastructure that will offer products and services provided by technology companies. The bank would then be responsible for overseeing the collaboration with certain apps and deciding as to what capacity the various technology companies would become part of the value proposition of the platform. Dr. Ozcan commented, “This requires a radical change, not only in the mindset of banks, but in operational organisation and the approach to potential partners. Banks need to make a conscious decision and serious effort to get there because it will require a profound transformation. Our report tries to motivate firms to take that step now, rather than being reactive to regulation and potentially falling behind in terms of their competition.”
Financial organisations will have to address a number of issues, such as the level of openness required and whether to become API-enabled themselves in order to make faster data work in their favour. Not typically experienced in building and managing such platforms, banks will need to strike the right balance between speed and security to attract suitable fintech partners. They will also need to build in switching costs, with value created simply from participating in the platform, thus discouraging partners from moving to other tech giants. Data security and governance will obviously play a critical role in this transformation, with a need for technology that allows data querying without exposing sensitive data. Wandhöfer commented, “This is an area where banks really need to innovate in order to stay alive, because otherwise technology companies will simply create value on the back of a bank’s customer, and the risk is the customer will move away to the mobile space. Banks have to make sure they keep their customer relationships.”
Quantum computing: cyber threats
Quantum computing, based on the principles of quantum theory that explain the nature and behaviour of energy and matter on the quantum (atomic and subatomic) level, marks a leap forward in computing capability. Following the laws of quantum physics, a quantum computer would gain enormous processing power through the ability to be in multiple states “at the same time”, thus exploring all possible configurations simultaneously with the ability to solve problems that were previously thought intractable.
Once developed on a scalable and fault-tolerant level, quantum computers would have the ability to break some of the fundamental cryptographic codes used today. There are essentially two types of cryptography: the first uses public keys and is often used for authenticating the origin and integrity of information, as well as for establishing symmetric keys; the second is based on symmetric keys used by two parties to share and transmit information. The advent of quantum computing would essentially break all deployed public key cryptography, as well as weaken symmetric key cryptography, meaning new methods of cryptography need to be deployed – and soon.
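The impact on the two families of cryptography can be summarised with a rule-of-thumb sketch: Shor’s algorithm breaks deployed public-key schemes such as RSA and elliptic-curve cryptography outright, while Grover’s algorithm roughly halves the effective bit-security of symmetric ciphers and hash preimage search. The small helper below encodes these widely cited rules of thumb; it is a simplification for illustration, not a precise cryptanalysis.

```python
def quantum_security_bits(scheme: str, key_bits: int) -> int:
    """Rough post-quantum security estimate for a given scheme and key size."""
    if scheme in ("RSA", "ECC", "DH"):
        return 0                 # broken outright by Shor's algorithm
    if scheme in ("AES", "hash-preimage"):
        return key_bits // 2     # Grover's quadratic speed-up halves bit-security
    raise ValueError(f"unknown scheme: {scheme}")

for scheme, bits in [("RSA", 3072), ("ECC", 256), ("AES", 128), ("AES", 256)]:
    print(f"{scheme}-{bits}: ~{quantum_security_bits(scheme, bits)}-bit quantum security")
```

On this reading, larger symmetric keys such as AES-256 retain a comfortable margin, which is why quantum-proofing guidance typically pairs bigger symmetric keys with entirely new post-quantum public-key algorithms.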
“Entire ICT infrastructures will need to be quantum-proofed. In terms of urgency in the immediate term, decision-makers need to embark on a road map now to make sure cyber systems are secure against quantum attack,” advised Michele Mosca, Co-Founder of the Institute for Quantum Computing at the University of Waterloo, and upcoming speaker at Sibos 2017.
“I believe it will be between eight and fifteen years before we can actually start to use quantum computing services, but we have to be ready to defend against quantum computers before that day. It is not as if we can get this done in a year or two if we forgo some of the frills. It will take at least ten to fifteen years to reliably design and implement these roadmaps, in which difficult technical challenges need to be tackled, such as math and engineering problems, and practical business and regulatory challenges. Banks talk in terms of risk, yet even a ten per cent chance of the collapse of the entire ICT system should not be considered tolerable.” Wandhöfer agreed, “People are starting to talk about quantum even though quantum computers have not yet been rendered scalable. From my own experience working on cyber-risk and payments security, we see time and again that it is the organised criminals who are the first to deploy technology innovations. I am worried that some people have work in progress and may be able to suddenly realise things that we are not even aware of.”
Electronification to digitisation: past lessons
Upon reading the short summaries of technological developments above, the situation facing the financial services industry may seem more than a little daunting. A useful solace, therefore, might be found in the fact that this is not the first time the industry has faced a set of such revolutionary changes. The electronification of securities settlements in the late 1990s provided an equally intimidating set of challenges.
The SWIFT Institute issued a Working Paper in November 2016 examining how CREST, a leading settlement infrastructure, facilitated the leap from paper-based to electronic post-trading in London. Arising out of the ‘settlement crisis’ during the stock market crash of 1987, when paper-based processes used at the London Stock Exchange were unable to keep pace, the electronification of settlements drove forward a ‘back-office revolution’. The resulting higher transaction volumes led to the advent of new financial products and more efficient post-trading processes, both in terms of cost and time. Examining the successful role CREST played in the financial landscape, the paper also pinpoints how these lessons could be used to help map out the banking technology infrastructure of the future.
Leveraging advantage for seismic shifts
“The technologies of the Fourth Industrial Revolution have triggered a seismic shift in the financial system, the implications of which will extend far beyond the fintechs that pioneered their use in financial services,” continues the World Economic Forum report as cited above. New technologies and regulations are prompting banks to act differently. They are looking to share data and work with new technology partners to take advantage of such innovations as Open APIs, DLT solutions and AI.
At the heart of this digital evolution is the potential for financial institutions to fundamentally change their business models. “Whilst regulation will continue to tie banks to certain obligations, the organisations that will be able to instantly deploy balance sheets and mobilise liquidity will be those that will survive,” asserted Wandhöfer. “It is one thing to talk about blockchain, but ultimately there needs to be a move towards new infrastructures that allow data visibility from the beginning. Yet outdated IT systems continue to deter progress in this area. This is why the level of risk remains so high today, including cyber risk.”
Avoidance of a ‘Kodak moment’ is dependent on the willingness of financial organisations to transform their operational models and leverage technology in order to propel them to a place of continued relevance with the customer of tomorrow.