Banks do not need to be wedded to complexity, writes Navin Suri, CEO of data technology start-up Percipient, in an essay for finews.asia.

Marie Kondō’s bestseller, «The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organizing», is sweeping the world. Her message that simplicity pays off applies as much to a bank’s data architecture as it does to a person’s wardrobe.

Few bankers would argue with the notion that the IT architecture in banks is overly complex and, as a result, far less productive than it could be. So how did we get here?

Rather than a single blueprint, most banks’ IT evolved out of the global financial industry’s changing consumer demands, regulatory requirements, geographic expansion, and M&As. This has led to a tangled web of diverse operational systems, databases and data tools.

High Value Business Intelligence Remains Elusive

But rapid digitisation has put this complex architecture under further stress. Amid dire warnings, such as the one from Francisco Gonzales, then CEO of BBVA, that non-tech ready banks «face certain death», many rushed to pick up the pace of their digital transformation.

Banks rolled out their mobile apps and digital services by adopting a so-called «two-speed infrastructure», that is, enhanced capabilities at the front, built on a patchwork of legacy systems at the back. Now over a third of all banks, according to a 2015 Capgemini survey, say «building the architecture/application infrastructure supporting transformation of the apps landscape» is their topmost priority.

Meanwhile a key reward of digitisation – high value business intelligence – remains elusive. Banking circles may be abuzz with talk of big data, but the lack of interoperability across systems makes this difficult to achieve. In some cases, cost effective big data processing technologies like Hadoop have actually deepened the problem by introducing yet more elements to an already unwieldy architecture.

Historically Fragmented Infrastructure

To address the problem, financial institutions have opted for two vastly contrasting approaches. Either paper over the cracks with a growing number of manual processes, or bite the bullet, as UBS is doing. The world’s largest private bank announced in October last year that it will be spending $1 billion on an IT overhaul to integrate its «historically fragmented infrastructure».

However, for those banks unable or unwilling to rip out and replace their existing systems, there is a third way. The availability of highly innovative open source software offers banks the option of using middleware to declutter and integrate what they have.

Architectural Complexity

Percipient’s data technology solutions, for example, enable banks to pull together all their data without the need for data duplication, enterprise data warehouses, an array of data transformation tools, or new processes and skills.

These solutions are, at their core, an attack on the architectural complexity that banks have come to grudgingly accept.

As Kondō points out, «Visible mess helps distract us from the true source of the disorder.» In the case of most banks, the true source of the disorder appears to be an IT infrastructure derived, rather than designed, to meet the huge demands placed on it by digitisation. There is now a real opportunity to turn this visible mess into visible order.


Navin Suri co-founded Percipient in 2014 with three senior ex-banking colleagues. Prior to Percipient, Navin was Asia Pacific Head of Intermediary Distribution at the Bank of New York Mellon, CEO of ING Investment Management, India and Sales & Distribution Head for Citibank Retail Bank, Asia Pacific. Navin currently serves on the Board of Directors of Nomura Asset Management Taiwan.