April 11, 2017
Finance is swooshing into blockchain, machine learning and artificial intelligence. Every organization is adopting new ways of serving clients faster than the next. Or are they? Is this the reality we actually live in? While the challengers have nothing but upside, incumbent institutions find themselves in a swamp of technology black boxes they have to grapple with, and they loathe the idea of marrying yet another complicated technology octopus that will not let go for decades. That might seem like an exaggeration, but go through an institutional procurement process, or a dozen, and the reality becomes quite apparent.
On one side, you have technology adoption moving faster than ever before. FinTech challengers are indeed coming, and fast. Established companies are launching their own robo-advisors (ex: JPMorgan’s recent plans for an ‘automated advisor’) and mobile lending apps. At the same time, those same companies are still trying to overhaul systems from decades ago in an effort to stay competitive: systems that are critical, serving large client segments, and interconnected across service lines. Where do these two trends clash?
It’s no secret that much of the technology serving millions of people in finance today is legacy. It’s often poorly documented, poorly understood, and poses a systemic risk. Overhauling it, however, is no easy task, and it often ends up wrapped in red tape with bold letters stating DO NOT TOUCH, given the inter-dependencies legacy systems carry from the way they were built (giant monolith applications; the API Economy hadn’t been invented yet!).
I've been asked countless times, "How do you connect to our systems?" and my answer is always the same: "Do you actually want to?" In new greenfield projects, isolation is often your friend. Proving out a new business line or technology in a smaller segment, on its own stack, is far more nimble and easier to roll out. With validated data and usage in hand, it's then easier to justify integration into other technology stacks on the basis of broader deployment and actual need. New business lines are exactly that, new, and with that newness comes the choice of design. Credit scores and data are a perfect example. Lenders have these on record for their borrowers, so it should be a simple case of retrieval, right? Yet I can count on my fingers the times organizations have wanted us to pull that information from their own systems rather than from a specialist, modern third-party source. The more complex the environment, the more a focused, modular approach pays off.
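To make that modular approach concrete, here is a minimal sketch (all names hypothetical, not taken from any real lender's stack) of what "design choice" looks like in code: the new business line depends only on a small credit-data interface, the third-party source implements it today, and a legacy-backed implementation could be slotted in behind the same contract later, if validated usage ever justifies the integration. The third-party client is stubbed with in-memory data so the sketch stays self-contained.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class CreditReport:
    borrower_id: str
    score: int
    source: str


class CreditDataProvider(ABC):
    """The contract the application depends on -- not any concrete system."""

    @abstractmethod
    def get_report(self, borrower_id: str) -> CreditReport: ...


class ThirdPartyBureauClient(CreditDataProvider):
    """Hypothetical modern third-party source (in reality an HTTPS/JSON API).
    Stubbed with in-memory data here so the example runs on its own."""

    def __init__(self) -> None:
        self._data = {"b-001": 712}

    def get_report(self, borrower_id: str) -> CreditReport:
        return CreditReport(borrower_id, self._data[borrower_id], "third_party")


def assess(provider: CreditDataProvider, borrower_id: str) -> str:
    # Business logic only sees the interface; swapping in a legacy-backed
    # provider later is one new implementation, not a rewrite.
    report = provider.get_report(borrower_id)
    return "approve" if report.score >= 660 else "review"


print(assess(ThirdPartyBureauClient(), "b-001"))  # -> approve
```

The design choice is the seam: the `660` threshold, names and the stubbed score are invented for illustration, but the point carries, isolation now, integration later, both behind the same API.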
These legacy systems process an abundance of data and requests; they really are a gold mine of information. But they weren’t built on modern JSON APIs, which is why banks are pouring effort into building out connectivity internally as well as externally, cheered on by policy such as PSD2. Getting your hands on this information is therefore an issue, both because of the critical and sensitive nature of the systems and because of their lack of modularity.
We seem to be in a new era of software development. The API Economy, specialization and modularization all call for a data-driven go-to-market with rapid prototyping and decision-making based on real, observed user behavior. Getting organizations and their processes to shift to this mindset isn’t a walk in the park, yet it also means that how we approach technology today is not the same as even five or ten years ago. Despite the legacy, we now have the opportunity to build finance applications in a modern way, and thanks to modularization we can make risk-weighted decisions; an upgrade is always just an API call away.