Today’s payment and settlement technology is more adaptable than ever.  Unlike in the 1980s, little appears to be hard-coded any more, and our ability to configure applications gives us seemingly endless flexibility.  Anyone who has worked in the requirements-definition phase of a payment system deployment in Financial Services can immediately recite examples of key system parameters that “must be configurable”.

We see this maturity of technology parameterisation in all walks of life, especially for those at the forefront of their industry.  In the motor industry, we see that Formula 1 cars are reconfigured for every race depending on weather, track layout, competition, etc.  In the world of airline travel, we see market-leading engine manufacturers monitor and recalibrate engines between flights depending on environmental conditions.  In modern medicine, physicians adjust prosthetics, treatment plans and the use of drugs to fit the individual patient.  However, in the world of Financial Services, we rarely change our highly configurable systems.  What was good enough for yesterday is often assumed good enough for tomorrow.  So why is this?

Well, the risk of change is significant: large-scale banking outages regularly affect thousands of customers as a result of poorly tested changes.  The cost of managing that risk is equally significant: testing every plausible permutation of system interaction, over what is a largely random set of behaviours, is usually a large undertaking whenever a configuration change is proposed.

Then there is the Regulatory burden to contend with.  Demonstrating to the Regulatory community that a planned change will not bring negative impacts to a ‘finely tuned’ financial environment is usually an intensive activity, and satisfying the well-intentioned but probing questions about testing and modelling methods can take its toll on the critical path of a change programme.  Then there is the client ‘Anxiety Gap’: how to put clients’ or the market’s minds at ease that the change will deliver benefits rather than problems.  So, what do we do?  Generally, we leave things as they are…

However, over the last 18 months I have seen a change to this Financial Services modus operandi (I will refrain from acronymising that).  I have been privileged to work with institutions around the globe who have decided to buck the trend.  Not content with the status quo, they are looking to optimise their systems, whether customer-facing, operational or otherwise.  Whether the goal is to optimise their use of liquidity, reduce the volume of fraudulent payments or lower the likelihood of timeline-critical issues, their confidence in embracing a different way of working is refreshing.  Like the Formula 1 teams, these institutions accept that their system configuration was good for the day it was deployed, but tomorrow may bring different weather, track layouts or competition – or, in the case of Financial Services, different liquidity challenges, client needs or client behaviour.  Not only are these institutions proactively improving their systems and service offerings, but they also recognise that a reinvigorated configuration is, once again, only as good as the day it was deployed.  That realisation takes them on a journey of improving their monitoring capabilities to track the effectiveness of the new configuration, which in turn leads them to continually reassess how far from the optimum they are, to simulate new or anticipated changes, and to recalibrate where necessary.

And how is all this achieved?

At FNA we deploy simulators for our Financial Services clients that replicate their own systems.  These simulators can be scripted to run far faster than real time, so hundreds of different system configurations can be tested over the clients’ own data in days, or even hours.  The result is valuable insight into how best to tune a system and how to communicate the impact of a change accurately, both internally and externally – all in the safety of the client’s own environment.

Dave Sissens ([email protected]) is FNA’s Chief Solutions Officer.
