Mainframe offloading and stream processing to support BNL’s digital channels
The evolution of data management architectures should aim to coexist with the IT ecosystem that companies have consolidated over time and, ideally, improve it. Back-office systems, by their nature, bear significant computational loads and suffer recurring capacity problems. This clashes with the now pervasive need to keep digital touchpoints synchronized in near real time, both to offer customers a holistic experience that satisfies their needs and to support strategic decisions based on so-called ‘fresh data’.

In 2020, BNL launched a project to build a platform capable of offloading data from its mainframe, processing it in real time, and then consolidating and aggregating it to reconstitute events across different domains, making them available to banking applications. The solution relies on change data capture, implemented with Qlik Replicate. The new platform has Confluent Kafka at its core and has given open banking systems a constantly updated view of the customer (personal data, account movements, purchased products, KYC, …), achieving the objectives mentioned above while also reducing mainframe-related load and costs.
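The article does not describe how the consolidation layer was implemented, but the pattern it outlines (CDC topics landed on Kafka, then joined and aggregated into a customer view) can be illustrated with a minimal Kafka Streams sketch. Everything below is an assumption for illustration only: the topic names (`cdc.customer.personal-data`, `cdc.account.movements`, `customer.view.events`), the use of Java and Kafka Streams, and the JSON-as-string payloads are not taken from the source.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class CustomerViewTopology {

    public static void main(String[] args) {
        // Illustrative configuration; broker address and application id are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "customer-view-builder");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // CDC changelog of personal data, modelled as a table keyed by customer id
        // (topic name is hypothetical, not from the article).
        KTable<String, String> personalData = builder.table("cdc.customer.personal-data");

        // Account movements as an event stream keyed by customer id (hypothetical topic).
        KStream<String, String> movements = builder.stream("cdc.account.movements");

        // Enrich each movement with the latest personal-data snapshot for that customer,
        // producing a consolidated event for downstream open-banking applications.
        KStream<String, String> enriched = movements.join(
                personalData,
                (movement, profile) -> "{\"movement\":" + movement + ",\"customer\":" + profile + "}");

        enriched.to("customer.view.events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In a real deployment the payloads would more likely be Avro or JSON Schema records registered in a schema registry, and the join would feed materialized views per domain rather than a single output topic; the sketch only shows the stream-table join pattern that such a consolidation layer typically relies on.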