Source: Finance Derivative
Authored by Jason Hill, Reply
Financial services companies are no strangers to complex algorithms, but even today’s most sophisticated software can only analyse a fraction of available data. However, quantum computing is about to change all that. Quantum computers will far surpass the limitations of classical computers: performing complex tasks within minutes and completing actions that were once deemed impossible. So what does this mean for financial services?
Use cases in quantum
The potential impact of quantum computing can hardly be overstated. Just as the development of the microprocessor in the 1970s unlocked the power of personal computers to the average end user, and the proliferation of the Internet in the 1990s revolutionised the way the world communicates, quantum computing, with its vastly superior processing power, will have a transformative impact on virtually every industry and individual.
In financial services, quantum computing has myriad applications, including reinforcing cyber security, targeting investments, profiling risk, optimising portfolios and managing liquidity, from context-defining indicators to collateral optimisation.
In portfolio optimisation, for example, quantum computing could be used to limit a company’s exposure by identifying a portfolio of assets with minimal correlation between them. This is particularly useful when diversifying a portfolio of securities to reduce any risk that might impact return. Furthermore, as the size of an investment portfolio increases, so does the complexity of the computational problem. Quantum computing can quickly solve problems that would take days, months or even years on traditional computers.
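To illustrate the classical problem underneath, the sketch below computes pairwise Pearson correlations between asset return series and picks the least correlated pair. Extending this kind of search over all candidate subsets of a large portfolio grows combinatorially, which is exactly the bottleneck quantum optimisers target. All data and names here are hypothetical, not taken from any real engagement.

```typescript
// Classical sketch: Pearson correlation between return series, plus a
// brute-force search for the least correlated pair of assets.
function pearson(a: number[], b: number[]): number {
  const n = a.length;
  const meanA = a.reduce((s, x) => s + x, 0) / n;
  const meanB = b.reduce((s, x) => s + x, 0) / n;
  let cov = 0, varA = 0, varB = 0;
  for (let i = 0; i < n; i++) {
    cov += (a[i] - meanA) * (b[i] - meanB);
    varA += (a[i] - meanA) ** 2;
    varB += (b[i] - meanB) ** 2;
  }
  return cov / Math.sqrt(varA * varB);
}

// Indices of the two series with the lowest correlation coefficient.
function leastCorrelatedPair(series: number[][]): [number, number] {
  let best: [number, number] = [0, 1];
  let bestCorr = Infinity;
  for (let i = 0; i < series.length; i++) {
    for (let j = i + 1; j < series.length; j++) {
      const c = pearson(series[i], series[j]);
      if (c < bestCorr) {
        bestCorr = c;
        best = [i, j];
      }
    }
  }
  return best;
}
```

Checking every pair is quadratic in the number of assets; checking every admissible subset is exponential, which is why the problem is usually cast as combinatorial optimisation.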
Quantum will ultimately help financial institutions prepare for their future and get ahead of their competition by knowing more, more quickly. For example, Reply recently worked with a credit institution to develop a quantum computing algorithm that allowed it to optimise daily collateral costs related to over-the-counter derivatives trading. This took into account non-linearities in the model and involved a dedicated simulation-based optimisation tool to plan for multiple scenarios.
From quantum computing to predictive analytics
One particularly interesting application of quantum computing is predictive analytics, which can be used to forecast future events based on past data. Quantum computing can even help users make smart assumptions about data that doesn’t exist. For example, a bank’s cash flow can be projected using the so-called Monte Carlo method, which involves building a clear, statistical picture from a high number of simulations. Monte Carlo simulations are a form of predictive analytics and, because they require a lot of calculations (with potentially many variables), quantum can process them much faster. This is particularly useful in portfolio management as, for example, it allows an analyst to determine the size of the portfolio a client would need at critical times, such as at retirement, to support their desired lifestyle.
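The sketch below is a purely classical Monte Carlo illustration of the idea (quantum algorithms such as amplitude estimation promise a quadratic speed-up over this style of sampling). It projects an end balance over a horizon of random monthly returns and estimates the probability of falling short of a target; every figure and parameter name is hypothetical.

```typescript
// Small deterministic linear congruential generator so runs are repeatable.
function makeRng(seed: number): () => number {
  let state = seed >>> 0;
  return () => {
    state = (state * 1664525 + 1013904223) >>> 0;
    return state / 2 ** 32;
  };
}

// Estimate P(end balance < target) by simulating many random paths.
function probabilityBelowTarget(
  start: number,       // starting balance
  monthlyFlow: number, // net deposit (+) or withdrawal (-) per month
  months: number,
  meanReturn: number,  // mean monthly return, e.g. 0.004
  volatility: number,  // monthly standard deviation, e.g. 0.03
  target: number,
  trials: number,
  seed = 42,
): number {
  const rng = makeRng(seed);
  let below = 0;
  for (let t = 0; t < trials; t++) {
    let balance = start;
    for (let m = 0; m < months; m++) {
      // Box-Muller transform: turn two uniform draws into a normal draw.
      const u1 = Math.max(rng(), 1e-12);
      const u2 = rng();
      const z = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
      balance = balance * (1 + meanReturn + volatility * z) + monthlyFlow;
    }
    if (balance < target) below++;
  }
  return below / trials;
}
```

The estimate’s error shrinks only as the square root of the number of trials, which is why large, multi-variable simulations become so expensive classically.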
Financial companies aren’t at a loss for historical data sources: contracts, transactions, inquiries, and claims. These are the solution to a more certain future. By learning from past knowledge companies can make future estimations with higher accuracy.
Where can we go from here?
The performance of quantum computers far exceeds what is possible today. The range of problems that can be addressed thanks to quantum computing is broad: it does not stop at combinatorial optimisation but crosses into other areas such as machine learning and quantum security. Quantum neural networks and quantum internet networks are just two of the more interesting ones.
Quantum machine learning (QML), makes the most of the advantages of two current themes: quantum computing and machine learning. Although QML is still in its early stages, it nevertheless offers a whole new world of opportunities, combining the new knowledge provided by machine learning with the accelerated calculation potential and the enhanced accuracy of quantum calculations.
It is not a trade secret that today, all major financial services companies have departments focused on Big Data so they can benefit from the huge amount of data collected over the decades. And with the increase in remote cloud computing power, much more complex prediction models can be employed.
The race is now on for the companies that provide quantum computing solutions to fully realise their potential, but once those solutions are in place, adopters will have a huge advantage over their competitors. It makes sense for them to partner early with companies who have existing use cases in quantum computing, because the companies that adopt Big Data and machine learning processes will build more commercially efficient offerings that will have customers lining up.
Building a Greener Web: Six Ways to Put Your Website on an Emissions Diet
By Roberta Haseleu, Practice Lead Green Technology at Reply, Fiorenza Oppici, Live Reply, and Lars Trebing, Vanilla Reply
Most people are unaware of, or underestimate, the impact of the IT sector on the environment. According to the BBC: “If we were to rather crudely divide the 1.7 billion tonnes of greenhouse gas emissions estimated to be produced in the manufacture and running of digital technologies between all internet users around the world, it would mean each of us is responsible for 414kg of carbon dioxide a year.” That’s equivalent to 4.7bn people charging their smartphone 50,000 times.
Every web page produces a carbon footprint that varies depending on its design and development. This deserves closer consideration, as building an energy-efficient website also increases loading speeds, which leads to better performance and user experience.
Following are six practical steps developers can take to reduce the environmental impact of their websites.
- Implement modularisation
With traditional websites that don’t rely on single page apps, each page and view of the site is saved in individual HTML files. The code only runs, and the data is only downloaded, for the page that the user is visiting, avoiding unnecessary requests. This reduces transmitted data volume and saves energy.
However, this principle is no longer the standard in modern web design, which is dominated by single page apps that dynamically display all content to the user at runtime. This approach is easier and faster to code and more user-friendly but, without any precautions, it creates unnecessary overheads. In the worst case, accessing the homepage of a website may trigger the transmission of the entire code of the application, including parts that may not be needed.
Modularisation can help. By dividing the code of a website into different modules, i.e. coherent code sections, only the relevant code is referenced. Using modules offers distinct benefits: they keep the scope of the app clean and prevent ‘scope creep’; they are loaded automatically after the page has been parsed but before the Document Object Model (DOM) is rendered; and, most importantly for green design, they facilitate ‘lazy loading’.
- Adopt lazy loading
The term lazy loading describes a strategy of only loading resources at the moment they are needed. This way, a large image at the bottom of the page will not be loaded unless the user scrolls down to that section.
If a website only consists of a routing module and an app module which contain all views, the site will become very heavy and slow at first load. Smart modularisation, breaking down the site into smaller parts, in combination with lazy loading can help to load only the relevant content when the user is viewing that part of the page.
However, this should not be taken to extremes either: in some instances, loading each resource only at the last moment while scrolling can wipe out the performance gains and result in higher server and network loads. It’s important to find the right balance based on a good understanding of how the app will be used in real life (e.g. whether users will generally rather continue to the next page after a quick first glance, or scroll all the way down before moving on).
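A minimal sketch of the lazy-loading pattern via dynamic `import()`: the dependency is fetched and parsed only the first time it is actually used, then cached. Here the Node built-in `node:zlib` stands in for a heavy feature module; in a browser app the same pattern defers a route or widget bundle until the user needs it. Function and type names are illustrative.

```typescript
// Lazily load a compression helper: node:zlib stands in for a large
// feature module that should not be part of the initial download.
type Compressor = (input: string) => Buffer;

let cached: Compressor | null = null;

async function getCompressor(): Promise<Compressor> {
  if (cached === null) {
    // The module is fetched and parsed only when this line first runs.
    const zlib = await import("node:zlib");
    cached = (input: string) => zlib.gzipSync(Buffer.from(input));
  }
  return cached;
}
```

Bundlers such as webpack split every dynamic `import()` into its own chunk automatically, so the pattern doubles as a chunking hint.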
- Monitor build size
Pre-processors can be configured to prevent a build from completing if its files are bigger than a configurable threshold. Limits can be set both for the main boot script and for individual CSS chunks, so that neither exceeds a specific byte size after compilation. Any build surpassing those thresholds fails with a warning.
If a build is suspiciously big, a web designer can inspect it and identify which module contributes the most, as well as all its interdependencies. This information allows the programmer to optimise the parts of the website in question.
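As one concrete way to set such thresholds, webpack ships a built-in `performance` option that can fail the build when the entrypoint or any emitted asset grows past a byte limit. The sketch below (a `webpack.config.ts` fragment) uses illustrative numbers, not recommendations; Angular’s `budgets` setting offers a similar mechanism.

```typescript
// webpack.config.ts -- build-size budgets via webpack's performance option.
import type { Configuration } from "webpack";

const config: Configuration = {
  performance: {
    hints: "error",             // fail the build instead of just warning
    maxEntrypointSize: 250_000, // main boot script: at most ~250 kB
    maxAssetSize: 100_000,      // any single emitted asset: at most ~100 kB
  },
};

export default config;
```

Pairing this with a bundle analyser makes it easy to see which module pushed a build over its budget.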
- Eliminate unused code
One potential reason for excessive build sizes can be dozens of configuration files and code meant for scenarios that are never needed. Despite never being executed, this code still takes up bandwidth, thereby consuming extra energy.
Unused parts can be found in one’s own source code, but also (and often to a greater extent) in external libraries used as dependencies. Luckily, a technique called ‘tree shaking’ can be used to analyse the code and mark which parts are not referenced by other portions of the code.
Modern pre-processors perform ‘tree shaking’ to identify unused code and to exclude it automatically from the build. This allows them to package only those parts of the code that are needed at runtime – but only if the code is modularised.
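A tiny illustration of why modularisation enables this: with named ES module exports, a bundler such as webpack or Rollup can statically see which exports are imported and drop the rest from the production bundle. The module and function names below are made up for the example.

```typescript
// format.ts -- a utility module with two named exports. If the app only
// ever imports formatBalance, a tree-shaking bundler drops formatIban
// from the final bundle, since it is statically provably unreferenced.
export function formatBalance(cents: number, currency: string): string {
  return `${(cents / 100).toFixed(2)} ${currency}`;
}

export function formatIban(iban: string): string {
  // Group an IBAN into blocks of four characters for display.
  return iban.replace(/\s+/g, "").replace(/(.{4})/g, "$1 ").trim();
}
```

Note that this static analysis only works with ES module syntax; dynamic patterns such as CommonJS `require` calls generally defeat it.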
- Choose external libraries wisely
One common way to speed up the development process is to use external libraries. They provide ready-to-use utilities written and tested by other people. However, some of these libraries can be unexpectedly heavy and weigh your code down.
One popular example is Moment.js, a very versatile legacy library for handling international date formats and time zones. Unfortunately, it is also quite large. Moreover, it is neither very compatible with the typical TypeScript world nor modular, so even the best pre-processors cannot reduce the weight it adds by means of ‘tree shaking’.
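For many date-formatting needs, the built-in `Intl.DateTimeFormat` API already handles locales and time zones with zero bundle weight. The sketch below is an illustration, not a full Moment.js replacement (libraries like Luxon or date-fns cover richer use cases while remaining tree-shakeable).

```typescript
// Format a fixed instant for a given locale and time zone using the
// built-in Intl API -- no external date library required.
function formatInstant(iso: string, locale: string, timeZone: string): string {
  const fmt = new Intl.DateTimeFormat(locale, {
    dateStyle: "medium",
    timeStyle: "short",
    timeZone,
  });
  return fmt.format(new Date(iso));
}
```

For example, `formatInstant("2023-01-15T12:00:00Z", "en-GB", "UTC")` yields a medium date with a short time in the British locale.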
- Optimise content
Designs can also be optimised by avoiding excessive use of images and video material. Heavy use of animation gimmicks such as parallax scrolling also has a negative effect. Depending on the implementation, such animations can massively increase the CPU and GPU load on the client. To test this, consider running the website on a computer that is five to ten years old. If scrolling is not smooth and/or the fans jump to maximum speed, this is a very good indication of optimisation potential.
The amount of energy that a website consumes — and thus its carbon footprint — depends, among other factors, on the amount of data that needs to be transmitted to display the requested content to users. By leveraging the six techniques outlined above, web designers can ‘slim’ their websites and contribute to the creation of a more sustainable web whilst boosting performance and user experience in the process.
The Role of Software Development in Shaping the FinTech Industry in 2023 and Beyond
Source: Finance Derivative
Paul Blowers, Commercial Director at Future Processing
As another year passes, now is the time for company leaders to look back at the last 12 months and consider what’s in store for their FinTech businesses in 2023. One of the biggest impacts of last year was undoubtedly the cost of living crisis and increasing interest rates, leading to UK FinTech investment dropping to $9.6 billion in the first half of 2022 – down from $27.8 billion in the same period in 2021. Whilst these challenges remain at the forefront of the industry, there are plenty of innovative developments and technologies evolving in the FinTech space right now that will continue the pace of change. It’s vital for organisations to keep abreast of these trends, to ensure they can remain competitive and continue providing customers with the highest quality products and services.
Innovations in FinTech
In recent years, we have seen larger banks begin to invest more heavily in BaaS (banking as a service). BaaS is a start-to-finish process that digital banks and third parties use to connect their own business infrastructure to a bank’s system via APIs. This allows digital banks or third parties to offer full-banking services directly through their non-bank business offerings. Typically, BaaS is associated with smaller banks because of the favourable interchange rates available to banks with under $10 billion in assets. With a bigger focus on commercial BaaS efforts, we can expect to see more vertical partnerships with SaaS providers who already have existing relationships with businesses.
An alternative to providing BaaS is to pursue an embedded FinTech strategy. Embedded FinTech refers to the integration of FinTech products and services into financial institutions’ websites, mobile apps, and business processes. This has been growing at pace since the COVID pandemic and is expected to continue on its upward trajectory, accelerating eCommerce, financial digitalisation and consumer expectations. As a result, we can expect that more platforms will be diversifying their service offerings as they deepen their relationships with small business customers.
Another topic that has been circulating in the FinTech sector is the adoption of AI and chatbots. 2023 is set to be the year that this technology fully takes off and integrates with mainstream banks and FinTechs. Chatbots can be defined as rule-based systems which can perform routine tasks with general FAQs. The primary goal of these AI-driven chatbots is to provide human-like support for customers, communicating with them, introducing services, answering their questions and receiving any complaints.
Software Development for FinTechs
As banks continue to invest in new technologies and leverage the benefits of adopting BaaS, embedded finance and AI, the focus on software development services also increases. Software is at the heart of every FinTech business, as both existing and potential customers demand a high-quality implementation of each product or service. One of the biggest expectations is around user experience, as FinTech leaders aim to provide a straightforward, transparent and concise solution to their customers’ business problems. Additionally, the importance of security cannot be overstated, with the FinTech industry under constant risk of cyberattacks and breaches. With exceptional software development, FinTech solutions can comply with strict security and data encryption standards whilst offering a polished and streamlined user experience for customers.
The finance industry also comes up against a constant stream of industry regulations, meaning a compliance strategy must be a priority for FinTechs when considering their software development approach. This means checking and implementing updates for frameworks and software architectures regularly to ensure app responsiveness, security and performance remain at the forefront. Ultimately, great software increases a FinTech’s opportunity to leverage emerging technologies and keep control over the quality of its service. Timely identification of key trends makes it possible to maximise the digitalisation of finance to drive long-term value for FinTech businesses and their customers.
The Future of FinTech
Whilst technological developments have been major drivers of FinTech innovation, now is the time to further digitise financial services and the banking sector to build a more inclusive and efficient industry that promotes economic growth. FinTechs are stepping up to lead, navigate and disrupt the industry during this time of uncertainty, and software development will play a vital role in shaping the future landscape. With the help of software development, FinTechs will build capabilities and applications that can be easily integrated into the environments where customers are already engaged, meeting their changing needs, new business goals and regulatory demands.
Will cyberattacks be uninsurable in 2023? Three steps that financial organisations can follow now
Source: Finance Derivative
By James Blake, Field CISO of EMEA, Cohesity
The growing number of cyber attacks and subsequent damage has led to an increasing demand for cyber insurance. Swiss Re Insurance expects total premiums paid to more than double from $10 billion in 2020 to $23 billion by 2025. But this is being questioned by both insurance companies and by customers: is insurance effective, is it feasible, what does it cover and what does it enable? The CEO of Zurich Insurance, Mario Greco, said in a recent interview with the Financial Times that cyberattacks will soon become “uninsurable”. Indeed, insurance and prevention have both proved ineffective in stopping cyberattacks like ransomware or in enabling organisations to recover afterwards. Instead, organisations must shift their focus onto recovery. What can companies do to meet this challenge? James Blake, Field CISO of EMEA at data management and security provider Cohesity, has three recommendations.
More than 400 million US dollars – that’s how much damage the data leak at Capital One caused in 2019. And the number of such attacks, which have catastrophic consequences for the companies affected, has continued to increase since then. According to Check Point, in the third quarter of 2022 alone, global attacks increased significantly by 28% compared to the same quarter of the previous year.
Where cyber risk used to be limited to areas such as data breaches and third-party liability, ransomware attacks have shifted the damage to core business and accountability. Cyber insurers had to react to the increased risk and have adjusted their offers, as an analysis by Swiss Re Insurance shows. According to PWC, from the insurer perspective, the fast-increasing frequency of ransomware attacks (and the growing associated impacts and ransom demands) and business interruption claims has resulted in cyber becoming a less profitable area of insurance in recent times. The situation has stabilised over the past year as customers have had to pay higher premiums and meet stricter terms and conditions. Swiss Re Insurance expects total premiums paid to more than double from $10 billion to $23 billion by 2025.
More expensive and more difficult to qualify
This is bad news for the financial industry, as insurers are becoming stricter and asking for higher premiums. Cohesity’s legal experts looked at the leading ransomware insurance policies on the market at the end of 2022 and found that ultimately, such guarantees are little more than thinly veiled limitations of liability that benefit the providers – not the customers.
However, there are some measures that companies can use to protect themselves effectively in this new market situation:
- The 3-2-1 strategy remains current: keep an isolated copy of the data
In some cases, organisations are required to quarantine an offsite copy of their production records as part of a 3-2-1 strategy to qualify for cyber insurance.
To do this, they can use a SaaS service which keeps an encrypted copy of the production data in the cloud, isolated by a virtual air gap. The data stored there is monitored with multi-layered security functions and machine learning, and anomalies are reported immediately.
- Tear down silos and merge data with zero-trust in mind
In general, financial organisations should consolidate all their distributed data on a scalable data management platform and ensure they can back up their data across all their infrastructure and assets. Furthermore, the data must be protected under a zero trust model, in which data is encrypted in transit and at rest and access is strictly regulated with rules and multi-factor authentication. In addition, all data stored on the platform can be managed according to compliance requirements and, thanks to immutable storage, is better protected against ransomware.
- Improve collaboration between IT and SecOps teams for cyber resiliency
In addition to these technical measures, financial organisations should optimise the collaboration between their IT and security teams and adopt a data-centric focus on cyber resilience. For too long, many security teams have focused primarily on preventing cyberattacks while IT teams have focused on protecting data, including backup and recovery.
A comprehensive data security strategy must unite these two worlds and IT and SecOps teams must work together before the attack takes place. Both teams should be guided by the NIST framework. This holistic approach defines five core disciplines: Identify, Protect, Detect, Respond and Recover.
If a financial company can demonstrate such a mature data security strategy, this will not only have a positive effect on insurance cover, but will generally reduce the risk of incidents and possible consequential damage through failure or data loss.