Basics of Quantum Computing
Source: Finance Derivative
Martin Lukac, Associate Professor from Nazarbayev University School of Engineering and Digital Sciences
A quantum computer is a device that performs quantum computations, harnessing the behaviour of atomic and subatomic particles to perform high-speed parallel computing.
Conceptually introduced by Richard Feynman in the 1980s as a method for solving instances of the many-body problem, it was not until recently that quantum computing became widely known to the public. The many-body problem is a general name for a large category of physical problems represented by systems of microscopic interacting particles.
When compared to other technology candidates designed to tackle the heat dissipation and Moore's-law limits of current transistor-based computers, such as DNA computing, 3D transistors or carbon nanotubes, quantum computing has several advantages not available to these “more classical” technologies.
These advantages can be described by four basic postulates defining the principles and possibilities of quantum computing.
The first postulate regards information representation. Classical information in digital computers is represented by logical binary digits (bits). A logical bit can take the value of 1 or 0 depending on whether the voltage in the wire of a logic circuit is High or Low; think of the classic binary coding of 1s and 0s. In contrast, a quantum bit (qubit) is represented by a quantum state described by a wave equation: |ψ⟩ = α|0⟩ + β|1⟩, with α and β being complex numbers subject to |α|² + |β|² = 1. The quantum state specified by this wave equation is a point on the surface of something called a Bloch sphere.
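To make this first postulate concrete, here is a minimal sketch of a single qubit stored as two complex amplitudes, with a check of the normalization constraint. The use of Python and NumPy is an assumption for illustration; the article names no particular tools.

```python
import numpy as np

# A single-qubit state |psi> = alpha|0> + beta|1>, stored as two complex amplitudes.
alpha = 1 / np.sqrt(2)
beta = 1j / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

# The constraint |alpha|^2 + |beta|^2 = 1 must hold for the state to be a valid
# point on the surface of the Bloch sphere.
norm = np.sum(np.abs(state) ** 2)
print(f"|alpha|^2 + |beta|^2 = {norm:.3f}")  # -> 1.000
```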
The second postulate expands on the idea of quantum states: when multiple qubits are used together, the space of their states expands exponentially. This means that a set of n qubits can represent in superposition all 2^n combinations of the basis states.
The third postulate specifies that qubits and their states are manipulated using a set of unitary matrix operators: square matrices whose inverse is their conjugate transpose. These matrix operators rotate the qubit state about the axes of the Bloch sphere.
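As an illustration of the third postulate, the sketch below applies two common single-qubit gates, Pauli-X and Hadamard, and verifies that each is unitary. The specific gates and the NumPy representation are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Two common single-qubit gates written as 2x2 unitary matrices.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)                 # Pauli-X: a rotation about the x-axis
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard: creates a superposition from |0>

state = np.array([1, 0], dtype=complex)  # start in the basis state |0>

for gate in (X, H):
    # Unitarity check: the conjugate transpose of the gate is its inverse.
    assert np.allclose(gate.conj().T @ gate, np.eye(2))
    state = gate @ state  # rotate the qubit state

print(state)  # amplitudes after applying X and then H
```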
The final postulate indicates that quantum information exists only as long as it is not observed. This means that the quantum state of a qubit can contain both of the basis states |0⟩ and |1⟩ at the same time, but when one wants to read the quantum state, the result will be either 0 or 1.
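The measurement postulate can be sketched as follows: each read of the state yields a single classical bit, with probabilities given by the squared amplitude magnitudes, and only repeated measurements reveal the underlying superposition. This is a hypothetical illustration, again assuming NumPy.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# An equal superposition: the state "contains" both basis states at once.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Reading the state yields a single classical bit, 0 or 1, with probabilities
# |alpha|^2 and |beta|^2; the superposition itself is never observed directly.
probabilities = np.abs(state) ** 2
outcomes = rng.choice([0, 1], size=1000, p=probabilities)
print("fraction of 1s over 1000 reads:", outcomes.mean())  # ~0.5
```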
The advantages of quantum computing were first demonstrated by David Deutsch’s algorithm, followed by the Deutsch-Jozsa algorithm, which showed that a quantum computer can answer in a single computational step the question of whether a given function is balanced or constant. A balanced function outputs 0 for exactly half of its inputs and 1 for the other half, while a constant function outputs the same value (0 or 1) for every input. For a classical computer to determine the answer, it would in the worst case need to examine one more than half of the outputs of the given function: if more than half of the examined outputs are the same, the function is constant; if any two differ, it is balanced.
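For contrast with the single quantum query, here is a hypothetical sketch of the classical procedure described above; the function names and the 3-bit examples are invented for illustration.

```python
from typing import Callable

def classify_classically(f: Callable[[int], int], n_bits: int) -> str:
    """Decide whether f (promised to be balanced or constant) is constant or balanced.

    Worst case: 2**(n_bits - 1) + 1 queries. The Deutsch-Jozsa algorithm answers
    the same question with a single quantum evaluation of f.
    """
    first = f(0)
    for x in range(1, 2 ** (n_bits - 1) + 1):
        if f(x) != first:
            return "balanced"   # two differing outputs rule out a constant function
    return "constant"           # more than half of the outputs agree, so f must be constant

# Usage: a constant function and a balanced one on 3 bits.
print(classify_classically(lambda x: 1, 3))       # constant
print(classify_classically(lambda x: x & 1, 3))   # balanced (parity of the last bit)
```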
Developments in the nineties, and leading up to today, popularized quantum computing further. Peter Shor’s algorithm for exponentially accelerating integer factorization, Lov Grover’s quadratically accelerated search of an unordered database, and, more recently, the demonstration of quantum supremacy have made quantum computing very attractive to the wider research and investor community.
While Shor’s algorithm is one of the main motivations behind large governmental funding for quantum computing (think exponentially accelerated decryption of current encryption standards), Grover’s algorithm’s wider applicability has spurred a large number of search-acceleration optimizations. Finally, the demonstration of quantum supremacy showed that it is indeed possible to construct quantum computers that perform certain tasks much faster than any classical computer.
There are two main reasons why quantum computers have found it difficult to break into the mainstream. Firstly, quantum computing computes in the quantum space. This implies that classical inputs have to be prepared (made quantum before processing) and quantum outputs of the computation have to be measured to be made classical and, therefore, available for further processing. This severely limits the amount of information that can be extracted from the quantum states, which, in turn, limits the possible acceleration of computing using quantum computers.
Secondly, quantum states require an almost perfect vacuum and near-absolute-zero temperatures, and they tend not to remain in the desired quantum state due to decoherence, which means that qubits interacting with the environment lose information. These issues are being gradually solved by progress in materials science and by improving the control protocols of quantum operations.
Near-future applications are already visible in the form of quantum security, quantum communication, quantum cryptography and large-scale quantum computation. Quantum computing has the potential to solve many of today’s big data problems by accelerating processing and storing data in an even denser space. Quantum supercomputers are expected to be the first of these to appear, within the next 10 years.
Enhancing cybersecurity in investment firms as new regulations come into force
Source: Finance Derivative
Christian Scott, COO/CISO at Gotham Security, an Abacus Group Company
The alternative investment industry is a prime target for cyber breaches. February’s ransomware attack on global financial software firm ION Group was a warning to the wider sector. Russia-linked LockBit Ransomware-as-a-Service (RaaS) affiliate hackers disrupted trading activities in international markets, with firms forced to fall back on expensive, inefficient, and potentially non-compliant manual reporting methods. Not only do attacks like these put critical business operations under threat, but firms also risk falling foul of regulations if they lack a sufficient incident response plan.
To ensure that firms protect client assets and keep pace with evolving challenges, the Securities and Exchange Commission (SEC) has proposed new cybersecurity requirements for registered advisors and funds. Codifying previous guidance into non-negotiable rules, these requirements will cover every aspect of the security lifecycle and the specific processes a firm implements, encompassing written policies and procedures, transparent governance records, and the timely disclosure of all material cybersecurity incidents to regulators and investors. Failure to comply with the rules could carry significant financial, legal, and national security implications.
The proposed SEC rules are expected to come into force in the coming months, following a notice and comment period. However, businesses should not drag their feet in making the necessary adjustments – the SEC has also introduced an extensive lookback period preceding the implementation of the rules, meaning that organisations should already be proving they are meeting these heightened demands.
For investment firms, regulatory developments such as these will help boost cyber resilience and client confidence in the safety of investments. However, with a clear expectation that firms should be well aligned to the requirements already, many will need to proactively step up their security oversight and strengthen their technologies, policies, end-user education, and incident response procedures. So, how can organisations prepare for enforcement and maintain compliance in a shifting regulatory landscape?
In today’s complex, fast-changing, and interconnected business environment, the alternative investment sector must continually take account of its evolving risk profile. Additionally, as organisations shift towards more distributed and flexible ways of working, traditional protection perimeters are dissolving, rendering firms more vulnerable to cyber-attack.
As such, the new SEC rules provide firms with additional instruction around very specific prescriptive requirements. Organisations need to implement and maintain robust written policies and procedures that closely align with ground-level security issues and industry best practices, such as the NIST Cybersecurity Framework. Firms must also be ready to gather and present evidence that proves they are following these watertight policies and procedures on a day-to-day basis. With much less room for ambiguity or assumption, the SEC will scrutinise security policies for detail on how a firm is dealing with cyber risks. Documentation must therefore include comprehensive coverage for business continuity planning and incident response.
As cyber risk management comes increasingly under the spotlight, firms need to ensure it is fully incorporated as a ‘business as usual’ process. This involves the continual tracking and categorisation of evolving vulnerabilities – not just from a technology perspective, but also from an administrative and physical standpoint. Regular risk assessments must include real-time threat and vulnerability management to detect, mitigate, and remediate cybersecurity risks.
Another crucial aspect of the new rules is the need to report any ‘material’ cybersecurity incidents to investors and regulators within a 48-hour timeframe – a small window for busy investment firms. Meeting this tight deadline will require firms to quickly pull data from many different sources, as the SEC will demand to know what happened, how the incident was addressed, and its specific impacts. Teams will need to be assembled well in advance, working together seamlessly to record, process, summarise, and report key information in a squeezed timeframe.
Funds and advisors will also need to provide prospective and current investors with updated disclosures on previously disclosed cybersecurity incidents over the past two fiscal years. With security leaders increasingly being held to account over lack of disclosure, failure to report incidents at board level could even be considered an act of fraud.
Organisations must now take proactive steps to prepare and respond effectively to these upcoming regulatory changes. Cybersecurity policies, incident response, and continuity plans need to be written up and closely aligned with business objectives. These policies and procedures should be backed up with robust evidence that shows organisations are actually following the documentation – firms need to prove it, not just say it. Carefully thought-out policies will also provide the foundation for organisations to evolve their posture as cyber threats escalate and regulatory demands change.
Robust cybersecurity risk assessments and continuous vulnerability management must also be in place. The first stage of mitigating a cyber risk is understanding the threat – and this requires in-depth real-time insights on how the attack surface is changing. Internal and external systems should be regularly scanned, and firms must integrate third-party and vendor risk assessments to identify any potential supply chain weaknesses.
Network and cloud penetration testing is another key tenet of compliance. By imitating how an attacker would exploit a vantage point, organisations can check for any weak spots in their strategy before malicious actors attempt to gain an advantage. Due to the rise of ransomware, phishing, and other sophisticated cyber threats, social engineering testing should be conducted alongside conventional penetration testing to cover every attack vector.
It must also be remembered that security and compliance are the responsibility of every person in the organisation. End-user education is a necessity as regulations evolve, as are multi-layered training exercises. This means bringing in immersive simulations, tabletop exercises and real-world examples of security incidents to inform employees of the potential risks and the role they play in protecting the company.
To successfully navigate the SEC cybersecurity rules – and prepare for future regulatory changes – alternative investment firms must ensure that security is woven into every part of the business. They can do this by establishing robust written policies and adhering to them, conducting regular penetration testing and vulnerability scanning, and ensuring the ongoing education and training of employees.
Gearing up for growth amid economic pressure: 10 top tips for maintaining control of IT costs
Source: Finance Derivative
By Dirk Martin, CEO and Founder of Serviceware
Three years on from the pandemic, economic pressure is continuing to mount. With the ongoing threat of a global recession looming, inflation rising, and supply chain disruption continuing to take its toll, cutting costs and optimizing budgets remains a top priority amongst the C-suite. Amid such turbulence, the Chief Financial Officer (CFO) and Chief Information Officer (CIO) stand firmly at the business’s helm, not only to steady the ship but to steer it into safer, more profitable waters. These vital roles have truly been pulled into the spotlight in recent years, with new hurdles and challenges being constantly thrown their way. This spring, for example, experts expect British businesses to face an energy-cost cliff edge as the winter support package set out by the government is replaced.
Whilst purse strings are being drawn ever tighter to overcome these obstacles, there is no denying that the digitalization and innovation spurred on by the pandemic are still gaining momentum. In fact, according to Gartner, four out of five CEOs are increasing digital technology investments to counter current economic pressures. Investing in a digital future, driven by technologies such as the cloud, artificial intelligence (AI), blockchain and the Internet of Things (IoT), however, comes at a cost, and to be able to do so, funds must be released through effective optimization of existing assets.
With that in mind, and with the deluge of cost and vendor data descending on businesses that adopt these technologies, never has it been more important for CIOs and CFOs to have a complete, detailed and transparent view of all IT costs. In doing so, business leaders can not only identify the right investment areas but also increase the performance of existing systems and technology to tackle the impact of spiralling running costs.
Follow the below 10 steps to gain a comprehensive, detailed and transparent overview of all IT costs to boost business performance and enable your IT to reach the next level.
1: Develop an extensive IT service and product catalogue
The development of an IT service and product catalogue is the most effective way to kick-start your cost-optimization journey. This catalogue should act as a precise overview of all individual IT services and what they entail, directly linking IT service costs to IT service performance and value. By offering a clear set of standards for what services are available and what they comprise, consumers can gain an understanding of the costs and values of the IT services they deploy.
2: Monitor IT costs closely
By mastering the value chain, a concept that aims to visualise the flow of IT costs from their most basic units through to realised business units and capabilities, businesses can keep track of where IT costs stem from. With the help of service catalogues, benchmarks, and a cost model focused on digital value in IT Financial Management (ITFM) or what is often referred to as Technology Business Management (TBM) solutions, comprehensive access to this data can be guaranteed, creating a ‘cost-to-service flow’ that identifies and controls where IT costs arise.
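As a simple illustration of such a cost-to-service flow, unit-level costs can be rolled up to the services that consume them. The service names, cost units and figures below are invented for the example, not taken from the article.

```python
# Hypothetical cost-to-service flow: roll up basic cost units to the IT services
# that consume them. Names and figures are illustrative only.
cost_units = {
    "compute": 120_000,       # annual cost
    "storage": 45_000,
    "support_staff": 80_000,
}

# Share of each cost unit consumed by each IT service.
consumption = {
    "email_service":  {"compute": 0.10, "storage": 0.30, "support_staff": 0.25},
    "trading_portal": {"compute": 0.60, "storage": 0.50, "support_staff": 0.40},
}

service_costs = {
    service: sum(cost_units[unit] * share for unit, share in shares.items())
    for service, shares in consumption.items()
}
print(service_costs)
# {'email_service': 45500.0, 'trading_portal': 126500.0}
```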
3: Determine IT budget management
Knowledge of IT cost allocation is a vital factor when making informed spending decisions and adjustments to existing budgets. There are, however, different approaches that can be taken to this, including centralized, decentralized and iterative. A centralized approach means that the budget is determined in advance and distributed to operating cost centres and projects in a top-down process, allowing for easy, tight budget allocation. A decentralized approach reverses this process: operating costs are precisely calculated before budgets and projects are determined. Both approaches come with their own risks: the centralized approach risks overlooking projects that offer potential growth opportunities, while the decentralized approach risks budget demands that exceed the available resources.
The iterative approach tries to unify both methods. Although the most lucrative approach, it also requires the most resources. So, the chosen approach is very much dependent on the available resources, and the enterprise’s structural organization.
4: Defining ‘run’ vs ‘grow’ costs
Before the IT budget can be allocated, costs should be split into two distinct categories: running costs (i.e. operating costs) and costs for growing the business (i.e. products or services used to transform or grow the business). Once these categories have been defined, decisions should be made on how the budget should be split between them. A 70% run/30% grow split is fairly typical across most enterprises, but there is no one-size-fits-all approach, and this decision should be centred around the business’s overall strategy and end goals.
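A minimal sketch of the run/grow split described above follows; the spend items and amounts are hypothetical, and the 70/30 ratio is only the typical figure the article mentions, not a rule.

```python
# Hypothetical spend items tagged as 'run' or 'grow'; figures are illustrative.
spend = [
    ("data centre operations", "run",  950_000),
    ("licence renewals",       "run",  430_000),
    ("cloud migration",        "grow", 380_000),
    ("analytics platform",     "grow", 240_000),
]

totals = {"run": 0, "grow": 0}
for _name, category, amount in spend:
    totals[category] += amount

overall = sum(totals.values())
for category, amount in totals.items():
    print(f"{category}: {amount:,} ({amount / overall:.0%})")
# run: 1,380,000 (69%)
# grow: 620,000 (31%)  -- close to the typical 70/30 split
```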
5: Ensuring investments result in a profit
By carrying out the aforementioned steps, complete transparency can be achieved over which products and services are offered, where IT costs stem from, and where budgets are allocated. From here, organizations can review how much of the IT budget is being used and where costs lead to profits and losses. If the profit margin is positive, the controlling processes can be further optimized; if it is negative, appropriate and timely corrective measures can be initiated.
6: Staying on top of regulation
For a company that operates internationally (e.g. one that markets IT products and services abroad), it is extremely important to stay on top of country-specific compliance and adhere to varying international tax rules. To do so correctly, it is necessary to provide correct transfer price documentation. This requires three factors:
- Transparent analysis and calculation of IT services based on the value chain
- Evaluation of the services used and the associated billing processes
- Access to the management of service contracts between providers and consumers as the legal basis for IT services.
7: Stay competitive
Closely linked to the profit margin mentioned in step five is the question of how to price IT services in order to stay competitive whilst avoiding losses. This begins with benchmark data, which can be researched or determined using existing ITFM solutions that can automatically extract it from different, interconnected databases. From there, a unit cost calculation can be used to define exactly and effectively what individual IT services – and their preliminary products – cost. This allows organizations to easily compare internal unit cost calculations with benchmarks and competitor prices before making pricing decisions.
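A hypothetical sketch of that unit-cost comparison is shown below; the service names, volumes and benchmark prices are invented for illustration.

```python
# Compare internal unit costs against external benchmark prices (illustrative data).
services = {
    # service: (total annual cost, units delivered, benchmark price per unit)
    "managed_desktop": (600_000, 1_200, 520.0),
    "vm_hosting":      (275_000, 1_000, 250.0),
}

for name, (total_cost, units, benchmark) in services.items():
    unit_cost = total_cost / units
    delta = (unit_cost - benchmark) / benchmark
    print(f"{name}: {unit_cost:.0f} per unit vs benchmark {benchmark:.0f} ({delta:+.0%})")
# managed_desktop: 500 per unit vs benchmark 520 (-4%)  -> priced competitively
# vm_hosting:      275 per unit vs benchmark 250 (+10%) -> candidate for cost reduction
```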
8: Identify and maintain key cost drivers
Another aspect of IT cost control that is streamlined by a comprehensive assessment of the cost-to-service flow is the identification and management of the main IT cost drivers. A properly modelled value chain makes it clear which IT services, associated preliminary products and cost centres incur the greatest costs and why. This analysis allows for precise adjustments to expenditure and helps to avoid misunderstandings about cost drivers. Using this as a basis, strategies can be developed to reduce IT costs effectively and determine a better use of expensive resources.
9: Showback/Chargeback IT costs
By controlling IT costs using the value chain, efficient usage-based billing and invoicing of IT services and products can be achieved. If IT costs are visualized transparently, they can easily be assigned to IT customers, therefore increasing the clarity of the billing process, and providing opportunities to analyze the value of IT in more detail. When informing managers and users about their consumption there are two options: either through the ‘showback’ process – highlighting the costs generated and how they are incurred – or through the ‘chargeback’ process, in which costs incurred are sent directly to customers and subcontractors.
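A usage-based showback/chargeback allocation might look like the following sketch; the business units, consumption figures and unit rates are all hypothetical.

```python
# Usage-based chargeback: allocate IT service costs to the business units that
# consumed them. All names and figures are illustrative.
unit_rates = {"vm_hosting": 275.0, "managed_desktop": 500.0}  # cost per unit consumed

usage_by_business_unit = {
    "front_office": {"vm_hosting": 120, "managed_desktop": 200},
    "operations":   {"vm_hosting": 60,  "managed_desktop": 350},
}

for business_unit, usage in usage_by_business_unit.items():
    invoice = {svc: qty * unit_rates[svc] for svc, qty in usage.items()}
    total = sum(invoice.values())
    # Showback: report the breakdown to the consumer; chargeback: actually bill the total.
    print(business_unit, invoice, f"total: {total:,.0f}")
```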
10: Analyse supply vs. demand
By following the processes above, transparency regarding IT cost control is further extended and discussions around the value of IT services are made possible across the organization. A more holistic analysis of IT service consumption allows conclusions to be drawn promptly to enable the optimization of supply and demand for IT services in various business areas. This, in turn, will enable a more comprehensive value analysis and optimization of IT service utilization.
Following these 10 cost management steps, a secure, transparent, and sustainable IT cost control environment can be developed, resulting in fully optimized budgets and, in turn, significant cost savings. Cost-cutting aside, automating the financial management process in such an environment can boost productivity substantially, freeing up time to focus on valuable work and thus leading to overall business growth.
The business and economic landscape is full of uncertainty right now, but business leaders can regain control via cost management, not only to weather current storms but to set themselves up for success beyond today’s turbulence.
Mortgage digitalization: How mortgage lenders are automating the lending process
Source: Finance Derivative
By Fernando Zandona, Chief Product and Technology Officer at Mambu
The mortgage market has a long history, but its future is digital. As tech capabilities grow and consumer expectations evolve, mortgage providers are increasingly turning to digital solutions to attract and retain customers and streamline the lending process. According to research from the 2022 Celent Origination Study, over half of banks and 75% of building societies expect to make significant changes to their mortgage origination systems within 24 months. So, how is the mortgage industry transforming and what must lenders do to future-proof their business?
The acceleration of digitalisation in mortgage lending
There are several factors that have accelerated the digitalisation of mortgage lending. One is changes to consumer behaviour: customers have come to expect smooth digital experiences across all areas of their life (accelerated by the pandemic). As such, they seek similar ease, speed and efficiency when it comes to home buying.
Then there’s the arrival of fintechs. Newer fintechs are beginning to enter the mortgage sector – often through acquisitions, such as Starling Bank’s acquisition of Fleet Mortgages or Zoopla acquiring YourKeys. They are also bringing with them innovative digital solutions, which raise the bar for the whole industry. At the same time, regulatory changes are helping accelerate and facilitate digitalisation, such as the Bank of England’s decision to withdraw its affordability test recommendation and cut some of the red tape around mortgage lending, and HM Land Registry’s acceptance of electronic signatures. The combination of these forces has played a significant role in accelerating the lending process and making it more efficient.
Today’s financial institutions are offering a wide range of digital options, through online and mobile platforms, to their mortgage customers. Services include easier ways for customers to access and manage their mortgages, schedule a session with a mortgage advisor, find personalised recommendations, and access improved security measures to protect sensitive customer information.
That’s not to mention that the embrace of open banking has enabled seamless integration of customer data into the lending process. This innovation is helping reduce the number of steps needed to collect data, resulting in faster processing times, less rekeying of information and lower origination costs. Offering faster, cheaper loan decisions is a crucial advantage in an increasingly crowded mortgage market, and automated processes reduce teams’ manual work and eliminate costly human errors.
Digitalising in the right way
The success of these new products and processes relies on the way mortgage lenders introduce and configure them. Agility is key – lenders need to prioritise configurability and scalability when building new products and choosing technology partners, as they must be able to quickly launch new features or make adjustments, in line with evolving customer expectations, emerging trends and changing industry regulations. The use of software-as-a-service (SaaS) platforms and application programming interface (API) integrations helps with this, allowing for faster feature launches and less internal friction.
APIs are just part of future-proofing the mortgage market. According to Forbes, 55% of senior executives in the US mortgage industry think that AI will make their firm, and the industry overall, more competitive. AI and machine learning can assist lenders in analysing data more quickly, leading to more efficient decision-making and forecasting, although as with all AI applications, providers must be vigilant about encoded bias that can radically increase discrimination.
The mortgage landscape is transforming through digitalisation, and this is bound to continue. Lenders who want to keep pace with this change – and reap the benefits of faster, smoother processes as well as keep satisfied, loyal customers – will be future-proofing their processes through lending automation and putting customer ease at the centre of their offering.