

2023 Tech and Industry Predictions from Teradata Experts

By: Teradata experts

From advances in AI/ML tied to digital twins and simulations, to satellite/cellular partnerships extending coverage to remote or under-served areas, our tech and industry experts weigh in on what they think are the game-changing predictions for 2023.

Technology & Business

Dan Spurling, SVP, Product Engineering

Trusted Social Connectedness: While Twitter is imploding and social media is generally seen as a negative, I believe that humans still crave connectedness in this space, especially when it is intentionally curated to ensure dependence. We will require both 1) greater transparency into who is behind the information we consume, and 2) some form of security and privacy shared only with those whom we select (an obvious potential conflict).

Digital Twins: I believe there will be advances in ML/AI tied to digital twins and simulations, moving beyond sensors that predict machine failure or buying propensities and into predictions of economic markets, food production, population health, and more.

Data Reduction: The amount of data is increasing exponentially, but I believe we will see the rise of solutions that distil the meaningful bits from the overall mass of data collected, or even shrink the data footprint using new technologies beyond classic data storage techniques.

Personal Security: (Unfortunately) driven by greater government destabilisation and the associated erosion of trust in government, I believe we will see increasing tech advances in the areas of personal security and security monitoring.

Risk Aversion: I predict reduced willingness to take large risks or to invest in risky ideas. This will increase the success of entrenched incumbents, slow the broad adoption of new tech across large enterprises, and result in reduced startup growth and flat-to-growing revenue for large software and service providers.

Michael Hay, VP, Product Management

Consolidation, Concurrency and Currency: With the looming recession, there is a natural tendency to figure out how to do more with less – how to focus on profit over growth. As a result, customers will shrink their footprints and seek to do the same or more work. This speaks to deploying data and analytics systems that can incrementally scale but return a benefit significantly larger than the nominal incremental investment. Put another way: platforms that are cheaper to query and experiment on, and that avoid the data-copy tax, will win.

More, not fewer, cloud providers:

Two global patterns – increased protectionism and a strong shift towards profitability to weather the looming recession – point to the genesis of more, not fewer, cloud providers. These new providers can be one of:
• General providers focused on meeting country- or region-specific protectionist policies and avoiding laws and regulations with global reach, like the US CLOUD Act.
• Cloud provider plays that emphasise a special focus on unique industry requirements. For example, Energy or Healthcare companies could shift their business towards providing cloud and analytics services with acute emphasis on their respective industries and regulatory regimes.
• SaaS companies who have reached sufficient scale and must become profitable to survive.
These providers will be looking for software and services that enable them to succeed as cloud providers, and the companies capable of supplying them will win.


Mike Skypala, Industry Lead, EMEA

Hybrid is here to stay: People are now using both online and offline formats to shop, with in-store experiences seen as a chance to touch, feel and see the products. Many retailers are following IKEA’s lead by showing consumers what a full “at-home room” could look like in their retail spaces, making it a more visually led interaction. This blended approach to shopping is likely to stick around, which adds complexity for retailers looking to track and interact with customers along their purchase journey and to understand the profitability of each channel. Analytics will help retailers comprehend these shifts and changes in behaviour.

Cost-conscious shopping will intensify in 2023: As the cost-of-living crisis continues, there will be a sustained focus on value and cost-effective shopping as we head into the new year. The launch of an “Essentials” range in almost every supermarket speaks to this ongoing focus, and consumer spending on non-essential goods, including fashion, homeware and beauty, is likely to continue to fall. As a result, retailers should ensure a steady flow of canned foods and cupboard essentials, as these remain the priority items for many.

Sustainability remains a priority: Though sustainability has been at the forefront of consumer minds for years now, we’ve yet to see it truly become a systemic part of a retailer’s business, baked into every decision made; instead, it is often a siloed group of ad-hoc initiatives. By collecting and examining data on a range of sustainability-related issues — from energy use and carbon emissions to mobile consumption habits — companies can generate insights that drive their sustainability initiatives and inform their long-term strategy. It’s likely that some form of legislative policy will come into force within the coming year or two, meaning retailers will have to reach a certain level of sustainable practice in order to keep trading.

Convenience shopping is set to get more convenient: It’s likely that automatic, “scan as you go” and self-checkout options will continue to increase around the country as consumers demand more convenient, faster and streamlined shopping experiences. There’s an opportunity for retailers to expand on personalisation elements in real time, based on actions as consumers walk round the shop, moving away from static data and towards contextual data. Additionally, the U.S. is leading the way with computer vision and smart trolleys in particular, which track both what is being put in a shopper’s trolley and what needs replenishing on the shelves.

Dave Spear & David King, Senior Industry Consultants for the Retail, CPG & Hospitality Industries at Teradata

Revenge of the CEO: Unlimited free returns? 15-minute delivery? Metaverse? Expect intense scrutiny from Finance on the ROI and NPV of such investments, with a tougher hurdle due to rising interest rates. Expect “sure” cost reduction proposals to win over “wishful” growth projects as investors crave cashflow and profitability.

Healthy Dose of Retail: Health retailing continues to blur the line between traditional healthcare providers and general retailers. We’ll see more small and large acquisitions by companies like Amazon, Walmart, Target, CVS and Walgreens, all trying to deliver new health services at affordable prices.

QR Beyond James Bond: QR-codes make a giant leap forward in retail. These square codes will unlock huge amounts of data for consumers to engage with and fuel new innovation in supply chain analytics.

Techies More Approachable: Silicon Valley layoffs and tougher work policies provide a window for traditionally less sexy retail tech teams to attract strong talent on the rebound.


Nadine Manjaro, Director, Industry Consultant in Telecommunications and IoT

Fixed Wireless Access: In 2023, US operators will deploy more Fixed Wireless Access (FWA) solutions. They will focus on streamlining offers to areas where they have excess network capacity, to prevent negative impacts on mobile voice and data services. T-Mobile will continue to lead in the US, with over 1.5 million FWA customers as of September 2022, followed by Verizon with 1 million. Both companies have publicly shared FWA subscriber projections: Verizon plans to reach 4 to 5 million subscribers by 2025, and T-Mobile plans to reach 7 to 8 million within a similar period.

Private 5G: There will also be an expansion of Private 5G services in manufacturing and retail enterprises to optimise manufacturing processes and retail experiences. Large enterprises are seeking end-to-end visibility throughout the manufacturing process as well as the supply chain. Private 5G will enable more consistent coverage and support more advanced capabilities such as machine vision analytics, which enables manufacturers to spot defects earlier and take corrective action before the product reaches finished-goods status.

Cellular/Satellite Partnerships: Expect expansion of cellular/satellite partnerships to extend coverage to remote and underserved areas. SpaceX and T-Mobile are teaming up to deploy cellular systems on low-orbit satellites, which will fill in coverage gaps in remote areas along less-travelled roads, national parks, and deserts.

Telcos in the cloud: Many telcos will continue migrating their data to the cloud as a means of reducing costs and enabling wider use of data insights for decision-making across departments. However, because some of the selected providers demonstrated value only with small, limited workloads, telcos will encounter migration issues, cost overruns and performance limitations as they scale those workloads.

Security: Security management will continue to be a major concern in terms of who has access to the environment, and this will delay the movement of some workloads to the cloud. The next-generation data architecture will be a multi-cloud, hybrid (cloud plus on-prem), multi-vendor ecosystem that enables internal enterprise data marketplaces.

ARPU erosion: In the US, mobile data, and voice ARPU will decrease as operators compete to win subscribers in an oversubscribed market. Customers are more cost conscious because of inflationary pressures and will be more likely to switch providers based on free device offers and lower service charges. This will drive operators to lower the cost of mobile services which will erode ARPU.

C-band deployments: Verizon and AT&T will continue to expand C-band deployments to cover a larger segment of the US population and to gain ground on T-Mobile, which has the best spectrum assets in the low and mid bands. They will also need C-band to expand Fixed Wireless Access services with higher data rates than lower-band spectrum can offer.

Consumers win: Consumers will benefit from lower prices and better service. Those in remote areas with limited access to broadband will have more options through FWA and satellite-to-cellular partnerships such as the announced collaboration between T-Mobile and SpaceX Starlink. As more devices with both satellite and cellular capabilities proliferate, users will be able to access service from anywhere on earth, even at sea. In addition, businesses will be able to track shipments across an entire route without coverage gaps. Initial coverage will start with text and multimedia messaging and will later expand to voice and data.


John Matthews, Managing Director Healthcare & Life Sciences

Shifts to digital: We will continue to see more shifts to digital settings across industries, but in particular for healthcare as virtual visits and digital consults have made a huge difference in a supply constrained regulated environment. Who wants to actually drive to the doctor when one can video chat just as effectively for many needs?

The politics of healthcare: The politics of healthcare remain, so we’ll continue to see big fights over government spending on Medicare and Medicaid, as well as increasing debate over drug pricing. This fight – the lobbying dollars, the election-season megaphones – will simply not go away as entrenched interests, change agents, and economic realities contend in the public square.


Simon Axon, Industry Consulting Director, EMEA

ESG will continue to define banking in 2023: Governments and world leaders are under increasing pressure to implement stronger regulation and legislation that will demonstrate real change and commitment. Ultimately, governments see financial services as a vehicle to implement net zero policies, as well as to accelerate the path to net zero. We will see the cost of money becoming much higher for carbon damaging activity in the coming year, with more favourable rates provided to those implementing sustainable activities. To do so, banks will need granular information on a host of factors that determine the level of environmental impacts over time and risk across all sectors and all kinds of assets and investments.

Disruption as the “New Normal”: The repeated disruptions felt as a result of COVID-19, Brexit, war and political turmoil have, unsurprisingly, had a detrimental impact on the financial industry – as we’re seeing now with the ongoing rise of inflation and the increased cost of living. While ad-hoc crises are nothing new, these back-to-back and sometimes simultaneous crises are not something the industry has ever had to contend with. In 2023, the banking industry will need to further adapt as the definition of who is categorised as a ‘vulnerable’ customer changes. Banks will need smarter analytics in order to identify these vulnerable customers, with new factors calculating these scores, centred around reliability of income, as opposed to income vs. expenditure. The data needed to understand your customer base, therefore, will need to be more nuanced than it previously was.



Enhancing cybersecurity in investment firms as new regulations come into force

Source: Finance Derivative

Christian Scott, COO/CISO at Gotham Security, an Abacus Group Company


The alternative investment industry is a prime target for cyber breaches. February’s ransomware attack on global financial software firm ION Group was a warning to the wider sector. Russia-linked LockBit Ransomware-as-a-Service (RaaS) affiliate hackers disrupted trading activities in international markets, with firms forced to fall back on expensive, inefficient, and potentially non-compliant manual reporting methods. Not only do attacks like these put critical business operations under threat, but firms also risk falling foul of regulations if they lack a sufficient incident response plan. 

To ensure that firms protect client assets and keep pace with evolving challenges, the Securities and Exchange Commission (SEC) has proposed new cybersecurity requirements for registered advisors and funds. Codifying previous guidance into non-negotiable rules, these requirements will cover every aspect of the security lifecycle and the specific processes a firm implements, encompassing written policies and procedures, transparent governance records, and the timely disclosure of all material cybersecurity incidents to regulators and investors. Failure to comply with the rules could carry significant financial, legal, and national security implications.

The proposed SEC rules are expected to come into force in the coming months, following a notice and comment period. However, businesses should not drag their feet in making the necessary adjustments – the SEC has also introduced an extensive lookback period preceding the implementation of the rules, meaning that organisations should already be proving they are meeting these heightened demands.

For investment firms, regulatory developments such as these will help boost cyber resilience and client confidence in the safety of investments. However, with a clear expectation that firms should be well aligned to the requirements already, many will need to proactively step up their security oversight and strengthen their technologies, policies, end-user education, and incident response procedures. So, how can organisations prepare for enforcement and maintain compliance in a shifting regulatory landscape?


Changing demands

In today’s complex, fast-changing, and interconnected business environment, the alternative investment sector must continually take account of its evolving risk profile. Additionally, as more organisations shift towards distributed and flexible ways of working, traditional protection perimeters are dissolving, rendering firms more vulnerable to cyber-attack.

As such, the new SEC rules provide firms with additional instruction around specific, prescriptive requirements. Organisations need to implement and maintain robust written policies and procedures that closely align with ground-level security issues and industry best practices, such as the NIST Cybersecurity Framework. Firms must also be ready to gather and present evidence that proves they are following these watertight policies and procedures on a day-to-day basis. With much less room for ambiguity or assumption, the SEC will scrutinise security policies for detail on how a firm is dealing with cyber risks. Documentation must therefore include comprehensive coverage of business continuity planning and incident response.

As cyber risk management comes increasingly under the spotlight, firms need to ensure it is fully incorporated as a ‘business as usual’ process. This involves the continual tracking and categorisation of evolving vulnerabilities – not just from a technology perspective, but also from an administrative and physical standpoint. Regular risk assessments must include real-time threat and vulnerability management to detect, mitigate, and remediate cybersecurity risks.  

Another crucial aspect of the new rules is the need to report any ‘material’ cybersecurity incidents to investors and regulators within a 48-hour timeframe – a small window for busy investment firms. Meeting this tight deadline will require firms to quickly pull data from many different sources, as the SEC will demand to know what happened, how the incident was addressed, and its specific impacts. Teams will need to be assembled well in advance, working together seamlessly to record, process, summarise, and report key information in a squeezed timeframe.

Funds and advisors will also need to provide prospective and current investors with updated disclosures on previously disclosed cybersecurity incidents over the past two fiscal years. With security leaders increasingly being held to account over lack of disclosure, failure to report incidents at board level could even be considered an act of fraud. 


Keeping pace

Organisations must now take proactive steps to prepare and respond effectively to these upcoming regulatory changes. Cybersecurity policies, incident response, and continuity plans need to be written up and closely aligned with business objectives. These policies and procedures should be backed up with robust evidence that shows organisations are actually following the documentation – firms need to prove it, not just say it. Carefully thought-out policies will also provide the foundation for organisations to evolve their posture as cyber threats escalate and regulatory demands change.

Robust cybersecurity risk assessments and continuous vulnerability management must also be in place. The first stage of mitigating a cyber risk is understanding the threat – and this requires in-depth real-time insights on how the attack surface is changing. Internal and external systems should be regularly scanned, and firms must integrate third-party and vendor risk assessments to identify any potential supply chain weaknesses.

Network and cloud penetration testing is another key tenet of compliance. By imitating how an attacker would exploit a vantage point, organisations can check for any weak spots in their strategy before malicious actors attempt to gain an advantage. Due to the rise of ransomware, phishing, and other sophisticated cyber threats, social engineering testing should be conducted alongside conventional penetration testing to cover every attack vector.

It must also be remembered that security and compliance are the responsibility of every person in the organisation. End-user education is a necessity as regulations evolve, as are multi-layered training exercises. This means bringing in immersive simulations, tabletop exercises and real-world examples of security incidents to inform employees of the potential risks and the role they play in protecting the company.

To successfully navigate the SEC cybersecurity rules – and prepare for future regulatory changes – alternative investment firms must ensure that security is woven into every part of the business. They can do this by establishing robust written policies and adhering to them, conducting regular penetration testing and vulnerability scanning, and ensuring the ongoing education and training of employees.



Gearing up for growth amid economic pressure: 10 top tips for maintaining control of IT costs

Source: Finance Derivative

By Dirk Martin, CEO and Founder of Serviceware

Three years on from the pandemic, economic pressure is continuing to mount. With the ongoing threat of a global recession looming, inflation rising, and supply chain disruption continuing to take its toll, cutting costs and optimizing budgets remains a top priority among the C-suite. Amid such turbulence, the Chief Financial Officer (CFO) and Chief Information Officer (CIO) stand firmly at the business’s helm, not only to steady the ship but to steer it into safer, more profitable waters. These vital roles have truly been pulled into the spotlight in recent years, with new hurdles and challenges constantly thrown their way. This spring, for example, experts expect British businesses to face an energy-cost cliff edge as the winter support package set out by the government is replaced.

Whilst purse strings are being drawn ever tighter to overcome these obstacles, there is no denying that the digitalization and innovation spurred on by the pandemic are still gaining momentum. In fact, according to Gartner, four out of five CEOs are increasing digital technology investments to counter current economic pressures. Investing in a digital future driven by technologies such as cloud, artificial intelligence (AI), blockchain and the Internet of Things (IoT), however, comes at a cost, and to afford it, funds must be released through effective optimization of existing assets.

With that in mind, and with the deluge of cost and vendor data descending on businesses who adopt these technologies, never has it been more important for CIOs and CFOs to have a complete, detailed and transparent view of all IT costs. In doing so, business leaders can not only identify the right investment areas but increase the performance of existing systems and technology to tackle the impact of spiralling running costs.

Follow the below 10 steps to gain a comprehensive, detailed and transparent overview of all IT costs to boost business performance and enable your IT to reach the next level.

1: Develop an extensive IT service and product catalogue

The development of an IT service and product catalogue is the most effective way to kick-start your cost-optimization journey. This catalogue should act as a precise overview of all individual IT services and what they entail to directly link IT service costs to IT service performance and value. By offering a clear set of standards as to what services are available and comprised of, consumers can gain an understanding of the costs and values of the IT services they deploy.

2: Monitor IT costs closely

By mastering the value chain – a concept that visualises the flow of IT costs from their most basic singular units through to realised business units and capabilities – businesses can keep track of where IT costs stem from. With the help of service catalogues, benchmarks, and a cost model focused on digital value in IT Financial Management (ITFM) – often referred to as Technology Business Management (TBM) – solutions can guarantee comprehensive access to this data, creating a ‘cost-to-service flow’ that identifies and controls the availability of IT costs.
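The ‘cost-to-service flow’ described above can be sketched as a simple roll-up: unit costs (servers, storage, licences) feed IT services, and services feed business capabilities. The following is a minimal Python illustration; all service names and figures are invented for the example:

```python
# Illustrative cost-to-service roll-up: unit costs feed services,
# and services feed business capabilities.
# All names and figures below are hypothetical.

unit_costs = {"server": 1200.0, "storage": 300.0, "db_licence": 800.0}

# Each IT service consumes a quantity of each basic unit.
services = {
    "crm": {"server": 2, "storage": 4, "db_licence": 1},
    "reporting": {"server": 1, "storage": 2, "db_licence": 1},
}

# Each business capability is built from one or more services.
capabilities = {"sales": ["crm", "reporting"]}

def service_cost(name: str) -> float:
    """Roll unit costs up to a single IT service."""
    return sum(qty * unit_costs[u] for u, qty in services[name].items())

def capability_cost(name: str) -> float:
    """Roll service costs up to a business capability."""
    return sum(service_cost(s) for s in capabilities[name])

print(service_cost("crm"))       # 2*1200 + 4*300 + 1*800 = 4400.0
print(capability_cost("sales"))  # 4400.0 + 2600.0 = 7000.0
```

In a real ITFM/TBM tool the same roll-up runs over thousands of cost records, but the principle is identical: every business-facing cost remains traceable back to the unit costs that produced it.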

3: Determine IT budget management

Knowledge of IT cost allocation is a vital factor when making informed spending decisions and adjustments to existing budgets. There are, however, different approaches to this, including centralized, decentralized and iterative. A centralized approach means the budget is determined in advance and distributed to operating cost centres and projects in a top-down process, allowing for easy, tight budget allocation. A decentralized approach reverses this: operating costs are precisely calculated before budgets and projects are determined. Both approaches come with their own risks – for centralized, overlooking projects that offer potential growth opportunities; for decentralized, budget demands that might exceed available resources.

The iterative approach tries to unify both methods. Although the most lucrative approach, it also requires the most resources, so the choice depends very much on the resources available and the enterprise’s structural organization.

4: Defining ‘run’ vs ‘grow’ costs

Before IT budget can be allocated, costs should be split into two distinct categories: running costs (i.e. operating costs) and costs for growing the business (i.e. products or services used to transform or grow the business). Once these categories have been defined, decisions should be made on how the budget should be split between them. A 70% run/30% grow split is fairly typical across most enterprises, but there is no one-size-fits-all approach, and this decision should be centred around the businesses’ overall strategies and end goals.
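As a trivial illustration of the split described above, the sketch below divides a hypothetical budget using the typical 70/30 ratio; the budget figure and ratio are invented for the example:

```python
# Split a total IT budget into 'run' (operating) and 'grow'
# (transformation) buckets. Ratio and budget are illustrative only.

def split_budget(total: float, run_share: float = 0.70) -> tuple[float, float]:
    """Return (run, grow) amounts for a given total budget."""
    run = total * run_share
    grow = total - run
    return run, grow

run, grow = split_budget(10_000_000)
print(f"run: {run:,.0f}  grow: {grow:,.0f}")
```

The `run_share` parameter is the strategic lever: a business prioritising transformation might lower it to 0.60, shifting more budget into the ‘grow’ bucket, in line with the point that there is no one-size-fits-all split.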

5: Ensuring investments result in a profit

By carrying out the aforementioned steps, complete transparency can be achieved over which products and services are offered, where IT costs stem from, and where budgets are allocated. From here, organizations can review how much of the IT budget is being used and where costs lead to profits and losses. If the profit margin is positive, the controlling processes can be further optimized. If the profit margin is negative, appropriate and timely corrective measures can be initiated.

6: Staying on top of regulation

For a company that operates internationally (e.g., one that markets IT products and services abroad), it is extremely important to stay on top of country-specific compliance and adhere to varying international tax rules. Doing so correctly requires proper transfer price documentation, which in turn requires three factors:

  1. Transparent analysis and calculation of IT services based on the value chain
  2. Evaluation of the services used and the associated billing processes
  3. Access to the management of service contracts between providers and consumers as the legal basis for IT services.

7: Stay competitive

Closely linked to the profit mentioned in step five is the question of how to price IT services in order to stay competitive whilst avoiding losses. This begins with benchmark data which can be researched or determined using existing ITFM solutions that can automatically extract them from different – interconnected – databases. From there, a unit cost calculation can be used to define exactly and effectively what individual IT services – and their preliminary products – cost. This allows organizations to easily compare internal unit cost calculations with the benchmarks and competitor prices, before making pricing decisions.
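A unit-cost comparison of this kind can be as simple as the sketch below; the service names, internal unit costs and benchmark figures are all hypothetical:

```python
# Compare internal unit costs for IT services against external
# benchmark prices before making pricing decisions.
# All services and figures are hypothetical examples.

internal_unit_cost = {"vm_hour": 0.095, "ticket": 14.50, "mailbox_month": 3.20}
benchmark = {"vm_hour": 0.080, "ticket": 16.00, "mailbox_month": 3.00}

for service, cost in internal_unit_cost.items():
    # Relative deviation of the internal unit cost from the benchmark.
    delta = (cost - benchmark[service]) / benchmark[service]
    flag = "above benchmark" if delta > 0 else "at/below benchmark"
    print(f"{service}: {delta:+.1%} vs benchmark ({flag})")
```

Services flagged as well above benchmark are candidates for cost reduction or re-pricing; in practice the benchmark figures would come from research or be extracted automatically by an ITFM solution, as the step describes.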

8: Identify and maintain key cost drivers

Another aspect of IT cost control that is streamlined via the comprehensive assessment of the cost-to-service flow is the identification and management of main IT cost drivers. A properly modelled value chain makes it clear which IT services or associated preliminary products and cost centres incur the greatest costs and why. This analysis allows for concise adjustment to expenditure and helps to avoid misunderstandings about cost drivers. Using this as a basis, strategies can be developed to reduce IT costs effectively and determine a better use of expensive resources.

9: Showback/Chargeback IT costs

By controlling IT costs using the value chain, efficient usage-based billing and invoicing of IT services and products can be achieved. If IT costs are visualized transparently, they can easily be assigned to IT customers, therefore increasing the clarity of the billing process, and providing opportunities to analyze the value of IT in more detail. When informing managers and users about their consumption there are two options: either through the ‘showback’ process – highlighting the costs generated and how they are incurred – or through the ‘chargeback’ process, in which costs incurred are sent directly to customers and subcontractors.

10: Analyse supply vs. demand

By following the processes above, transparency regarding IT cost control is further extended and discussions around the value of IT services are made possible across the organization. A more holistic analysis of IT service consumption allows conclusions to be drawn promptly to enable the optimization of supply and demand for IT services in various business areas. This, in turn, will enable a more comprehensive value analysis and optimization of IT service utilization.

Following these 10 cost-management steps, a secure, transparent, and sustainable IT cost control environment can be developed, resulting in fully optimized budgets and, in turn, significant cost savings. Cost-cutting aside, automating the financial management process in such an environment can boost productivity substantially, freeing up time to focus on valuable work and leading to overall business growth.

The business and economic landscape is full of uncertainty right now, but business leaders can regain control via cost management, not only to weather current storms but to set themselves up for success beyond today’s turbulence.



Banking on legacy – The risks posed by ‘stone age’ banking infrastructure

Source: Finance Derivative

By Andreas Wuchner, Angel Investor of Venari Security


If you consider the most significant motivating factors behind cyber-attacks – the promise of large financial reward and the opportunity to cause maximum business and social disruption – it’s little wonder that banks and financial institutions are amongst the most inviting targets for would-be cyber criminals. In fact, according to IBM’s recent report, ‘banking and finance’ was the most attacked industry for the five years between 2015 and 2020 – surpassed only by threats to critical infrastructure in recent years. Successful attacks can provide aggressors with a mass of sensitive personal and financial information, and even access to people’s money itself. Furthermore, a suspension of withdrawals and deposits can cause huge social disruption and reputational damage. 

As banks have reacted to years of new regulation and emerging technologies, they often operate with hugely complicated and disparate technology estates. This provides malicious actors with a wealth of potential attack vectors. A small breach anywhere in this network can have enormous consequences and lead to entire systems being overrun. As such, it’s crucial that security teams operate with the highest-grade security possible, including ensuring the strongest encryption standards. Banks need to look beyond regulatory tick-box commitments and ensure they are taking proactive and preventative steps to monitor and combat malicious attacks across their entire network.


However, the ability to react to cyber-threats across a vast estate requires speed and flexibility to quickly update security protocols. The sheer volume of legacy infrastructure slows this process down considerably, leaving many security teams in a vicious cycle.

The threat of legacy infrastructure

A sizeable proportion of the banking industry still relies on systems first developed more than 40 years ago. In fact, many ‘core banking’ systems, like payments, loans, mortgages and the associated technologies, are still coded using COBOL (Common Business-Oriented Language), an otherwise defunct programming language that is older than the internet itself. In the UK and Europe, COBOL remains the ‘backbone of banking services,’ while in the USA, as much as 43% of banking systems are built on COBOL, meaning it underpins much of our financial system.

This presents a huge security risk. While code has been updated regularly over the years, these systems were built when security threats were far less sophisticated and less well-financed, and the burden of data was far less pronounced. For several years, governments have pointed to legacy systems built using COBOL as a major cybersecurity threat, incompatible with modern security best practices and solutions such as multi-factor authentication. For example, data from Kaspersky found that businesses with outdated technology are much more likely to have suffered a data breach (65%) than those who keep their technology updated (29%).

A further security consideration is the diminishing number of people trained to maintain COBOL systems. Every year, experienced professionals exit the industry, making it increasingly difficult to service legacy technologies and creating significant delays in patching threats once they’re identified. This short supply of sufficiently trained experts, and the demand they face, makes any updates extremely expensive and time-consuming.

Furthermore, legacy infrastructure is preventing the secure application of encryption, posing its own distinct cybersecurity and regulatory risks. Encryption is often heralded as a silver bullet solution for data privacy and has been a continuing area of focus for regulatory bodies in recent years. However, banks remain guilty of poor deployment, maintenance and management of encryption – using outdated protocols and inefficient methods of analysing and understanding network traffic. This, coupled with legacy ‘core banking’ systems that are incompatible with modern encryption techniques, equates to a regulatory and security headache for security teams.
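As a concrete illustration of the “outdated protocols” problem described above, the sketch below (mine, not from the article) uses Python’s standard `ssl` module to flag legacy TLS/SSL versions and to build a client context that refuses to negotiate them. The protocol names and cut-off at TLS 1.2 reflect common current guidance, not anything the article specifies.

```python
import ssl

# Protocol versions generally considered outdated for sensitive traffic
# (illustrative list; SSLv2/v3 and TLS 1.0/1.1 are formally deprecated).
OUTDATED_PROTOCOLS = {"SSLv2", "SSLv3", "TLSv1", "TLSv1.1"}

def is_outdated(protocol_name: str) -> bool:
    """Return True if a negotiated protocol version is below TLS 1.2."""
    return protocol_name in OUTDATED_PROTOCOLS

def modern_client_context() -> ssl.SSLContext:
    """Build a client SSLContext that rejects legacy protocol versions."""
    ctx = ssl.create_default_context()
    # Refuse anything older than TLS 1.2 at the handshake.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

In practice, a monitoring team might apply a check like `is_outdated()` to the protocol version reported for each observed connection, turning the vague goal of “strong encryption standards” into an auditable rule.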

Adopting a new mindset

The risks posed by legacy systems, and the volume of cybersecurity threats facing banks, mean a concerted rethink of overall cybersecurity strategy is needed to prevent breaches and ensure data is protected long-term. Traditionally, banks have taken an ‘outside-in’ view – dedicating capacity, finances and knowledge to dealing with existing, known and well-publicised threats. However, to aid long-term security, this should be superseded by a proactive ‘inside-out’ approach, whereby security teams are cognisant of their own internal systems and where the key vulnerabilities lie. Once banks have a detailed view of the security risks posed by their legacy systems, and specifically what data is threatened, they can address flaws, update these systems and build a stronger overall security posture.

The secure path ahead

Many of our successful high-street banks today have centuries of experience in dealing with social, economic and regulatory upheaval. However, the rapid development and deployment of technology continues to present a unique challenge. Many ‘traditional’ banks have built a complex technology infrastructure through decades of adjustment to new legislation and emerging technologies. While these systems were serviceable in the past, fintech start-ups are now pushing their long-term viability to the limit.

Challenger banks have the luxury of being built from the ground up, prioritising convenient digital services and features, and modern security processes. As the user base of these banks increases, customers increasingly expect the same features and security from their existing banks, adding even more complexity to legacy infrastructure. As outlined by Deloitte, existing firms simply aren’t positioned to support the rising expectations of the market, exposing banks to additional risk and liability.

What’s more, it’s estimated that banks spend as much as 80% of their yearly IT budgets on the maintenance of legacy systems. While an immediate switch away from these systems is unrealistic, there is an opportunity to reduce waste and divert spend towards modernisation efforts. However, while traditional banks may want to adapt more quickly to technological advancements, they need to do so while continuing to minimise cyber risk and without jeopardising the security of their data or systems. This means placing cybersecurity at the heart of any modernisation effort and maintaining a steady rate of change. As more of the technology estate is modernised, the potential risks of regulatory non-compliance will also reduce.

Legacy systems need a considered update

Banking systems have relied heavily on legacy infrastructure for too long, bringing difficulties both in maintaining the highest-grade cybersecurity and in facilitating innovation. The risks presented by novel cybersecurity attack vectors, and competition from the new and emerging digital services offered by challenger banks, are exacerbating these issues. As such, legacy systems need a managed, long-term modernisation, facilitated in part by a managed redistribution of existing IT spend. However, to ensure long-term security overall, cybersecurity needs to be at the very heart of modernisation efforts.



Copyright © 2021 Futures Parity.