
WHY RANSOMWARE READINESS IN THE FINANCE SECTOR IS CRITICAL

Source: Finance Derivative

By Piers Wilson, Head of Product Management at Huntsman Security

Ransomware attacks have been making headlines recently. From AXA to CNA Financial, no part of the finance sector is impervious to the risks. For many organisations, initial worries focus on the logistics and cost of a ransom; increasingly, however, the wider damage and costs relate to rectification, revenue loss and reputational damage. Attacks such as the Kaseya case have also shown the growing risks that “trusted” service providers and third-party supply chain participants can bring – multiplier effects that can quickly impact one million endpoints, with a ransom set at US$70m.

The network effect in the financial services sector benefits all stakeholders – from institutions to consumers. The increase in shared data and services, however, compounds the risks of successful cyber attacks. And, as we have seen with the impact of ransomware on pipelines and even food processors, the impact on organisations, and individuals, of being locked out of systems is huge. If customers cannot access funds or transact with service providers across the supply chain, anxiety and costs can escalate and commercial reputations can quickly be trashed.

An easy way out?

Businesses might once have seen the payment of a ransom as a potential ‘quick fix’ for a ransomware attack. This option, however, is likely to become a thing of the past as bans on ransom payments are being contemplated in France and, in the US, by the SEC and OFAC. In Australia, there are calls for mandatory notification of ransom payments by ransomware victims.

Finance sector organisations also need to consider that even when ransoms are paid, the decryption process and the return to business as usual can be so slow that reinstating operations from their own internal backups and security safeguards can often be achieved in the same time. As the scale of attacks and the disruption caused by supply chain ransomware attacks escalate, the message is increasingly that time is of the essence. If you can’t trust the decryption key from an attacker, you are best advised to invest your time and effort in reconstructing, reconfiguring and securing your IT systems and services from the ground up, so you can be confident in their integrity.

Despite the possibility that the payment of ransoms will become unlawful, cyber insurance will remain an effective tool for funding the process of getting back up and running quickly and reducing disruption. Insurers now demand that, before issuing a cyber policy, organisations show evidence of adequate cyber security controls. In fact, growing ransomware threats make it likely that insurance premiums will rise even further, so getting verifiable cyber risk management capabilities in place is likely to move even further up the list of board priorities.

A challenging environment

The financial sector also faces some other more particular challenges. Many financial institutions hold vast amounts of personal data, whether on accounts, transactions, users or reports. Complicating this is open banking legislation, like PSD2 in the UK/EU and CDR in Australia, which requires that the process of customer-approved sharing of personal data be easy and accessible. These consumer rights to have personal information held by, and transmitted between, financial sector participants will necessarily redistribute responsibilities for cyber security in the sector and, as a result, increase cyber security risk during this period of adjustment to a changing environment.

The financial services sector is already – and indeed, always has been – an attractive target for criminals at all levels. The requirement that customers have greater control over access to their data adds the requirement for a whole new level of ransomware readiness. Organisations could face anything from disgruntled employees, to fraud, to criminal ransomware attacks seeking to enable the wholesale theft of personal data. The stakes couldn’t be higher; so what can the sector do to protect itself?

Preparing for ransomware attacks

Putting in place anti-virus software and network defences – alongside the rise of endpoint detection and response – can certainly help manage attacks. But these solutions rely on detecting malicious activity in the first place. What if your endpoint or network solution misses the attack, without warning? Do you have visibility into what’s happening? Are there other controls in place that can mitigate the threat? Are they monitored and managed as part of an IT risk management program?

More attention must be given to preventing, or at least limiting, successful ransomware attacks before they do serious damage. Getting the basic cyber security controls in place and working to protect recognised threat vectors really pays dividends, as these are precisely the weaknesses that ransomware attackers are likely to exploit.

There are three areas to focus on. The first two are the prevention of any initial infection and containment or limitation of the spread if one does occur. These strategies need to be coupled to a third, recovery, which ensures systems and data can be restored and an incident can be successfully managed. The core principles of effective risk management apply – identify and triage the risks and manage them accordingly.

There are some key safeguards organisations can adopt to support each of these elements:

Prevention

  • Application control – ensuring only approved software can run on a computer system, securing systems by limiting what they can execute.
  • Application patching – applications must be regularly updated to prevent intruders using known vulnerabilities in software.
  • Macro security – checking that macro and document settings are correctly configured to prevent the activation of malicious code.
  • Harden user applications and browsers – use effective security policies to limit user access to active content and web code.
  • Firewalls/perimeter – and even physical on-site security – limit user access outbound and remote connections inbound.
  • Staff awareness – while not a technical control, building a “cyber culture” and a better understanding by staff of cyber security, the threats and mitigation strategies that can minimise cyber attacks, is vital.
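To make the first of these controls concrete, application control is commonly implemented as hash-based allowlisting: an executable runs only if its cryptographic digest appears on an approved list. The sketch below is a minimal Python illustration of that idea only – real deployments enforce allowlists at the operating system level via a policy engine, not a script, and the digest shown is a placeholder, not a real value.

```python
import hashlib
from pathlib import Path

# Hypothetical allowlist: SHA-256 digests of the only binaries approved to run.
# In practice this would be generated from a known-good build and signed.
APPROVED_SHA256 = {
    "placeholder-digest-of-an-approved-binary",
}

def is_approved(executable: Path) -> bool:
    """Permit execution only if the file's SHA-256 digest is on the allowlist."""
    digest = hashlib.sha256(executable.read_bytes()).hexdigest()
    return digest in APPROVED_SHA256
```

Because the check hashes the file contents rather than trusting its name or location, a tampered binary fails the check even if it replaces an approved one in place.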

Containment

  • Restrict administrative privileges – limit admin privileges to staff who genuinely need them, and then solely for specified purposes and under controlled access.
  • Operating system patching – fully patched operating systems will significantly reduce the likelihood of malware or ransomware spreading across the network from system to system.
  • Multi-factor authentication – used to manage user access to highly sensitive accounts and systems (including remote users).
  • Endpoint protection – install anti-virus software and keep it updated.

Recovery

  • Regular backups – secure data and system backups off-site and test your recovery processes.
  • Incident response – in planning for a worst case scenario make sure everyone is well versed in the incident management playbook.
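The “test your recovery processes” point lends itself to automation. The Python sketch below checks a backup set against a checksum manifest for missing, stale or corrupted files – the file layout, manifest format and one-day freshness threshold are illustrative assumptions, not a prescribed design.

```python
import hashlib
import json
import time
from pathlib import Path

MAX_BACKUP_AGE_SECONDS = 24 * 3600  # assumed recovery point objective: one day

def verify_backups(backup_dir: Path, manifest_path: Path) -> list[str]:
    """Return a list of problems found; an empty list means the backup set passed."""
    problems = []
    manifest = json.loads(manifest_path.read_text())  # maps filename -> sha256
    for name, expected_sha in manifest.items():
        backup = backup_dir / name
        if not backup.exists():
            problems.append(f"missing: {name}")
            continue
        if time.time() - backup.stat().st_mtime > MAX_BACKUP_AGE_SECONDS:
            problems.append(f"stale: {name}")
        if hashlib.sha256(backup.read_bytes()).hexdigest() != expected_sha:
            problems.append(f"corrupt: {name}")
    return problems
```

A check like this only proves the backups are intact and current; periodically restoring them into a clean environment is still the real test of the recovery playbook.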

Gaining assurance in controls

Businesses must make sure they are monitoring their security controls to ensure that they are working effectively. If one control is ineffective, the IT teams need to know quickly to mitigate any shortcomings and reinstate an adequate cyber posture. A “cyber security culture” that ensures these risks are a board level issue will improve overall corporate ransomware preparedness.

The board should receive reports that provide clear visibility of these controls, and use these indicators as KPIs in their cyber security risk management process. They can also feed a continuous cyber security improvement program. Being able to monitor readiness and assess the risk of attack provides early warning and confirmation that cyber security risk management processes are in hand.
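As an illustration of how control monitoring might roll up into board-level reporting, the sketch below converts pass/fail check counts into a traffic-light (RAG) status per control – the control names, counts and thresholds are all hypothetical.

```python
# Hypothetical latest results from automated control checks.
CONTROL_CHECKS = {
    "application_patching": {"passed": 188, "failed": 12},
    "os_patching":          {"passed": 195, "failed": 5},
    "mfa_coverage":         {"passed": 240, "failed": 0},
    "backup_restore_tests": {"passed": 3,   "failed": 1},
}

def rag_status(passed: int, failed: int) -> str:
    """Traffic-light rating: green at >= 99% passing, amber at >= 95%, else red."""
    rate = passed / (passed + failed)
    if rate >= 0.99:
        return "green"
    if rate >= 0.95:
        return "amber"
    return "red"

def board_report(checks: dict) -> dict:
    """Summarise each control as a single RAG status for board reporting."""
    return {name: rag_status(**result) for name, result in checks.items()}
```

With these sample numbers, `board_report(CONTROL_CHECKS)` would flag application patching and backup restore tests as red, pointing the IT team at the controls that need attention first.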

Summary

The financial services sector faces many challenges when it comes to putting in place comprehensive cyber security risk management practices. If a bank or insurer was affected by a significant ransomware attack, the wider implications on the economy could be significant. Recent fuel shortages resulting from the Colonial Pipeline incident gave us a glimpse of the resulting widespread public panic and concern. It was reminiscent of the run on Northern Rock bank branches in the UK in 2007, at the start of the financial crisis. It doesn’t take much to imagine the level of public panic that would ensue if a massive ransomware attack locked consumers out from accessing their funds.

Organisations in the sector must have comprehensive cyber defences and controls, backed up by regular monitoring to make sure they are working effectively, and ensure that if one control fails to identify or prevent an attack, other complementary controls are operational and able to limit its impact.

That way the risk of a successful attack can be minimised, and organisations can maintain effective IT governance to better prevent costly disruption to their systems, operations and reputations.


Hype, Hysteria & Hope: AI’s Evolutionary Journey and What it Means for Financial Services

Source: Finance Derivative

Written by Gabriel Hopkins, Chief Product Officer at Ripjar

Almost a year to the day since ChatGPT launched, the hype, hysteria, and hope around the technology show little sign of abating. In recent weeks OpenAI chief Sam Altman was removed from his position, only to return some days later. Rishi Sunak hosted world leaders at the UK’s AI Safety Summit, interviewing the likes of Elon Musk in front of an assembly of world leaders and tech entrepreneurs. Meanwhile, behind the scenes, AI researchers are rumoured to be close to even more breakthroughs within weeks.

What does it all mean for those industries that want to benefit from AI but are unsure of the risks?

It’s possible that some forms of machine learning – what we used to call AI – have been around for a century. Since the early 1990s, those tools have been a key operational element of some banking, government, and corporate processes, while being notably absent from others.

So why the uneven adoption? Generally, that has been related to risk. For instance, AI tools are great for tasks like fraud detection. It’s well established that an algorithm can do things that analysts simply can’t, reviewing vast swathes of data in milliseconds. And that has become the norm, particularly because it is not essential to understand each and every decision in detail.

Other processes have been more resistant to change. Usually, that’s not because an algorithm couldn’t do better, but rather because – in areas such as credit scoring or money laundering detection – the potential for unexpected biases to creep in is unacceptable. That is particularly acute in credit scoring when a loan or mortgage can be declined due to non-financial characteristics.

While the adoption of older AI techniques has been progressing year after year, the arrival of Generative AI, characterised by ChatGPT, has changed everything. The potential of the new models – both good and bad – is huge, and commentary has divided accordingly. What is clear is that no organisation wants to miss out on the upside. With all the talk about Generative and Frontier models, 2023 has been brimming with excitement about the revolution ahead.



Two Objectives

A primary use case for AI in the financial crime space is to detect and prevent fraudulent and criminal activity. Efforts are generally concentrated around two similar but different objectives. These are thwarting fraudulent activity – stopping you or your relative from getting defrauded – and adhering to existing regulatory guidelines to support anti-money laundering (AML), and combatting the financing of terrorism (CFT).

Historically, AI deployment in the AML and CFT areas has faced concerns about potentially overlooking critical instances compared to traditional rule-based methods. Within the past decade, regulators initiated a shift by encouraging innovation to help with AML and CFT cases. Despite the use of machine learning models in fraud prevention over the past decades, adoption in AML/CFT has been much slower, with a prevalence of headlines and predictions over actual action. The advent of Generative AI looks likely to change that equation dramatically.

One bright spot for AI in compliance over the last five years has been in customer and counterparty screening, particularly when it comes to the vast quantities of data involved in high-quality Adverse Media (aka Negative News) screening, where organisations look for early signs of risk in the news media to protect themselves from potential issues.

The nature of high-volume screening against billions of unstructured documents has meant that the advantages of machine learning and artificial intelligence far outweigh the risks and enable organisations to undertake checks which would simply not be possible otherwise.
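As a toy illustration of the idea only – production adverse media platforms rely on trained language models, entity resolution and multilingual sources rather than keyword lists – a weighted term match over unstructured news text could look like this; the risk lexicon and weights below are invented.

```python
import re

# Invented risk lexicon with per-term weights; purely illustrative.
RISK_TERMS = {"fraud": 3, "laundering": 3, "sanctions": 2, "investigation": 1}

def screen(entity: str, documents: list[str]) -> int:
    """Sum weighted risk-term hits across documents that mention the entity."""
    score = 0
    for doc in documents:
        text = doc.lower()
        if entity.lower() not in text:
            continue  # only score articles that actually mention the entity
        for term, weight in RISK_TERMS.items():
            score += weight * len(re.findall(rf"\b{term}\b", text))
    return score
```

Even this crude version shows why automation wins at this volume: scoring billions of documents by hand is impossible, while the same loop scales to any corpus.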

Now banks and other organisations want to go a stage further. As Generative AI models start to approach AGI (Artificial General Intelligence), where they can routinely outperform human analysts, the question is when, not if, they can use the technology to better support decisions and potentially even make decisions unilaterally.


AI Safety in Compliance

The 2023 AI Safety Summit was a significant milestone in acknowledging the importance of AI. The Summit resulted in 28 countries signing a declaration to continue meetings to address AI risks. The event also led to the inauguration of the AI Safety Institute, which will contribute to future research and collaboration on AI safety.

Though there are advantages to having an international focus on the AI conversation, GPT-style transformer models were the primary focus of the Summit. This poses a risk of oversimplifying or confusing the broader AI spectrum for those unfamiliar with the field. There is a broad range of AI technologies with hugely varying characteristics, and regulators and others need to understand that complexity. Banks, government agencies, and global companies must take a thoughtful approach to AI utilisation, emphasising its safe, careful, and explainable use both inside and outside of compliance frameworks.


The Road Ahead

The compliance landscape demands a review of standards for responsible AI use. It is essential to establish best practices and clear objectives to help steer organisations away from hastily assembled AI solutions that compromise accuracy. Accuracy, reliability, and innovation are equally important to mitigate fabrication or potential misinformation.

Within the banking sector, AI is being used to support compliance analysts already struggling with time constraints and growing regulatory responsibilities. AI can significantly aid teams by automating mundane tasks, augmenting decision-making processes, and enhancing fraud detection.

The UK can benefit from the latest opportunity. We should cultivate an innovation ecosystem which is receptive to AI innovation across fintech, regtech, and beyond. Clarity from government and thought leaders on AI, tailored to practical implementations in the industry, is key. We must also be open to welcoming new graduates from the growing global AI talent pool to fortify the country’s position in pioneering AI-driven solutions and integrating them seamlessly. Amid industry change, prioritising and backing responsible AI deployment is crucial to the ongoing battle against all aspects of financial crime.


Using AI to support positive outcomes in alternative provision

By Fleur Sexton

Fleur Sexton, Deputy Lieutenant West Midlands and CEO of PET-Xi, a dynamic training provider with a reputation for success with the hardest to reach, discusses using AI to support excluded pupils in alternative provision (AP).

Exclusion from school is often life-changing for the majority of vulnerable and disadvantaged young people who enter alternative provision (AP). Many face a bleak future, with just 4% of excluded pupils achieving a pass in English and maths GCSEs, and 50% becoming ‘not in education, employment or training’ (NEET) post-16.

Often labelled ‘the pipeline to prison’, statistics gathered from prison inmates are undeniably convincing: 42% of prisoners were expelled or permanently excluded from school; 59% truanted; 47% of those entering prison have no school qualifications. With a prison service already in crisis, providing children with the ‘right support, right place, right time’, is not just an ethical response, it makes sound financial sense. Let’s invest in education, rather than incarceration.

‘Persistent disruptive behaviour’ – the most commonly cited reason for temporary or permanent exclusion from mainstream education – often results from unmet or undiagnosed special educational needs (SEN) or social, emotional and mental health (SEMH) needs. These pupils find themselves unable to cope in a mainstream environment, which impacts their mental health and personal wellbeing, and their ability to engage positively with the curriculum and the challenges of school routine – a multitude of factors that add to their feelings of frustration and failure.

Between 2021/22 and 2022/23, councils across the country recorded a 61% rise in school exclusions, with overall exclusion figures rising by 50% compared to 2018/19. The latest statistics from the Department for Education (DfE) show pupils with autism in England are nearly three times as likely to be suspended as their neurotypical peers. With 82% of young people in state-funded alternative provision (AP) having identified special educational needs (SEN) or social, emotional and mental health (SEMH) needs, for many it is their last chance of gaining the education that is every child’s right.

The DfE’s SEND and AP Improvement Plan (March 2023) reported: ‘82% of children and young people in state-place funded alternative provision have identified special educational needs (SEN), and it (AP) is increasingly being used to supplement local SEND systems…’

Some pupils on waiting lists for AP placements have access to online lessons or tutors, others are simply at home, and not receiving an education. In oversubscribed AP settings, class sizes have had to be increased to accommodate demand, raising the pupil:teacher ratio, and decreasing the levels of support individuals receive. Other unregulated settings provide questionable educational advantage to attendees.

AI can help redress the balance and help provide effective AP. The first challenge for teachers in AP is to engage these young people back into learning. If the content of the curriculum used holds no relevance for a child already struggling to learn, the task becomes even more difficult. As adults we rarely engage with subjects that do not hold our interest – but often expect children to do so.

Using context that pupils recognise and relate to – making learning integral to the real world and, more specifically, to their reality – provides a way in. A persuasive essay about school uniforms may fire the debate for a successful learner, but it is probably not going to be a hot topic for a child struggling with a chaotic or dysfunctional home life. If that child is dealing with high levels of adversity – being a carer for a relative, keeping the household going, dealing with pressure to join local gangs, being coerced into couriering drugs and weapons around the neighbourhood – school uniform does not hold sway. It has little connection to their life.

Asking the group about the subjects they feel strongly about, or responding to local news stories from their neighbourhoods, and using these to create tasks, will provide a more enticing hook to pique their interest. After all, in many situations, the subject of a task is just the ‘hanger’ for the skills they need to learn – in this case, the elements of creating a persuasive piece, communicating perspectives and points of view.

Using AI, teachers have the capacity to provide this individualised content and personalised instruction and feedback, supporting learners by addressing their needs and ‘scaffolding’ their learning through adaptive teaching.

If the learner is having difficulty grasping a concept – especially an abstract one – AI can quickly produce several relevant analogies to help illustrate and explain. It can also be used to develop interactive learning modules, so the learner has more control and ownership over their learning. When engaged with their learning, pupils begin to build skills, increasing their confidence and commitment.

Identifying and discussing these skills and attitudes towards learning, with the pupil reflecting on how they learn and the ways they learn best, also gives them more agency and autonomy, encouraging them to think metacognitively.

Gaps in learning are often the cause of confusion, misunderstandings and misconceptions. If a child has been absent from school they may miss crucial concepts that form the building blocks to more complex ideas later in their school career. Without providing the foundations by filling in these gaps and unravelling the misconceptions, new learning may literally be impossible for them to understand, increasing frustration and feelings of failure. AI can help identify those gaps, scaffold learning and build understanding.

AI is by no means a replacement for teachers or teaching assistants, it is purely additional support. Coupled with approaches that promote engagement with learning, AI can enable these disadvantaged young people to access an education previously denied them.

According to the DfE, ‘All children are entitled to receive a world-class education that allows them to reach their potential and live a fulfilled life, regardless of their background.’ AI can help support the most disadvantaged young people towards gaining the education they deserve, and creating a pathway towards educational and social equity.


How collaborative robots can support productivity for 2024 

By Stacey Moser, Chief Commercial Officer, Universal Robots 

The past year has been no easy feat for businesses, with an unpredictable global economy hitting manufacturers particularly hard. As we move into 2024, there are no signs of this relenting, with the UK sector recording its worst performance since the 1980s. Simultaneously, the manufacturing industry is battling 74,000 unfilled vacancies, creating a £6.5 billion economic shortfall which is hindering the ability of companies to fight back.

Increasing productivity will be a key focus in the pursuit of profitability in 2024. Accelerating adoption of automation within the manufacturing process offers a solution, with one study suggesting that automation has the potential to contribute US$15 trillion to the global economy by 2030. However, many businesses currently do not have the funds or resources to adopt large, expensive industrial robots.

Enter collaborative robots (cobots), lightweight robotic arms that can automate repetitive tasks usually requiring the skills and manpower of human workers. Compatible with traditional industrial robots commonly employed by manufacturers, cobots work alongside humans to offer a wide range of benefits, all underpinned by the ability to improve productivity, as well as assure staff safety, and improve worker wellbeing.  

Reimagining production lines for 2024 

Due to the nature of the industry, manufacturing workers often face so-called dull, dirty, dangerous jobs. For example, palletisation – the process of stacking, loading and securing goods onto pallets – has traditionally been a manual operation that requires staff to perform strenuous tasks repetitively. Long term, this can cause issues for employees such as musculoskeletal damage, as well as mental fatigue. As for factory output, goods of inconsistent quality subsequently become more commonplace.

Working alongside humans, cobots can take on these undesirable tasks, preventing staff injuries while improving manufacturing quality as human error is reduced. With minimal human effort necessary to operate cobots, these autonomous colleagues can help mitigate the labour shortage issue that is currently leaving many manufacturers vulnerable. Cobots enable workers to be more productive as they focus on more valuable tasks that require more cognition, dexterity and reason, in turn unlocking further business value.  

As employees take on more rewarding work, job satisfaction naturally improves. In a sector that faces labour gaps, this can go far in improving employee retention. Prospective workers who may have previously been drawn away from the manufacturing industry by the necessity of performing risky and monotonous tasks, may now see a different future ahead, where the prospect of using modern collaborative technologies excites and enthuses. Reducing turnover of employees and creating a stable pipeline of future talent also ensures manufacturers can not only maintain, but increase productivity, as another year of global economic volatility approaches.  

Reducing gaps between competitors 

With the current cost of living crisis projected to persist through 2024, consumer behaviour is becoming increasingly unpredictable. Cobots provide the flexibility to adapt to whatever circumstances manufacturers find themselves in, whether production must be scaled up or down, or an expansion into new markets is required. Those that cannot swiftly adapt risk being left behind. Automation therefore becomes a necessity for manufacturers if they want to keep up with the competition.

Not only can cobots provide flexibility in scaling production; from palletisation and machine tending to quality inspection, they can perform a wide range of tasks without a need for rest. If necessary, cobots can be programmed and redeployed to perform multiple tasks and can switch between these with ease. By eliminating potential pain points between different functions, cobots can slash factory downtime, improving productivity across the board.

With less human labour required to man machinery, the possibility of longer machine tending shifts is unlocked. This could mean the development of 24/7 factories, with downtime kept to a bare minimum. Manufacturers could see goods produced and out of the door faster than ever before, a colossal productivity boost to consider as 2024 approaches.  

Challenging current misconceptions 

Despite cobots being widely available to manufacturers of all sizes, these modern automation solutions have yet to be considered a possibility by many. Research shows that the biggest hurdles to implementing automation are capital cost and lack of internal knowledge and experience with the technology. However, on the cost issue, it’s been proven that cobots can achieve return on investment (ROI) in as little as 12 months.  

A common myth is that cobots are difficult to implement or use. However, the technology is designed specifically with an approachable user interface, meaning employees do not require specialist expertise to operate or interact. Meanwhile, software designed for use alongside cobots, such as Universal Robots’ PolyScope, means that programming cobots to complete tasks becomes intuitive. Modern cobots – such as the UR20 – are lightweight and have a small footprint, eliminating common issues around a lack of factory floor space too.  

Finally, the advent of AI and automation tools such as ChatGPT has fuelled fears that robots may soon take over human jobs. Manufacturers who fear backlash may be hesitant to begin deploying and reaping the benefits of collaborative automation. It is important to remember that cobots are designed to work alongside humans, not replace them. Using cobots on a factory floor, for example, allows employees to work with robots, not like them. As manufacturers look to improve business resilience in 2024, welcoming cobots into the workforce can also facilitate massive productivity gains.

Forecasting for the year ahead 

As manufacturers carefully consider budgets for 2024, automation and cobot technologies should be at the very top of the list. Deployment of these technologies is now essential for increasing productivity, quality and efficiency. Organisations that hesitate to invest today risk losing hard-gained momentum to more innovative competitors.  

The future of manufacturing is here, and it belongs to those who will innovate and automate.


Copyright © 2021 Futures Parity.