Camilla Winlo, Head of Data Privacy at Gemserv
The past year has been a turbulent one for cybersecurity, with a number of high-profile breaches hitting the headlines. The pandemic has of course played a central role in conversations around data privacy, while entire industries have been accused of data handling malpractice. So, what are the stories we can expect to see shaping the agenda in 2022?
Here are four developments we expect to see in the year ahead:
- Polarisation around Covid vaccination data will increase
While there are early indications that the combination of vaccination, immunity from previous infection and the evolution of the virus may result in less severe symptoms for most people, it still appears that Covid is capable of making unvaccinated and immunocompromised individuals very ill. In winter, this puts huge pressure on the NHS and the public purse, and we expect this to translate over time into increased pressure to encourage the unvaccinated to get vaccinated, and for society to impose different rules on the unvaccinated. We started to see this in 2021 and expect it to continue into 2022.
With different quarantine rules for vaccinated and unvaccinated employees and the possibility of compulsory vaccination on the horizon, more and more organisations are going to find themselves processing Covid vaccination data. Quarantine measures are set to continue to help stop the spread of the virus, which in turn, will mean that organisations will still need to incorporate a hybrid approach to work. Some employees will test positive for Covid and therefore will not be allowed to leave their homes, but they won’t have symptoms that would otherwise stop them from working. Others may have come into close contact with a positive case and will also need to isolate. All of this will have an impact on employers.
Going into 2022, we expect tensions between pro- and anti-vaxxers to rise. The growing volume of real-world vaccine safety data – which many vaccine-hesitant people say they are waiting for – is unlikely to ease these tensions much, partly because of how polarised the information landscape has become, and partly because some hesitancy is rooted in genuine, well-founded concerns, for example where individuals have health conditions that make taking the vaccine a more personally risky choice. That makes Covid status an employee safeguarding issue due to the risk of discrimination between employees. There will also be companies that want – or are compelled – to terminate unvaccinated employees, as well as some that will do the reverse. We can expect to see these decisions appealed as the ‘grey area’ around what counts as a medical exemption is clarified.
- Tighter regulations around ad tech
The European Data Protection Board (EDPB) published its 2021-2023 strategy in December, and part of that strategy includes more proactive monitoring of ad tech. Ad tech is under huge pressure to tighten up its data protection practices after the Irish Council for Civil Liberties sued a branch of the Interactive Advertising Bureau (IAB) and others over what it described as “the world’s largest data breach” in 2021.
The IAB found itself under fire for its role in facilitating a process known as real-time bidding, where personal data is passed between hundreds of ad brokers and related firms during an auction process, run on behalf of paying brands, in the moments before a website loads. In the milliseconds between a user clicking on a page and it loading, everything from the type of device an individual is using to limited location data and browsing history can be shared with brokers to better target that person.
The breach spurred numerous complaints – including from noyb (none of your business), the European centre for digital rights – and various court cases arguing that the ad tech industry is fundamentally unlawful because of the way it is structured. Better regulation around ad tech needs to be put in place not just for the advertisers themselves, but for online retailers, too. It’s going to be incredibly important for the economy as a whole that the ad tech industry gets this right, but there is a lot of work to do to get there.
- Regulatory action around Artificial Intelligence (AI) will ramp up
While the European Data Protection Board (EDPB) strategy highlights the need for more proactive monitoring of AI use, the UK National Data Strategy focuses on making sure AI works and that the UK is a leader in the field. Data privacy must be a priority, or the result will be poor-quality solutions that don’t work as intended. AI-specific regulations are set to be enforced, and I think we’ll see some interesting actions.
After facial recognition company Clearview AI was issued a Notice of Intent to fine following a number of breaches of national data protection law, conversations around the practices of ethical data collection and analysis have come to the forefront of public attention. It’s essential that organisations that want to harness the possibilities of AI and data-driven innovation in the UK do so in a way that protects individuals.
Organisations should be entitled to trust that providers like Clearview AI are engaging in ethical practices and that their services can be used lawfully. It’s very reassuring to see the regulator taking strong action to make AI innovators trustworthy. Whether it’s fighting crime, preventing fraud or other forms of safeguarding through data, when the public and private sector combine, they must ensure the right processes are put in place in order to comply with data protection regulation.
- Privacy Shield 2: A new basis for sharing data between the EU and the US
The Privacy Shield framework was the second attempt by the EU and the US to create a secure mechanism for data sharing. It was thrown out in court after judges deemed the framework insufficient to provide adequate safeguards for the transfer of personal data from the EU to the US, and the two sides have been working ever since to replace it.
The exchange is a one-way deal – there are cultural differences between the EU and the US that mean that personal data in the EU is protected in different ways to personal data in the US. The purpose of Privacy Shield is to provide a way to allow EU data to be processed by US companies without losing those protections. I expect we will see some major announcements coming next year, which will include technological changes by Big Tech household names, and that in turn will lead to work for UK and EU businesses.
Regulations are indispensable to the proper functioning of economies and societies, and to protect those structures, we need to implement the right data protection measures. Having the right data protection regulations in place is critical to ensuring the proper functioning of organisations, and for ensuring that both customer and employee data is handled correctly. As businesses and governments continue to generate more data than ever, we need to take regulatory action to create secure, ethical data storage and sharing practices.
Building a Greener Web: Six Ways to Put Your Website on an Emissions Diet
By Roberta Haseleu, Practice Lead Green Technology at Reply, Fiorenza Oppici, Live Reply, and Lars Trebing, Vanilla Reply
Most people are unaware of, or underestimate, the impact of the IT sector on the environment. According to the BBC: “If we were to rather crudely divide the 1.7 billion tonnes of greenhouse gas emissions estimated to be produced in the manufacture and running of digital technologies between all internet users around the world, it would mean each of us is responsible for 414kg of carbon dioxide a year.” That’s equivalent to 4.7bn people charging their smartphone 50,000 times.
Every web page produces a carbon footprint that varies depending on its design and development. This deserves closer attention, as building an energy-efficient website also increases loading speeds, which in turn leads to better performance and user experience.
Below are six practical steps developers can take to reduce the environmental impact of their websites.
- Implement modularisation
With traditional websites that don’t rely on single-page apps, each page and view of the site is saved in an individual HTML file. The code only runs, and the data is only downloaded, for the page that the user is visiting, avoiding unnecessary requests. This reduces the volume of transmitted data and saves energy.
However, this principle is no longer the standard in modern web design, which is dominated by single-page apps that dynamically display all content to the user at runtime. This approach is easier and faster to code and more user-friendly but, without any precautions, it creates unnecessary overheads. In the worst case, accessing the homepage of a website may trigger the transmission of the entire code of the application, including parts that may never be needed.
Modularisation can help. By dividing the code of a website into different modules, i.e. coherent code sections, only the relevant code is referenced. Using modules offers distinct benefits: they keep the scope of the app clean and prevent ‘scope creeps’; they are loaded automatically after the page has been parsed but before the Document Object Model (DOM) is rendered; and, most importantly for green design, they facilitate ‘lazy loading’.
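As a minimal TypeScript sketch of this idea (the module and function names here are hypothetical, not from the article), a feature can live behind a loader that is only invoked on first use. In a real app, `loadChart` would be `() => import('./chart')`, which bundlers split into a separate chunk:

```typescript
// Sketch of code-splitting via modules (all names are hypothetical).
type ChartModule = { render: (data: number[]) => string };

// Stand-in for a dynamic import of a hypothetical './chart' module;
// in production this would be: const loadChart = () => import('./chart');
const loadChart = async (): Promise<ChartModule> => ({
  render: (data) => `chart with ${data.length} points`,
});

let cached: ChartModule | undefined;

// Only the first call pays the loading cost; later calls reuse the module.
async function renderChart(data: number[]): Promise<string> {
  cached ??= await loadChart();
  return cached.render(data);
}

renderChart([1, 2, 3]).then(console.log); // chart with 3 points
```

Because the chart code never reaches the browser until `renderChart` is first called, visitors who never open that view never download it.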
- Adopt lazy loading
The term lazy loading describes a strategy of only loading resources at the moment they are needed. This way, a large image at the bottom of the page will not be loaded unless the user scrolls down to that section.
If a website only consists of a routing module and an app module which contain all views, the site will become very heavy and slow at first load. Smart modularisation, breaking down the site into smaller parts, in combination with lazy loading can help to load only the relevant content when the user is viewing that part of the page.
However, this should not be taken to extremes either: in some instances, loading each resource only at the last moment while scrolling can cancel out the performance gains and result in higher server and network loads. It’s important to find the right balance based on a good understanding of how the app will be used in real life (e.g. whether users will generally move on to the next page after a quick first glance, or scroll all the way down before moving on).
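In the browser this is typically handled by `IntersectionObserver` or the native `loading="lazy"` image attribute; the pure function below (our own illustrative names, not a real browser API) just captures the underlying decision, including a pre-load margin that embodies the balance discussed above:

```typescript
// Illustrative lazy-loading decision: fetch an image only when it sits
// within `margin` pixels of the visible viewport.
interface LazyImage {
  src: string;
  offsetTop: number; // distance from the top of the page, in px
  height: number;    // rendered height, in px
}

function imagesToLoad(
  images: LazyImage[],
  scrollY: number,
  viewportHeight: number,
  margin = 200, // pre-load slightly ahead to trade latency against waste
): string[] {
  const top = scrollY - margin;
  const bottom = scrollY + viewportHeight + margin;
  return images
    .filter((img) => img.offsetTop + img.height >= top && img.offsetTop <= bottom)
    .map((img) => img.src);
}

const imgs: LazyImage[] = [
  { src: "hero.jpg", offsetTop: 0, height: 300 },
  { src: "footer.jpg", offsetTop: 5000, height: 300 },
];
console.log(imagesToLoad(imgs, 0, 800)); // [ 'hero.jpg' ]
```

A larger `margin` hides loading latency from fast scrollers; a smaller one avoids fetching content the user never reaches.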
- Monitor build size
Pre-processors and bundlers can be configured to fail a build if its output files exceed a configurable threshold. Limits can be set both for the main boot script and for individual CSS chunks, so that neither exceeds a specific byte size after compilation. Any build surpassing those thresholds fails with a warning.
If a build is suspiciously big, a web designer can inspect it and identify which module contributes the most, as well as all its interdependencies. This information allows the programmer to optimise the parts of the websites in question.
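For example, webpack (assuming webpack 5) exposes byte budgets of exactly this kind; the numbers below are illustrative, not recommendations:

```typescript
// webpack.config.ts sketch (assumes webpack 5; budgets are illustrative).
// With hints set to 'error', an oversized build fails outright rather
// than merely emitting a warning.
const config = {
  performance: {
    hints: "error" as const,    // fail the build instead of just warning
    maxEntrypointSize: 250_000, // budget for the main boot script, in bytes
    maxAssetSize: 100_000,      // budget per emitted asset (e.g. a CSS chunk)
  },
};

export default config;
```

Wiring such budgets into continuous integration turns size regressions into failing builds, so bloat is caught at the pull-request stage rather than in production.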
- Eliminate unused code
One potential reason for excessive build sizes can be dozens of configuration files and code meant for scenarios that are never needed. Despite never being executed, this code still takes up bandwidth, thereby consuming extra energy.
Unused parts can be found in your own source code, but also (and often to a greater extent) in external libraries used as dependencies. Luckily, a technique called ‘tree shaking’ can be used to analyse the code and mark which parts are not referenced by other portions of the code.
Modern pre-processors perform ‘tree shaking’ not only to identify unused code but also to exclude it automatically from the build. This allows them to package only those parts of the code that are needed at runtime – but only if the code is modularised.
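A sketch of the module style that makes tree shaking possible (the module and function names are our own): independent named exports and no top-level side effects, so a bundler that sees only one export imported can drop the rest:

```typescript
// math-utils sketch: a tree-shakeable module. Each export is independent
// and the module has no top-level side effects, so a bundler whose entry
// point imports only `add` can eliminate `mean` from the build entirely.
export const add = (a: number, b: number): number => a + b;

export const mean = (xs: number[]): number =>
  xs.length === 0 ? NaN : xs.reduce(add, 0) / xs.length;

// Consumers should prefer named imports, e.g.
//   import { add } from './math-utils';
// over namespace imports, which can defeat tree shaking in some bundlers.
```

Marking packages as side-effect-free (for example via the `sideEffects` field in `package.json`) gives bundlers the final guarantee they need to prune aggressively.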
- Choose external libraries wisely
One common approach to speed up the development process is by using external libraries. They provide ready-to-use utilities written and tested by other people. However, some of these libraries can be unexpectedly heavy and weigh your code down.
One popular example is Moment.js, a very versatile legacy library for handling international date formats and time zones. Unfortunately, it is also quite large. Above all, it is neither well suited to the typical TypeScript toolchain nor modular, so even the best pre-processors cannot reduce the weight it adds to the code by means of ‘tree shaking’.
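For many common cases, a lighter alternative is the built-in `Intl` API, which modern browsers and runtimes ship natively, so it adds nothing to the bundle at all. A sketch:

```typescript
// Locale-aware date formatting with the built-in Intl API: zero library
// payload. The time zone is pinned to UTC here so the output does not
// depend on where the code happens to run.
function formatDate(date: Date, locale: string): string {
  return new Intl.DateTimeFormat(locale, {
    dateStyle: "long",
    timeZone: "UTC",
  }).format(date);
}

// Exact wording depends on the runtime's locale data, e.g. "15 January 2023".
console.log(formatDate(new Date(Date.UTC(2023, 0, 15)), "en-GB"));
```

Where `Intl` does not cover a requirement, modular libraries that support named imports let tree shaking keep only the functions actually used.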
- Optimise content
Designs can also be optimised by avoiding excessive use of images and video material. Massive use of animation gimmicks such as parallax scrolling also has a negative effect. Depending on the implementation, such animations can massively increase the CPU and GPU load on the client. To test this, consider running the website on a 5 to 10-year-old computer. If scrolling is not smooth and/or the fans jump to maximum speed, this is a very good indication of optimisation potential.
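For images specifically, the idea behind responsive images (`srcset`/`sizes`) can be sketched as a pure function (the names are our own): serve the smallest available variant that still covers the rendered width, so no more pixels travel over the wire than necessary:

```typescript
// Sketch of the srcset selection idea: pick the smallest available image
// width that still covers the displayed width at the device pixel ratio.
function pickVariant(
  availableWidths: number[],
  displayWidth: number,
  devicePixelRatio = 1,
): number {
  const needed = displayWidth * devicePixelRatio;
  const sufficient = availableWidths.filter((w) => w >= needed);
  // If no variant is large enough, fall back to the largest one available.
  return sufficient.length > 0
    ? Math.min(...sufficient)
    : Math.max(...availableWidths);
}

console.log(pickVariant([320, 640, 1280], 400)); // 640
```

In practice the browser performs this selection itself when given `srcset` and `sizes` attributes; the point is to publish multiple variants so it has something to choose from.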
The amount of energy that a website consumes — and thus its carbon footprint — depends, among other factors, on the amount of data that needs to be transmitted to display the requested content to users. By leveraging the six techniques outlined above, web designers can ‘slim’ their websites and contribute to the creation of a more sustainable web whilst boosting performance and user experience in the process.
The Role of Software Development in Shaping the FinTech Industry in 2023 and Beyond
Source: Finance Derivative
Paul Blowers, Commercial Director at Future Processing
As another year passes, now is the time for company leaders to look back at the last 12 months and consider what’s in store for their FinTech businesses in 2023. One of the biggest impacts of last year was undoubtedly the cost of living crisis and increasing interest rates, leading to UK FinTech investment dropping to $9.6 billion in the first half of 2022 – down from $27.8 billion in the same period in 2021. Whilst these challenges remain at the forefront of the industry, there are plenty of innovative developments and technologies evolving in the FinTech space right now that will continue the pace of change. It’s vital for organisations to keep abreast of these trends, to ensure they can remain competitive and continue providing customers with the highest quality products and services.
Innovations in FinTech
In recent years, we have seen larger banks begin to invest more heavily in BaaS (banking as a service). BaaS is a start-to-finish process that digital banks and third parties use to connect their own business infrastructure to a bank’s system via APIs. This allows digital banks or third parties to offer full banking services directly through their non-bank business offerings. Typically, BaaS is associated with smaller banks because of the favourable interchange rates available to banks with under $10 billion in assets. With a bigger focus on commercial BaaS efforts, we can expect to see more vertical partnerships with SaaS providers who already have existing relationships with businesses.
An alternative to providing BaaS is to pursue an embedded FinTech strategy. Embedded FinTech refers to the integration of FinTech products and services into financial institutions’ websites, mobile apps, and business processes. This has been growing at pace since the COVID pandemic and is expected to continue on its upward trajectory, accelerated by eCommerce, financial digitalisation and rising consumer expectations. As a result, we can expect more platforms to diversify their service offerings as they deepen their relationships with small business customers.
Another topic that has been circulating in the FinTech sector is the rise of AI and chatbots. 2023 is set to be the year that this technology fully takes off and integrates with mainstream banks and FinTechs. Chatbots can be defined as rule-based systems which can perform routine tasks and handle general FAQs. The primary goal of these AI-driven chatbots is to provide human-like support for customers: communicating with them, introducing services, answering their questions and receiving any complaints.
Software Development for FinTechs
As banks continue to invest in new technologies and leverage the benefits of adopting BaaS, embedded finance and AI, the focus on software development services also increases. Software is at the heart of every FinTech business, and both existing and potential customers demand a high-quality implementation of each product or service. One of the biggest expectations is around user experience, as FinTech leaders aim to provide straightforward, transparent and concise solutions to their customers’ business problems. Additionally, security must not be underestimated, with the FinTech industry under constant risk of cyberattacks and breaches. With exceptional software development, FinTech solutions can comply with strict security and data encryption standards whilst offering a polished and streamlined user experience for customers.
The finance industry also comes up against a constant stream of industry regulations, meaning a compliance strategy must be a priority for FinTechs when considering their software development approach. This means regularly checking for and implementing updates to frameworks and software architectures, to ensure app responsiveness, security and performance remain at the forefront. Ultimately, great software increases a FinTech’s opportunity to leverage emerging technologies and keep control over the quality of its service. Timely identification of key trends makes it possible to maximise the digitalisation of finance to drive long-term value for FinTech businesses and their customers.
The Future of FinTech
Whilst technological developments have been major drivers of FinTech innovation, now is the time to further digitise financial services and the banking sector to build a more inclusive and efficient industry that promotes economic growth. FinTechs are stepping up to lead, navigate and disrupt the industry during this time of uncertainty, and software development will play a vital role in shaping the future landscape. With the help of software development, FinTechs will build capabilities and applications that can be easily integrated into the environments where customers are already engaged, meeting their changing needs, new business goals and regulatory demands.
Will cyberattacks be uninsurable in 2023? Three steps that financial organisations can follow now
Source: Finance Derivative
By James Blake, Field CISO of EMEA, Cohesity
The growing number of cyber attacks and the damage they cause has led to increasing demand for cyber insurance. Swiss Re Insurance expects total premiums paid to more than double, from $10 billion in 2020 to $23 billion by 2025. But this is being questioned by both insurance companies and customers: is insurance effective, is it feasible, what does it cover and what does it enable? The CEO of Zurich Insurance, Mario Greco, said in a recent interview with the Financial Times that cyberattacks will soon become “uninsurable”. Indeed, insurance and prevention have both proved ineffective in stopping cyberattacks like ransomware and in enabling organisations to recover afterwards. Instead, organisations must shift their focus onto recovery. What can companies do to meet this challenge? James Blake, Field CISO of EMEA at data management and security provider Cohesity, has three recommendations.
More than 400 million US dollars – that’s how much damage the data leak at Capital One caused in 2019. And the number of such attacks, which have catastrophic consequences for the companies affected, has continued to increase since then. According to Check Point, in the third quarter of 2022 alone, global attacks increased significantly by 28% compared to the same quarter of the previous year.
Where cyber risk used to be limited to areas such as data breaches and third-party liability, ransomware attacks have shifted the damage to core business and accountability. Cyber insurers had to react to the increased risk and have adjusted their offers, as an analysis by Swiss Re Insurance shows. According to PwC, from the insurer perspective, the fast-increasing frequency of ransomware attacks (and the growing associated impacts and ransom demands) and business interruption claims has made cyber a less profitable area of insurance in recent times. The situation has stabilised over the past year as customers have had to pay higher premiums and meet stricter terms and conditions. Swiss Re Insurance expects total premiums paid to more than double, from $10 billion to $23 billion by 2025.
More expensive and more difficult to qualify
This is bad news for the financial industry, as insurers are becoming stricter and asking for higher premiums. Cohesity’s legal experts looked at the leading ransomware insurance policies on the market at the end of 2022 and found that ultimately, such guarantees are little more than thinly veiled limitations of liability that benefit the providers – not the customers.
However, there are some measures that companies can use to protect themselves effectively in this new market situation:
- The 3-2-1 strategy remains current: keep an isolated copy of the data
In some cases, organisations are required to quarantine an offsite copy of their production records as part of a 3-2-1 strategy to qualify for cyber insurance.
To do this, they can use a SaaS service which keeps an encrypted copy of the production data in the cloud, isolated by a virtual air gap. The data stored there is monitored with multi-layered security functions and machine learning, and anomalies are reported immediately.
- Tear down silos and merge data with zero-trust in mind
In general, financial organisations should consolidate all their distributed data on a scalable data management platform and ensure they can back up their data across all their infrastructure and assets. Furthermore, the data must be protected under a zero-trust model, in which data is encrypted both in transit and at rest, and access is strictly regulated through rules and multi-factor authentication. In addition, all data stored on the platform can be managed according to compliance requirements and, thanks to immutable storage, is better protected against ransomware.
- Improve collaboration between IT and SecOps teams for cyber resiliency
In addition to these technical measures, financial organisations should optimise the collaboration between their IT and security teams and adopt a data-centric focus on cyber resilience. For too long, many security teams have focused primarily on preventing cyberattacks while IT teams have focused on protecting data including backup and recovery.
A comprehensive data security strategy must unite these two worlds and IT and SecOps teams must work together before the attack takes place. Both teams should be guided by the NIST framework. This holistic approach defines five core disciplines: Identify, Protect, Detect, Respond and Recover.
If a financial company can demonstrate such a mature data security strategy, this will not only have a positive effect on insurance cover, but will generally reduce the risk of incidents and possible consequential damage through failure or data loss.