
Business

A zero trust environment is critical for financial services

Source: Finance Derivative

Boris Bialek, Managing Director of Industry Solutions at MongoDB

Not long ago, security professionals were still protecting their IT much as mediaeval guards protected a walled city: concentrating on making it as difficult as possible to get inside. Once past this perimeter, though, access to everything within was unrestricted. For financial services, this means access to everything from personally identifiable information (PII), including credit card numbers, names and social security information, to other ‘marketable data’. Unfortunately, we have many examples of how this type of security fails: the castle gets stormed and the data isn’t protected. The most famous is still the Equifax incident, where a small breach led to years of unhappy customers.

Thankfully, the mindset has shifted, spurred on by the proliferation of networks and applications across geographies, devices and cloud platforms, which has made classic point-to-point security obsolete. The perimeter has changed; it is fluid, so reliance on a wall for protection also has to change.

Zero trust presents a new paradigm for cybersecurity. It assumes the perimeter is already breached, trusts no users, and accepts that trust cannot be gained simply through physical or network location. Every user, device and connection must be continually verified and audited.

It might seem obvious, but it bears repeating: given the amount of confidential customer and client data that financial institutions hold, not to mention the regulations they must meet, zero trust should be an even bigger priority for them. The perceived value of this data also makes financial services organisations a primary target for data breaches.

But how do you create a zero trust environment?

Keeping the data secure 

While ensuring secure access to banking apps and online services is vital, it is actually the database at the backend of these applications that is a key part of creating a zero trust environment. The database contains so much of an organisation’s sensitive and regulated information, as well as data that may not be sensitive but is critical to keeping the organisation running. This is why it is imperative that a database is ready and able to work in a zero trust environment.

As more databases become cloud-based services, a big part of this is ensuring that the database is secure by default, meaning it is secure out of the box. This takes some of the responsibility for security out of the hands of administrators, because the highest levels of security are in place from the start, without requiring attention from users or administrators. To allow access, users and administrators must proactively make changes; nothing is automatically granted.

As more financial institutions embrace the cloud, this can get more complicated. Security responsibilities are divided between the client’s own organisation, the cloud providers and the vendors of the cloud services being used. This is known as the shared responsibility model. It moves away from the classic model where IT owns hardening the servers and security, then hardens the software on top, such as the version of the database software, and then hardens the actual application code. In this model, the hardware (CPU, network, storage) is solely in the realm of the cloud provider that provisions these systems. The service provider for a Data-as-a-Service model then delivers the database hardened to the client with a designated endpoint. Only then do the actual client team and their application developers and DevOps team come into play for the actual “solution”.

Security and resilience in the cloud are only possible when everyone is clear on their roles and responsibilities. Shared responsibility recognises that cloud vendors ensure their products are secure by default, while still available, but also that organisations take appropriate steps to continue to protect the data they keep in the cloud.

Authenticate Everyone  

In banks and finance organisations, there is always a lot of focus on customer authentication, making sure that accessing funds is as secure as possible. But it is also important to make sure that access to the database on the other end is secure. An IT organisation can use any number of methods to allow users to authenticate themselves to a database. Most often that includes a username and password, but given the increased need for financial services organisations to maintain the privacy of confidential customer information, this should only be viewed as a base layer.

At the database layer, it is important to have transport layer security (TLS) and SCRAM authentication, which together ensure that traffic from clients to the database is authenticated and encrypted in transit.
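Database drivers handle SCRAM for you, but the mechanism is worth seeing: the client proves knowledge of the password without ever transmitting it, and the server verifies the proof while storing only a salted, derived key. Purely to illustrate the idea, here is a stdlib-only Python sketch of the SCRAM-SHA-256 proof (the auth message is a simplified placeholder for the real three-message exchange):

```python
import hashlib
import hmac
import secrets

def hi(password: bytes, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA-256, the "Hi" function from RFC 5802 / RFC 7677
    return hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

def hmac256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Enrolment: the server stores only the salt, iteration count and StoredKey,
# never the password itself.
password = b"correct horse battery staple"
salt, iterations = secrets.token_bytes(16), 4096
server_stored_key = hashlib.sha256(
    hmac256(hi(password, salt, iterations), b"Client Key")
).digest()

# Both sides bind the exchange to the same auth message (nonces, channel data);
# a single placeholder string stands in for the real exchange here.
auth_message = b"n=alice,r=cnonce,s=...,i=4096,c=biws,r=cnonce+snonce"

# Client side: derive ClientKey from the password, then send a proof that
# reveals neither the password nor ClientKey.
client_key = hmac256(hi(password, salt, iterations), b"Client Key")
client_signature = hmac256(hashlib.sha256(client_key).digest(), auth_message)
client_proof = xor(client_key, client_signature)  # what goes on the wire

# Server side: recover ClientKey from the proof and check its hash.
recovered = xor(client_proof, hmac256(server_stored_key, auth_message))
assert hashlib.sha256(recovered).digest() == server_stored_key
```

Because the server stores only a hash of the client key, even a stolen credential table does not let an attacker authenticate directly.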

Passwordless authentication is also something that should be considered, not just for customers but for internal teams as well. This can be done in multiple ways with the database: either with auto-generated certificates that are needed to access the database, or with advanced options for organisations that already use X.509 certificates and have a certificate management infrastructure.
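As a purely illustrative example (the hostname and certificate file name are invented), a passwordless X.509 connection string for a MongoDB deployment might look like:

```
mongodb://db.example.com:27017/?tls=true&authMechanism=MONGODB-X509&tlsCertificateKeyFile=client.pem
```

The server authenticates the client from the subject of the presented certificate rather than from a password.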

Tracking is a key component 

Financial services is a highly regulated industry, so it is also important to monitor your zero trust environment to ensure that it remains in force and encompasses your database. The database should be able to log all actions, or have functionality to apply filters to capture only specific events, users or roles.

Role-based auditing lets you log and report activities by specific roles, such as userAdmin or dbAdmin, coupled with any roles inherited by each user, rather than having to extract activity for each individual administrator. This approach makes it easier for organisations to enforce end-to-end operational control and maintain the insight necessary for compliance and reporting.
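As a hypothetical illustration of role-based audit capture (MongoDB Enterprise configuration syntax; the log path is invented), a filter that records only activity performed under a given administrative role might look like:

```yaml
# mongod configuration fragment: audit only actions taken under the
# dbAdmin role on the admin database (extend the filter for other roles).
auditLog:
  destination: file
  format: JSON
  path: /var/log/mongodb/audit.json
  filter: '{ "roles": { "role": "dbAdmin", "db": "admin" } }'
```

Filtering at the role level keeps the audit trail focused on privileged activity without extracting logs per individual administrator.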

Next level encryption

With large amounts of valuable data, financial institutions also need to make sure that they are embracing encryption – in flight, at rest and even in use. Securing data with client-side field-level encryption allows you to move to managed services in the cloud with greater confidence. The database only works with encrypted fields and organisations control their own encryption keys, rather than having the database provider manage them. This additional layer of security enforces an even more fine-grained separation of duties between those who use the database and those who administer and manage it.
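Real client-side field-level encryption uses driver-managed authenticated encryption with customer-held keys; purely to illustrate the principle that the server only ever sees ciphertext, here is a toy stdlib-only Python sketch (the hash-based XOR keystream stands in for a proper AEAD cipher such as AES-GCM and must not be used as real cryptography):

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy pseudorandom keystream: hash key||nonce||counter in counter mode.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: str) -> dict:
    nonce = secrets.token_bytes(16)
    data = plaintext.encode()
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))
    return {"nonce": nonce.hex(), "ciphertext": ct.hex()}

def decrypt_field(key: bytes, blob: dict) -> str:
    nonce = bytes.fromhex(blob["nonce"])
    ct = bytes.fromhex(blob["ciphertext"])
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct)))).decode()

key = secrets.token_bytes(32)  # held by the client; never sent to the database
doc = {"name": "Jane Doe", "ssn": encrypt_field(key, "078-05-1120")}
# The database stores doc["ssn"] only as ciphertext; database administrators
# without the key cannot read the field, only key holders can.
assert decrypt_field(key, doc["ssn"]) == "078-05-1120"
```

The separation of duties follows directly: whoever operates the database sees the document, but the sensitive field is opaque without the client-held key.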

Also, as more data is transmitted and stored in the cloud, some of it belonging to highly sensitive workloads, additional technical options to control and limit access to confidential and regulated data are needed. However, this data still needs to be usable, so ensuring that in-use data encryption is part of your zero trust solution is vital. This also enables organisations to confidently store sensitive data, meeting compliance requirements, while still enabling different parts of the business to gain access and insights from it.

Securing data is only going to continue to become more important for all organisations, but for those in financial services the stakes can be even higher. Leaving the perimeter mentality to the history books and moving towards zero trust – especially as cloud and as-a-service infrastructure permeates the industry – is the only way to protect such valuable data.


Business

Time is running out: NHS and their digital evolution journey

By Nej Gakenyi, CEO and Founder of GRM Digital

Many businesses have embarked on their digital evolution journey, transforming their technology offerings to upgrade their digital services in an effective and user-friendly way. Whilst this might be very successful for smaller and newer businesses, what does it mean for large corporations with long-standing legacy infrastructure? Recently the UK government pledged £6bn of new funding for the NHS; if executed properly, this funding and investment could revolutionise the UK public healthcare sector.

The NHS has always been a leader in technology for medical purposes, but where it has fallen down is in the streamlining of patient data, information and needs, which can lead to a breakdown in trust and a growing sense that the healthcare system is not a robust one. Therefore, the primary objective of additional funding must be to implement advanced data and digital technologies, to improve the digital health of the NHS and the overall health of the UK population, as well as to revitalise both management efficiency and working practices.

Providing digital care

Digitalisation falls into two categories when it comes to the NHS: digitising traditionally ‘physical’ services, such as offering remote appointments and keeping electronic rather than paper records, and a greater reliance on more innovative approaches driven by advances in technology. It is common knowledge that electronic services differ between GP practices across the country; a drastically good or bad experience that depends solely on a geographical lottery contradicts the very purpose of offering an overarching healthcare provision to society at large.

By streamlining services and investing in proper infrastructure, a level playing field can be created, which is vital for patients accessing both the care they need and their own personal history of appointments, GP interactions, diagnoses and medications. Through this approach, the NHS can focus on creating world-leading care and the provision of that care, and potentially see waiting lists decrease thanks to the effective diagnosis and management enabled by slick and efficient technology.

This is especially important when looking at personalised health support and developing a system that enables patients to receive care wherever they are and helps them monitor and manage long-term health conditions independently. This, alongside ensuring that technology and data collection support improvements in both individual and population-level patient care, can only serve to streamline NHS efforts and create positive outcomes for both patients and the workforce.

Revolutionising patient experiences

A robust level of trust is critical to guaranteeing the success of any business or provision. If technology fails, so does the faith that the customer or consumer has in the technology designed to improve outcomes for them. An individual will always have some semblance of responsibility and ownership over their life, well-being and health. Still, all of these key pillars can only stand strong when there is infrastructure in place to help drive positive results. Whilst there may be risks of excluding some groups of individuals with a digital-first approach, technology solutions can empower people to take control of their healthcare, enabling the patient and NHS to work together.

Tandem efforts between humans and technology

Technology must work in tandem with a workforce for it to be effective. This means the NHS workforce must be digitally savvy and have patient-centred care at the front and centre of all operations. Alongside any digital transformation the NHS adopts to improve patient outcomes, comes the need to assess current and future capability and capacity challenges, and build a workforce with the right skills to help shape an NHS that is fit for purpose.

This is just the beginning. With more investment and funding being allocated to the NHS, this is the starting point, but for NHS decision-makers to ensure real benefits for patients, more still needs to be done. Effective digital evolution holds the key. Once the NHS has fully harnessed the power of new and evolving technologies to change patient experiences throughout the UK, with consistent communication and care, this will set the UK apart and mark the NHS as a driving example of accessible, digital healthcare.


Business

Driving Business Transformation Through AI Adoption – A Roadmap for 2024

Author: Edward Funnekotter, Chief Architect and AI Officer at Solace

From the development of new products and services to the establishment of competitive advantages, artificial intelligence (AI) can fundamentally reshape business operations across industries. However, each organisation is unique, and navigating the complexities of AI while applying the technology efficiently and effectively can be a challenge.

To unlock the transformational potential of AI in 2024 and integrate it into business operations in a seamless and productive way, organisations should seek to follow these five essential steps:

  • Prioritise Data Quality and Quantity

The usefulness of AI models is directly correlated with the quantity and quality of the data used to train them, which makes effective integration solutions and strong data governance practices essential. Organisations should seek to implement tools that provide a wealth of clean, accessible and high-quality data to power quality AI.

Equally, AI systems cannot be effective if an organisation has data silos. These impede the ability for AI to digest meaningful data, and then provide the insights that are needed to drive business transformation. Breaking down data silos needs to be a business priority – with investment in effective data management, and an application of effective data integration solutions.

  • Develop your own unique AI platform

The development of AI applications can be a laborious process, impacting the value that businesses are gaining from them in the immediate term. This can be expedited by platform engineering, which modernises enterprise software delivery to facilitate digital transformation, optimising developer experience and accelerating the ability to deliver customer value for product teams. The use of platform engineering offers developers pre-configured tools, pre-built components and automated infrastructure management, freeing them up to tackle their main objective; building innovative AI solutions faster.

While the development of AI applications that can help streamline infrastructure, automate tasks, and provide pre-built components for developers is the end goal, it’s only possible if the ability to design and develop is there in the first place. Gartner’s prediction that Platform Engineering will come of age in 2024 is a particularly promising update.

  • Put business objectives at the heart of AI adoption – can AI deliver?

Any significant business change needs to be managed strategically, with a clear indication of the aims and benefits it will bring. While a degree of experimentation is always necessary to drive business growth, it shouldn’t come at the expense of operational efficiency.

Before onboarding AI technologies, look internally at the key challenges your business is facing and ask: “how can AI help to address this?” You may wish to enhance the customer experience, streamline internal processes or use AI systems to optimise internal decision-making. Be sure the application of AI is going to help, not hinder, you on this journey.

Also remember that AI remains in its infancy, and cannot be relied upon as a silver bullet for all operational challenges. Aim to build a sufficient base knowledge of AI capabilities today, and ensure these are contextualised within your own business requirements. This ensures that AI investments aren’t made prematurely, providing an unnecessary cost.

  • Don’t be limited by legacy systems

Owing to the complex mix of legacy and/or siloed systems that organisations employ, they may be restricted in their ability to use real-time and AI-driven operations to drive business value. For example, IDC found that only 12% of organisations connect customer data across departments.

Amidst the ‘AI data rush’ there will be a greater need for event-driven integration; however, only an enterprise architecture pattern will ensure new and legacy systems are able to work in tandem. Without this, organisations will be prevented from offering seamless, real-time digital experiences that link events across departments, locations, on-premises systems and IoT devices, in a cloud or even multi-cloud environment.

  • Leverage real-time technology

Keeping up with the real-time demands of AI can pose a challenge for the legacy data architectures used by many organisations. Event mesh technology, an approach to distributed networks that enables real-time data sharing and processing, is a proven way of reducing these issues. By applying event-driven architecture (EDA), organisations can unlock the potential of real-time AI, with automated actions and informed decision-making driven by relevant insights.

By applying AI in this way, businesses can offer stronger, more personalised experiences, including the delivery of specialised offers, real-time recommendations and tailored support based on customer requirements. An example of this is predictive maintenance, in which AI analyses and anticipates future problems or business-critical failures before they affect operations, and dedicates the correct resources to fix the issue immediately. By implementing EDA as a ‘central nervous system’ for your data, not only is real-time AI possible, but adding new AI agents becomes significantly easier.
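As a minimal sketch of the event-driven pattern described above (topic names, event fields and the alert threshold are all invented for illustration), a publish/subscribe bus lets a predictive-maintenance consumer react to sensor events without being coupled to their producers:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process stand-in for an event mesh: producers publish
    events by topic; consumers subscribe without knowing the producers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler subscribed to this topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
alerts = []

# A hypothetical predictive-maintenance consumer: flag high vibration readings.
def on_vibration(event):
    if event["rms"] > 0.8:
        alerts.append(event)

bus.subscribe("sensor.vibration", on_vibration)

bus.publish("sensor.vibration", {"machine": "pump-7", "rms": 0.93})
bus.publish("sensor.vibration", {"machine": "pump-8", "rms": 0.12})
assert alerts == [{"machine": "pump-7", "rms": 0.93}]
```

In a production event mesh the bus would span processes, sites and clouds, but the decoupling shown here is the same: new AI consumers attach by subscribing to a topic, with no change to the systems emitting the events.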

Ultimately, AI adoption needs to be strategic, avoiding chasing trends and focusing instead on how and where the technology can deliver true business value. By following the steps above, organisations can ensure they are leveraging the full transformative benefit of AI and driving business efficiency and growth in a data-driven era.

AI can be a highly effective tool. However, its success depends on organisations applying it strategically, to meet clearly defined and specific business goals.


Auto

Preparing for the Surge: Meeting the MCS Requirements of Electric Trucks

John Granby, Director of eTruck & Van, EO Charging and Erik Kanerva, Sales Director at Kempower

Auto electrification is moving at a rapid pace, with electric vehicles (EVs) going from a passion project for early technology adopters to the mainstream – especially when you consider the need to electrify consumer and commercial vehicles ahead of the government’s 2035 Zero Emission Vehicle mandate.

Electrification is also starting to play a vital role in public policy and commercial plans, improving vehicle availability and driving growing interest among prospective commercial fleet customers. As a result, all of the main car and van manufacturers have a respectable EV offering, and the eBus industry is well on its way to a similarly credible proposition.

Heavy-duty vehicle electrification has progressed slowly, but the pace has picked up over the last year, with several of the major truck manufacturers testing completely electric heavy trucks that are now near-ready to enter the general market.

This is a critical shift in the move towards net zero, given that heavy commercial vehicles account for around 25% of CO2 emissions from road transport in the EU and approximately 6% of the region’s overall emissions. It’s a similar situation in the US, where medium and heavy-duty trucks account for around 29% of total road transport emissions, or approximately 7% of the country’s total, yet make up fewer than 5% of all vehicles on the road.

Having clear goals and objectives in place for fleet electrification will be vital to keeping the transport sector on track. For example, Scania’s goal is for 50% of all vehicles it sells annually to be electric by 2030. Although Scania was the slowest into the market with battery electric vehicles, other vehicle manufacturers are pursuing the same target: Volvo Trucks has also set itself a target of 50% fully electric vehicles by 2030, as has Renault.

Meeting this ambitious goal will require the appropriate charging infrastructure in place so customers have the confidence to invest in the large-scale electrification of their fleets. That is one of the reasons why charging system manufacturer Kempower expects the commercial vehicle DC charging market in Europe and North America to have a 37% compound annual growth rate until 2030.

Trucks require substantial battery packs to provide a range similar to traditional engines, and having the right infrastructure in place to keep them regularly charged is certainly a key factor to consider when electrifying truck fleets. According to the European Automobile Manufacturers’ Association (ACEA), trucks will require up to 279,000 charging outlets by 2030, with 84% located in fleet hubs. Buses will require up to 56,000 charging outlets by 2030, with fleet hubs accounting for 92% of the total.

The Charging Interface Initiative (CharIN) is a global organisation that has been working on a standard for the rapid charging of trucks for several years. CharIN developed the Megawatt Charging System (MCS) concept, which serves as the foundation for the ISO and IEC standards which govern the design, installation, and operation of truck fast charging infrastructures.

The MCS is intended to standardise the rapid delivery of very large amounts of charging power to vehicles and to provide more robust communication, which minimises downtime caused by unsuccessful charging events.

Commercial vehicle drivers follow particular driving patterns. By taking advantage of the break time required under the hours-of-service rules governing drivers, operators can travel further each day thanks to the increased charge rate that MCS offers. Legislation mandating rest breaks thus enables better electrification of commercial vehicles: shorter charging durations that fit within these breaks are exactly what is needed.

The MCS will operate at up to 3,000 A and 1.25 kV at its final development stage, delivering up to 3.75 MW of charging power. With the backing of a significant segment of the industry, MCS is founded on an international consensus on technical standards. An internationally recognised standard is essential to promote harmonised solutions that reduce costs and boost interoperability without sacrificing safety and uptime.
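The headline power figure follows directly from P = V × I; a quick check:

```python
# MCS figures from the text: up to 3,000 A at up to 1.25 kV
current_a = 3_000
voltage_v = 1_250
power_mw = current_a * voltage_v / 1e6  # watts -> megawatts
assert power_mw == 3.75
```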

Trucks on the highway, not only depot charging, are a key focus of the MCS. This is of particular relevance to large truck units operating long-haul routes and to some smaller rigid trucks operating cross-border short-haul deliveries, such as logistics organisations running deliveries between the United Kingdom and continental Europe.

Most MCS charging occurs while drivers take breaks from their routes, but some depots may keep a single MCS charger on site for a flash charge when a truck needs to be turned around quickly. Because such a unit requires a power supply of at least 1 MW, load management is crucial to balance its demand against the other chargers on site.
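A minimal sketch of that load-management idea (all names and capacities are invented for illustration): serve chargers in priority order, with the MCS flash charger first, capping the total at the site's grid connection:

```python
def allocate_power(site_capacity_kw: float, chargers: list) -> dict:
    """Greedy load management: chargers is a list of (name, demand_kw)
    in descending priority order; each is served up to its demand until
    the site's grid capacity is exhausted."""
    allocation, remaining = {}, site_capacity_kw
    for name, demand_kw in chargers:
        grant = min(demand_kw, remaining)
        allocation[name] = grant
        remaining -= grant
    return allocation

# Hypothetical depot: a 1.5 MW grid connection, one priority MCS flash
# charger drawing 1 MW, and three ordinary 350 kW bays.
site = allocate_power(1500, [
    ("mcs-flash", 1000),
    ("bay-1", 350),
    ("bay-2", 350),
    ("bay-3", 350),
])
assert site == {"mcs-flash": 1000, "bay-1": 350, "bay-2": 150, "bay-3": 0}
```

Real systems would rebalance continuously as vehicles arrive and state of charge changes, but the core constraint, never exceeding the grid connection while honouring priorities, is the one shown here.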

Fleet operators should consider incorporating MCS into their whole charging ecosystem and solutions, whether they are thinking about how electrification will affect their fleet on the road or how their depots will operate.

Adopting cutting-edge energy management technology solutions will enable effective fleet electrification, particularly at depots. Investing in effective load management technologies will be critical to maximising existing grid infrastructure capacity while decreasing the need for additional investments in generation or distribution capacity.

Investing in and deploying effective energy management technologies is the key to a smoother, more efficient shift for commercial fleet operators, and critical in lowering costs, both economic and environmental.

Energy management solutions for charging electric fleets will also help maximise existing grid capacity, reducing the need to invest in new generation or distribution capacity. This will be an essential factor for fleet managers to consider as eTruck fleets expand and other commercial vehicle fleets, such as buses, increase demands on infrastructure.

With unprecedented energy and investment going into electrification, 2024 looks to be a pivotal year for picking up the momentum of progress around MCS in the logistics sector. If done right, it will create a shift of optimism in the market to accelerate the electrification of commercial fleets and promises to positively impact other sectors, such as marine and aviation, contributing significantly to reducing carbon emissions.


Copyright © 2021 Futures Parity.