

A zero trust environment is critical for financial services

Source: Finance Derivative

Boris Bialek, Managing Director of Industry Solutions at MongoDB

Not long ago, security professionals were still focused on protecting their IT much as mediaeval guards protected a walled city – concentrating on making it as difficult as possible to get inside. Once past that perimeter, though, access to everything within was unrestricted. For financial services, that means access to everything from personally identifiable information (PII) – credit card numbers, names, social security details – to other highly ‘marketable’ data. Unfortunately, we have many examples of how this type of security fails: the castle gets stormed and the data isn’t protected. The most famous is still the Equifax incident, where a breach led to years of unhappy customers.

Thankfully, the mindset has shifted, spurred on by the proliferation of networks and applications across geographies, devices and cloud platforms, which has made classic point-to-point security obsolete. The perimeter has changed – it is fluid – so reliance on a wall for protection also has to change.

Zero trust presents a new paradigm for cybersecurity. In this context, it is assumed that the perimeter has already been breached, no users are trusted, and trust cannot be gained simply through physical or network location. Every user, device and connection must be continually verified and audited.

It might seem obvious, but it bears repeating: given the amount of confidential customer and client data that financial institutions hold – not to mention the regulations they must meet – zero trust should be an even bigger priority for them. The perceived value of this data also makes financial services organisations a primary target for data breaches.

But how do you create a zero trust environment?

Keeping the data secure 

While securing access to banking apps and online services is vital, it is actually the database at the backend of these applications that is a key part of creating a zero trust environment. The database contains much of an organisation’s sensitive and regulated information, as well as data that may not be sensitive but is critical to keeping the organisation running. This is why it is imperative that a database is ready and able to work in a zero trust environment.

As more databases become cloud-based services, a big part of this is ensuring that the database is secure by default – in other words, secure out of the box. This takes some of the responsibility for security out of the hands of administrators, because the highest levels of security are in place from the start without requiring attention from users or administrators. To allow access, users and administrators must proactively make changes – nothing is granted automatically.
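
As a minimal sketch of that posture – using Python’s pymongo driver as one example, with placeholder hosts, names and credentials – an administrator explicitly creates an application account with only the narrow role it needs:

from pymongo import MongoClient

# Hypothetical admin connection; the host and credentials are placeholders.
client = MongoClient("mongodb://admin:admin-secret@db.example.internal:27017/?tls=true")

# Explicitly grant the reporting application read-only access to a single
# database; no other privileges are implied or inherited.
client["admin"].command({
    "createUser": "reporting_app",
    "pwd": "change-me",
    "roles": [{"role": "read", "db": "payments"}],
})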

As more financial institutions embrace the cloud, this can get more complicated. Security responsibilities are divided between the client’s own organisation, the cloud providers and the vendors of the cloud services being used – a division known as the shared responsibility model. This moves away from the classic model in which IT hardens the servers, then the software running on top of them – say, the database software – and then the actual application code. In the shared model, the hardware (CPU, network, storage) is solely the realm of the cloud provider that provisions those systems. The service provider for a data-as-a-service model then delivers the database to the client already hardened, with a designated endpoint. Only then do the client’s own application developers and DevOps team come into play for the actual “solution”.

Security and resilience in the cloud are only possible when everyone is clear on their roles and responsibilities. Shared responsibility recognises that cloud vendors must ensure their products are secure by default while remaining available, and that organisations must take appropriate steps to continue to protect the data they keep in the cloud.

Authenticate Everyone  

In banks and finance organisations there is always a lot of focus on customer authentication – making sure that accessing funds is as secure as possible. But it is also important to make sure that access to the database at the other end is secure. An IT organisation can use any number of methods to allow users to authenticate themselves to a database. Most often that means a username and password, but given the heightened need for financial services organisations to maintain the privacy of confidential customer information, this should only be viewed as a base layer.

At the database layer, it is important to have transport layer security (TLS) and SCRAM authentication, which together ensure that traffic from clients to the database is authenticated and encrypted in transit.
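
As a brief illustrative sketch – the driver (pymongo), host, credentials and certificate path below are placeholders, not a prescription – a client connection that is both encrypted with TLS and authenticated with SCRAM-SHA-256 might look like this:

from pymongo import MongoClient

# TLS encrypts traffic in transit; SCRAM-SHA-256 authenticates the client.
# Host, credentials and certificate paths are placeholders.
client = MongoClient(
    "mongodb://app_user:app_password@db.example.internal:27017/payments",
    tls=True,
    tlsCAFile="/etc/ssl/corp-ca.pem",
    authSource="payments",
    authMechanism="SCRAM-SHA-256",
)
print(client["payments"]["accounts"].estimated_document_count())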

Passwordless authentication is also something that should be considered – not just for customers but for internal teams as well. This can be done in multiple ways with the database: either with auto-generated certificates that are required to access the database, or with advanced options for organisations that already use X.509 certificates and have a certificate management infrastructure.
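
A sketch of the certificate-based option, again with placeholder paths and hosts: the client presents an X.509 certificate instead of a password, and the certificate’s subject maps to a database user created for that purpose.

from pymongo import MongoClient

# Passwordless, certificate-based authentication (MONGODB-X509).
# Paths and host are placeholders; the client certificate's subject must
# correspond to a database user set up for X.509 authentication.
client = MongoClient(
    "mongodb://db.example.internal:27017/",
    tls=True,
    tlsCertificateKeyFile="/etc/ssl/app-client.pem",
    tlsCAFile="/etc/ssl/corp-ca.pem",
    authMechanism="MONGODB-X509",
    authSource="$external",
)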

Tracking is a key component 

Because financial services is a highly regulated industry, it is also important to monitor your zero trust environment to ensure that it remains in force and encompasses your database. The database should be able to log all actions, or have functionality to apply filters so that only specific events, users or roles are captured.

Role-based auditing lets you log and report activities by specific roles, such as userAdmin or dbAdmin, coupled with any roles inherited by each user, rather than having to extract activity for each individual administrator. This approach makes it easier for organisations to enforce end-to-end operational control and maintain the insight necessary for compliance and reporting.
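
For a self-managed deployment, a filtered, role-based audit trail of the kind described above might be configured along these lines – a sketch only, with illustrative paths and roles that should be checked against your database’s own documentation:

auditLog:
  destination: file
  format: JSON
  path: /var/log/mongodb/audit.json
  # Capture only events generated by users holding the dbAdmin role on the
  # admin database, rather than logging every administrator individually.
  filter: '{ roles: { role: "dbAdmin", db: "admin" } }'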

Next level encryption

With large amounts of valuable data, financial institutions also need to make sure that they are embracing encryption – in flight, at rest and even in use. Securing data with client-side field-level encryption allows you to move to managed services in the cloud with greater confidence. The database only works with encrypted fields and organisations control their own encryption keys, rather than having the database provider manage them. This additional layer of security enforces an even more fine-grained separation of duties between those who use the database and those who administer and manage it.
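
As a purely conceptual sketch – not any vendor’s actual field-level encryption API – the principle is that the application encrypts sensitive fields with a key it controls before the data ever reaches the database, so the server only ever stores and returns ciphertext:

from cryptography.fernet import Fernet

# Conceptual illustration only: the application holds the key (ideally fetched
# from a KMS it controls), so the database never sees the plaintext value.
field_key = Fernet.generate_key()
cipher = Fernet(field_key)

document = {
    "customer_id": "C-1042",
    "name": "Jane Doe",
    "national_id": cipher.encrypt(b"123-45-6789"),  # stored encrypted at field level
}

# Only a client holding the key can recover the original value.
plaintext = cipher.decrypt(document["national_id"]).decode()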

Also, as more data is transmitted and stored in the cloud – including some highly sensitive workloads – additional technical options to control and limit access to confidential and regulated data are needed. However, this data still needs to be used, so ensuring that in-use data encryption is part of your zero trust solution is vital. This also enables organisations to confidently store sensitive data and meet compliance requirements, while still allowing different parts of the business to access it and gain insights from it.

Securing data is only going to continue to become more important for all organisations, but for those in financial services the stakes can be even higher. Leaving the perimeter mentality to the history books and moving towards zero trust – especially as cloud and as-a-service infrastructure permeates the industry – is the only way to protect such valuable data.



How will regulations affect the open banking sector?

Source: Finance Derivative

Martin Hartley – Group CCO of emagine Consulting 

Comments on the future of the open banking sector and how it will affect the UK market. 

“The UK open banking sector is still primarily driven by regulation. In my view, two of the major current regulatory drivers will remain at the forefront: the CMA (Competition and Markets Authority) order, which mandated that the major banks provide open banking access to authorised third-party providers, and PSD2 (the Second Payment Services Directive), which set the standards for secure data sharing. Cybersecurity regulations will only increase in importance, as will Brexit-related changes, since any divergence between UK and EU standards could impact open banking.

“Over the coming months, increased data sharing through open banking will put added pressure on cybersecurity, likely creating a surge in the sector once again.

“I expect ongoing scrutiny and efforts to enhance data protection measures, potentially leading to more stringent cybersecurity regulations being adopted by businesses. I expect to see more partnerships between traditional banks and FinTechs or consultancy firms as they collaborate to enhance cybersecurity or offer innovative services to plug the gap. Conversely, there could be consolidation within the FinTech industry as companies merge to gain market share.

“When it comes to the size of the business and how it is affected, history has shown us that there are certainly positives and negatives of being an SMB when responding to new regulations. On the positive side, they can leverage their agility and they will have a more personal relationship with their customers, potentially leading to a higher level of trust. However, SMBs may face challenges due to their limited budgets and resources. The larger firms will have much larger budgets, allowing them to have more advanced IT systems and IT security, making it easier for them to integrate APIs and develop the necessary infrastructure.

“The benefits of open banking are endless, and the UK Government is showing their forward-thinking mentality in exploring the idea of implementing the technology to streamline wider services. But, much like anything, there are always pros and cons.

“Open banking would simplify payments for public services, making transactions quicker and more convenient for everyone. As it relies on APIs and authentication protocols, open banking would make payments more secure for the public and it would allow access to digital payments for members of the public who have smartphones but possibly no bank accounts. For any digital implementation, it goes without saying that we need to be aware of the risk of cyber attacks and data breaches. These, combined with the exclusion of non-tech savvy individuals, could mean that certain members of the public may not embrace the change, which poses a risk. There is also the additional cost of providing the infrastructure and this will have to be managed carefully to avoid burdening the taxpayer.

“We have already seen digital transformations in areas such as the GOV.UK Pay system, and there are two main indicators of the success of any digital implementation: adoption rates and incidents. There haven’t been any high-profile incidents hitting the headlines in recent times, which to me is a huge positive and provides a level of confidence. It would be interesting to see how many government departments and agencies have adopted GOV.UK Pay for their payment processing needs, to understand the system’s usefulness and acceptance within government. The government must be committed to continuous improvement, ensuring that the system continues to comply with regulations, and consciously driving the adoption rate to at least 90% of government departments and agencies.

“A favourable regulatory environment will encourage more banks and third-party providers to participate in open banking initiatives, leading to growth in the UK market and positioning the nation as industry leaders.”



Advancing green mobility for a sustainable future

Accelerating decarbonisation, transitioning to SDVs and reshaping urban ecosystems are helping revolutionise the global automotive industry

By Amit Chadha, CEO & Managing Director, L&T Technology Services

The world is changing. There is an urgent need for a transition toward sustainable practices to combat the threat of climate change. As global temperatures rise and weather patterns evolve, achieving net-zero emissions by 2050 could still help prevent irreversible damage to our planet.

With global carbon emission levels continuing to rise at an accelerated rate, there is growing momentum toward addressing the situation on a war footing. As the most visible source of emissions, the automotive industry – and, consequently, the future of mobility – is in focus. By helping accelerate decarbonisation, reshape evolving urban ecosystems and redefine the global automotive industry, we can help reverse the trend and preserve our shared future.

Green mobility has emerged as a major enabler in this direction. Leading stakeholders are becoming increasingly invested in developing a deeper understanding of the multifaceted realm of green mobility and its potential to shape a sustainable future.

Accelerating decarbonisation: A global mandate

Decarbonising the transportation sector is crucial to mitigate the harmful effects of climate change. Fossil fuel-based vehicles are responsible for a substantial portion of carbon dioxide emissions, exacerbating the greenhouse effect. To accelerate decarbonisation, governments and businesses today need to prioritise the adoption of clean, renewable energy sources, such as electricity and hydrogen, for powering vehicles and other modes of public transportation.

Automakers, recovering from the impact of the pandemic and global supply chain disruptions, are therefore exploring new avenues to meet the rising demand for electric mobility. Electric vehicles (EVs), by eliminating the need for fossil fuel-powered engines, play a vital role in improving overall air quality and have emerged as a promising solution for reducing carbon emission levels. They are capable of meeting the diverse needs of all kinds of drivers and offer affordable mobility and maintenance options. Recent advancements in battery technology, along with the growing availability of charging infrastructure and incentives for adoption, have led to a significant rise in the popularity of EVs.

However, to achieve widespread adoption of electric vehicles, there is a need to address key issues such as battery disposal, supply chain sustainability, and equitable access to EV technology.

Reshaping urban ecosystems: Driving the frontiers of change

Urban areas are central to the momentum around green mobility transformation. As growing global populations gravitate towards cities, congestion, pollution and the limited availability of green spaces have emerged as major challenges. As a result, cities must increasingly reinvent themselves to promote sustainable mobility and improve the quality of life for their residents.

Smart technologies and vertical green systems can contribute to a reduction in the energy demands of buildings by providing shade and insulation, mitigating urban heat islands, and cooling down public spaces. They also enable carbon sequestration, a reduction in pollution levels, and improvements in biodiversity.

Implementing efficient transportation systems, such as buses and trains powered by clean energy, can further reduce individual vehicle usage, traffic congestion, and emissions. Pedestrian-friendly infrastructures, cycling lanes, and micro-mobility solutions like e-scooters and bike-sharing programs can further help promote eco-friendly transportation choices. At a macro-infra level, smart city technologies and data-driven urban planning practices are helping optimise traffic flow, reduce idling times, and minimise fuel consumption.

Integrating green mobility into urban ecosystems is therefore a win-win proposition – fostering cleaner air, enhanced mobility options, and healthier communities.

From a public health perspective, improved air quality can drive a decline in respiratory and cardiovascular diseases linked to air pollution. Healthier citizens translate to a more productive workforce and reduced healthcare costs, further strengthening the growing impetus for vehicle electrification. The shift towards vehicle electrification offers significant economic benefits, including greater job creation, enhanced research and development, and greater investments in sustainable innovations. A consequent reduction in the demand for fossil fuels, scarce in terms of availability and mostly imported, in turn, helps enhance energy security and stabilise fuel prices.

Software Defined Vehicles: Pioneering the change

The global automotive industry is at the core of driving the emerging frontiers of green mobility. Traditional automakers and new entrants are racing to produce eco-friendly vehicles, and this competitive spirit, in turn, is transforming the industry landscape.

Automakers worldwide need to embrace sustainable practices by reducing their carbon footprint during the production process and implementing circular economy principles. Moreover, investing in research and development of alternative materials and manufacturing processes can lead to lighter, more energy-efficient vehicles. The rise of autonomous vehicles presents an opportunity to optimise transportation networks, enhance traffic flow, and reduce accidents. Leveraging this technology, in combination with electric and shared mobility solutions, can lead to a more sustainable and efficient future for transportation.

Software will play a key role here, delivering a streamlined passenger and driver experience while ensuring conformity with evolving regulatory standards. With Software Defined Vehicles (SDVs) increasingly becoming a focus area for major automakers worldwide, the future will see greater demand for digital engineering services to unlock new value streams.

The importance of ecosystem partnerships

Automotive industry stakeholders are already working with ER&D partners who can deliver across the value chain and understand each of the key parameters in the EV/SDV ecosystem. However, approaching separate vendors for product conceptualisation, design and development, testing, maintenance, manufacturing and after-sales support can increase costs and complexities.

An ER&D partner, equipped with multi-industry expertise, digital engineering capabilities, and a co-innovation commitment, can help drive transformation initiatives for transportation enterprises, overcoming technology constraints with cross-vertical learnings. Leveraging global delivery capabilities, the partner can also provide computing models that consume less energy, boost performance, and optimise data-led algorithms. In addition, they can enable scalable software stacks that leverage sensors and physical components to provide the safety and performance that electric vehicles need.

ER&D companies are also increasingly being called upon to help redefine focus areas with software, ensuring third-party integration, driving feature deployment, enabling CloudOps and fast over-the-air updates. The rising complexities within the connected car landscape further call for adopting software-defined designs that can overcome multi-layered challenges – ranging from development to subsequent deployment, maintenance, and updates.

A multi-stakeholder approach

Achieving the goal of green mobility demands collaboration among various stakeholders. Governments play a crucial role in enacting policies and regulations that incentivise the adoption of sustainable practices and technologies. Subsidies for EVs, emission standards, and urban planning regulations are some of the ways governments can drive the transition towards greener mobility.

Private sector involvement is equally critical. Corporate sustainability initiatives, investment in research and development, and partnerships for innovative mobility solutions can accelerate the transformation. Additionally, consumer awareness and support for eco-friendly practices are essential in shaping market demands and influencing business decisions.

Advancing green mobility is a pivotal step towards a sustainable future. By accelerating decarbonisation, embracing the transition to SDVs, reshaping urban ecosystems and revolutionising the automotive industry, we can combat climate change on a significant front. The collective efforts of governments, industries and individuals are crucial in driving this transformation.

Embracing green mobility is therefore not just about reducing emissions, but rather about fostering a healthier, cleaner, and more resilient world. It is about our common future – striving together toward a prosperous, inclusive, and sustainable tomorrow.



How Turning Your Core Data into a Product Drives Business Impact

By Venki Subramanian, SVP of Product Management at Reltio

Data drives efficiencies, improves customer experience, enables companies to identify and manage risks, and helps everyone from human resources to sales make informed decisions. It is the lifeblood of most organisations today. Sometime during the last few years, however, organisations turned a corner from embracing data to fearing it as the volume spiralled out of control. By 2025, for example, it is estimated that the world will produce 463 exabytes of data daily compared to 3 exabytes a decade ago.

Too much enterprise data is locked up, inaccessible, and tucked away inside monolithic, centralised data lakes, lake houses, and warehouses. Since almost every aspect of a business relies on data to make decisions, accessing high-quality data promptly and consistently is crucial for success. But finding it and putting it to use is often easier said than done.

That’s why many organisations are turning to “distributed data” and creating “data products” to solve these challenges, especially for core data, which is any business’s most valuable data asset. Core data, or master data, refers to the foundational datasets that are used by most business processes and fall into four major categories – organisations, people (individuals), locations, and products. A data product is a reusable dataset used by analysts or business users for specific needs. Most organisations are undergoing massive digital and cloud transformations. Putting high-quality core data at the centre of these transformations – and treating it as a product – can yield a significant return on investment.

The Inefficiency of Monolithic Data Architectures

Customer data is one example of core or master data that firms rely on to generate outstanding customer experiences and accelerate growth by providing better products and services to consumers. However, leveraging core customer data becomes extremely challenging without timely, efficient access. The data is often trapped inside monolithic, centralised data storage systems, which can result in incomplete, inaccurate or duplicative information. Once hailed as the saviour of the data storage and management challenge, monolithic systems escalate these problems as the volume of data expands and the need to make data-driven decisions becomes more urgent.

The traditional approach to addressing these data challenges entails extracting the data from the systems of record and moving it to different data platforms, such as operational data stores, data lakes or data warehouses, before generating use-case-specific views or data sets. The creation of use-case-specific data sets, which are then consumed by use-case-specific technologies, only adds to the overall inefficiency of the process.

One inefficiency arises from the complexity of such a landscape, which involves the movement of data from many sources to various data platforms, the creation of use case-specific data sets, and the use of multiple technologies for consumption. Core data for each domain, such as customer, is duplicated and reworked or repackaged for almost every use case instead of producing a consistent representation of the data used across various use cases and consumption models – analytical, operational, and real-time.

There’s also a disconnect between the people who own the data and the subject matter experts who need it for decision-making. Data stewards and scientists understand how to access data, move it around and create models, but they’re often unfamiliar with the specific use cases in the business. In other words, they’re experts in data modelling, not finance, human resources, sales, product management or marketing. They’re not domain experts and may not understand the information needed for specific use cases, leading to frustration and data going unused. It’s estimated, for example, that 20% or fewer of the data models created by data scientists are deployed.

Distributed Data Architecture – An Elegant Solution to a Messy Problem

The broken promises of monolithic, centralised data storage have led to the emergence of a new approach called “distributed” data architectures, such as data fabric and data mesh. A data mesh can create a pipeline of domain-specific data sets, including core data, and deliver it promptly from its source to consuming systems, subject matter experts, and end users.

These data architectures have arisen as a viable solution for the issues created by inaccessible data locked away in siloed systems or the rigid monolithic data architectures of the past. A data mesh decentralises the management and governance of data sets. It follows four core principles – domain ownership of data, treating data as a product and applying product principles to it, enabling a self-serve data infrastructure, and ensuring federated governance. These principles help data product owners create data products based on the needs of various data consumers, and help data consumers learn what data products are available and how to access and use them. Data quality, observability, and self-service capabilities for discovering data and metadata are built into these data products.

The concept of data products is helpful both for analytics and artificial intelligence and for general business uses. In either case the idea is the same: the dataset can be reused without a major investment of time or resources. This can dramatically reduce the amount of time spent finding and fixing data. Data products can also be updated regularly, keeping them fresh and relevant. Some legacy companies have reported increased revenues or cost savings of over $100 million.

Trusted, Mastered Data as a Product

Data product owners have to create data products for core data to enable its activation for key initiatives and support various consumption models in a self-serve manner. The typical pattern that all these data pipelines enable can be summarised into the following three stages – collect, unify, and activate.

The process starts with identifying the core data sets – data domains like customer or product – and defining a unified data model for these. Then, data product owners need to identify the first-party data sources and the critical third-party data sets used to enrich the data. This data is assembled, unified, enriched, and provided to various consumers via APIs so that the data can be activated for various initiatives. Product principles such as the ability to consume these data products in a self-service manner, customise the base product for various usage scenarios, and deliver regular enhancements to the data are built into such data products.
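
A minimal sketch of that collect-unify-activate pattern, with hypothetical sources, field names and enrichment data – a real pipeline would of course involve proper connectors, matching rules and published APIs:

from dataclasses import dataclass, asdict
from typing import Optional

# Hypothetical unified model for a "customer" core-data product.
@dataclass
class CustomerRecord:
    customer_id: str
    name: str
    email: str
    segment: Optional[str] = None  # enrichment from a third-party source

def collect():
    """Collect: pull raw records from first- and third-party sources (stubbed here)."""
    crm_rows = [{"id": "C-1", "name": "Acme Ltd", "email": "ops@acme.example"}]
    third_party = {"ops@acme.example": {"segment": "mid-market"}}
    return crm_rows, third_party

def unify(crm_rows, third_party):
    """Unify: map each source record onto the shared model and enrich it."""
    return [
        CustomerRecord(
            customer_id=row["id"],
            name=row["name"],
            email=row["email"],
            segment=third_party.get(row["email"], {}).get("segment"),
        )
        for row in crm_rows
    ]

def activate(records):
    """Activate: expose the unified records to consumers, e.g. behind an API."""
    return [asdict(record) for record in records]

print(activate(unify(*collect())))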

Data product owners can use this framework to map out key company initiatives, identify the most critical data domains, and pin down the features (data attributes, relationships, etc.) and the first- and third-party data sources that need to be assembled – creating a roadmap of data products and aligning them to the business impact and value delivered.

With data coming from potentially hundreds of applications and the constantly evolving requirements of data consumers, poor-quality data and slow, rigid architecture can cost companies in many ways, from lost business opportunities to regulatory fines to reputational risk from poor customer experience. That’s why organisations of all sizes and types need a modern, cloud-based master data management (MDM) approach that can enable the creation of core data as products. A cloud-based MDM can reconcile data from hundreds of first- and third-party sources and create a single trusted source of truth for an entire organisation. Treating core data as a product – as the strategic asset it is – helps businesses unlock its immense potential to drive business impact.

