How Combining Data Mesh and Data Fabric with Modern MDM Turbocharges Digital Transformation

Ansh Kanwar, Senior Vice President of Technology at Reltio

“Getting information into the hands of experts who can drive maximum value from it remains one of the most significant barriers to digital transformation today. It’s an old problem, and it is growing bigger by the minute as the volume of applications and silos expands exponentially. One study, for example, predicts that enterprise data will increase at a 42% annualised growth rate over the next two years.

Despite having this information, many organisations are no closer to realising the data-driven holy grail. Respondents in a recent survey, for example, reported that their organisations are making more “gut instinct” decisions than they did just a year earlier.

That’s why there has been so much buzz recently around the potential of data “virtualisation” to fix these problems. Data virtualisation encompasses information-architecture concepts such as data mesh and data fabric. Virtualisation approaches do not, however, solve the most significant point of friction most enterprises face: creating accurate, trustworthy core data in real time. Combining modern MDM solutions with a virtual approach can unleash the power of data and finally lead an organisation into the data-driven promised land.

Data Virtualisation Seeks to Solve the Silo Problem

Data silos are growing and creating big challenges for enterprises. Data is also constantly moving and morphing: practitioners transform it, systems transform it, and much of it moves to the cloud, including cloud data warehouses, cloud data lakes, SaaS solutions, platform solutions, and master data management (MDM) solutions like Reltio. There is a secular, massive movement to the cloud as part of a broad trend toward digital transformation. As this march to the cloud continues, however, so too does the proliferation of apps, which collect more data every single day.

With so many data sources and storage locations, whether on-prem or in the cloud, accessing the data in an organised, structured and usable way remains problematic. Data virtualisation seeks to solve that by seamlessly delivering organised, actionable and reusable data sets from data warehouses, data lakes, SaaS applications and other sources, and placing them in the hands of the right people for making critical business decisions. It can help organisations put the correct data in front of their experts, enabling data-led decision-making and a competitive edge. These approaches aim to deliver fast access to actionable, domain-specific and reusable data products across the entire enterprise landscape to fuel analytics and other data-driven initiatives.

Enterprises are also demanding increasing speed. It’s a competitive advantage for companies to move from raw data to insight to action faster than their competitors. Innovation teams are demanding the ability to experiment faster, especially with machine learning, tweaking models to constantly optimise or feed the growth engine.

This is where data mesh and data fabric help bridge the gap from raw data pools to actionable data. So what are they?

Moving Data into the Hands of Domain Experts

Data lake architecture models have common failure modes that lead to unfulfilled promises at scale. Monolithic, centralised data often resides outside the organisational domain that needs it, and the teams that manage the data storage and information pipelines aren’t well-versed in the organisational domains that require quality, actionable data. Data mesh is a concept that moves information out of centralised lakes and warehouses and puts it into the hands of domain and subject matter experts. In this construct, data is treated as a product and owned by domain experts. Data fabric is more akin to metadata: it is a catalogue system that identifies what information is available. Fabric can help domain experts and analysts determine where data can be used.
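To make the distinction concrete, here is a minimal Python sketch, with all names, locations and tags invented for illustration, of a domain-owned data product (the mesh idea) alongside a fabric-style catalogue that lets analysts discover data without knowing where it physically lives:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """A domain-owned data set, treated as a product (the data mesh idea)."""
    name: str
    owner_domain: str               # the team accountable for quality and meaning
    location: str                   # where the data physically resides
    tags: list = field(default_factory=list)

class Catalogue:
    """A fabric-style metadata layer: it knows what exists and where."""
    def __init__(self):
        self._products = {}

    def register(self, product: DataProduct):
        self._products[product.name] = product

    def find(self, tag: str):
        """Discover data sets by tag, without knowing their storage location."""
        return [p for p in self._products.values() if tag in p.tags]

# Hypothetical registrations: sales owns customer data, procurement owns suppliers.
catalogue = Catalogue()
catalogue.register(DataProduct("customer_360", "sales",
                               "s3://lake/sales/customers", ["customer"]))
catalogue.register(DataProduct("supplier_master", "procurement",
                               "mdm://core/suppliers", ["supplier"]))

hits = catalogue.find("customer")   # the fabric answers "what customer data exists?"
```

The design point is that ownership metadata travels with the data product, so a consumer found through the catalogue always knows which domain is accountable for the data’s quality.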

Data mesh and fabric patterns apply to analytics, though fabric is more generic and stretches into operational use cases. Mesh and fabric approaches can be complementary and can be used together to build a mature enterprise-wide data program. Getting data into the hands of the domains that need it will help organisations move toward the promised land of digital transformation.

MDM: Addressing the Root Cause of Untrustworthy Data

Data virtualisation concepts are catching on because the problem they seek to solve continues to hamstring businesses in every sector: getting value from their own data. But organising, packaging and making data more readily available does not solve the data quality problem most enterprises have. Virtualisation moves data through a more efficient pipeline, but it doesn’t fix the data itself.

Domain experts and business users not only need access to specific data, they also need better-quality information to make informed decisions. Core data, in particular, is the lifeblood of any enterprise: information about customers, vendors, locations, assets and suppliers, among other things. It is the data every organisation runs on, plus whatever information is vital to its specific market segment. The problem organisations have with core data is that it can reside in many different silos, and it is often inaccurate, outdated, or duplicated elsewhere.

This is why organisations undergoing digital transformations are frequently frustrated: poor-quality core data is slowing them down, so they spend more time and resources fixing those problems than gaining insight. Hence the recent trend of data-rich organisations reverting to “gut instinct” decision-making: it’s easier to trust your gut than to trust data you know is wrong. Simply migrating to a cloud data warehouse does not make that problem go away.

Clean Connected Data is the Foundation for any Data Architecture

Data virtualisation is here to stay. It promises to unlock the value of enterprise data and deliver on the promise of evolving into a data-driven organisation. These concepts will help solve the challenges organisations have today and will scale with them as they collect more data and create more silos. Mastered core data, however, remains the foundational element of any virtualised data approach. That’s why we’re seeing MDM evolve from a reluctant spend into an indispensable one. Every organisation is becoming a data-driven organisation, which requires domain expertise and high-quality, actionable information to make sound business decisions, satisfy its customers, and create more enterprise value.”


Adapt or fall behind: why embracing data-centric technology is key for investment firms

Source: Finance Derivative

By Murray Campbell, Product Manager at AutoRek

The investment sector has often relied on conventional procedures and stringent regulations. However, coping with obsolete legacy software can impede an organisation’s growth and development. Despite being aware of these challenges, investment companies worldwide tend to persist with these systems due to the perceived high cost and complexity of implementing modern technology.

As technology continues to advance and the world becomes more digitally dependent, there is increasing pressure on firms to ensure their buy-side operating model is as efficient as possible. While investment firms have typically prioritised the front-end of their product, the back-office is equally important as this is the engine that drives any organisation. This is particularly key in today’s rapidly evolving markets where significant rewards await businesses that can successfully deliver innovation and efficiency within their organisation.

The unforeseen costs of manual processes

When investment firms operate independently, they often end up using various platforms that offer similar functions. However, this approach results in an accumulation of expensive, disjointed systems, leading to inefficient workflows, high costs, and the need to maintain multiple vendor relationships. Such inefficiencies can hinder a firm’s ability to adapt to new market challenges and demands, which can be a major problem for companies in the long term.

The lack of suitable IT systems is the most common operational challenge UK investment businesses face. Many firms struggle with reliance on manual processes, an absence of suitable solutions in the market, or a lack of resources to invest in such solutions. In the dynamic realm of data management, the choice of tools and solutions is crucial for steering business decision-making and operational efficiency. Investors want faster, more personalised customer experiences, and investment firms need to focus on providing seamless journeys, even in the face of economic turbulence and increasing regulatory requirements.

One area where organisations can greatly benefit from advanced technology is by reducing their dependency on spreadsheets. Currently, many buy-side investment managers are still reconciling data in spreadsheets or using generic platforms that lack key features. In fact, more than nine in 10 agree that their firm relies too heavily on manual tasks and spreadsheets, meaning that the UK investment management industry still has some distance to go to remove reliance on manual reconciliations. Relying on outdated methods can be a costly mistake.

The expansion of the digital economy, increasing transactional volumes, and ever-changing regulatory obligations have made it necessary to adopt more sophisticated solutions. Excel, for instance, lacks key controls and has limited auditability, making it almost impossible to track and evidence actions. As a result, organisations end up spending more resources and money fixing errors, leading to higher costs in the long run. Transitioning to more advanced solutions is therefore crucial to ensuring data accuracy, integrity, and scalability as firms continue to grow and evolve.

How is automation changing the investment industry?

In the current digital age, management of complex operations is heavily reliant on automation. With the help of data-driven insights, automation can enable investment managers to make informed decisions, identify market trends, and optimise portfolio performance. By automating tasks such as validations and cash transfers, investment managers can ensure that data-related tasks are executed with speed and accuracy, freeing up their time to focus on activities where their human expertise and creativity can add more value.
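As a simple illustration of the kind of validation task being automated, here is a minimal Python sketch of a reconciliation that records an audit trail for every decision, the traceability that a spreadsheet process lacks. The record shapes and amounts are entirely hypothetical:

```python
from datetime import datetime, timezone

def reconcile(ledger, statement):
    """Match ledger records to statement records by transaction id.

    Returns the ids that fail to match ("breaks") plus an audit log
    recording what was checked, the outcome, and when.
    """
    statement_by_id = {rec["id"]: rec for rec in statement}
    breaks, audit_log = [], []
    for rec in ledger:
        match = statement_by_id.get(rec["id"])
        status = "matched" if match and match["amount"] == rec["amount"] else "break"
        if status == "break":
            breaks.append(rec["id"])
        audit_log.append({
            "id": rec["id"],
            "status": status,
            "checked_at": datetime.now(timezone.utc).isoformat(),
        })
    return breaks, audit_log

# Hypothetical data: the statement disagrees with the ledger on T2.
ledger = [{"id": "T1", "amount": 100.0}, {"id": "T2", "amount": 250.0}]
statement = [{"id": "T1", "amount": 100.0}, {"id": "T2", "amount": 205.0}]
breaks, log = reconcile(ledger, statement)   # breaks == ["T2"]
```

A human then investigates only the breaks, while the timestamped log provides the evidence trail that regulators and auditors expect.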

According to a recent report by AutoRek, UK-based investment managers claim they are continuing to invest in automation, with 100% of respondents either maintaining or increasing their automation expenditure in the years ahead. Continued investment in automation is promising given firms remain too reliant on manual processes, particularly when it comes to reconciliations. Nevertheless, successful implementation isn’t about adopting every automation tool available. Instead, companies should focus on strategically selecting applications and carefully refining processes that are in line with their corporate objectives and unique requirements.

Act now or fall behind

The promise of emerging technologies lies in the ability to unlock new insights and improve productivity. But to use this technology effectively, modern infrastructure that can capture and validate large volumes of data in a scalable manner is required. Replacing manual processes with end-to-end automation can drive significant benefits for investment firms as it presents an opportunity to eliminate much of the friction around reconciliations, reduce operating costs, and liberate staff from repetitive manual tasks.

To conclude, the integration of data-centric technology is crucial. If investment firms want to remain competitive and innovative they must keep up with the demands of fast-moving markets. They must clear their data clutter and evolve quickly – or risk being left behind.

Why email marketing remains one of the best forms of digital marketing

Crafting a strong email marketing strategy involves a real balance between creativity and data-driven decision-making, which is just one of the roles undertaken by marketing and data company Go Live Data on behalf of its many clients.

Guiding some of the biggest corporates in the UK, including Amazon Business, AXA and Premierline Business Insurance, Adam Herbert, CEO of Go Live Data, advises on the key components of a successful email campaign and why, as one of the most effective marketing tools available, email still plays a crucial role in digital marketing:

Forming a direct means of communication, email provides two-way access between businesses and their customers. It may sound obvious, but unlike social media or other digital channels, email allows marketers to reach their audience directly in their inbox, which is where individuals are most likely to engage with the content they’re shown.

Email also offers a high return on investment, consistently delivering one of the highest ROIs of any form of digital marketing, including PPC and advertising. According to studies, the average return is around £40 for every £1 spent, which is huge, and is largely down to email’s low cost, its ability to drive conversions, and its power to retain customers.

What’s more, with email segmentation and the many personalisation techniques available, marketers can tailor their messages to specific groups within their audience based on demographics, behaviours, interests, and purchase history, making campaigns not only highly targeted but personalised too. The key is to deliver relevant content to subscribers, which allows marketers to increase engagement and conversions, as well as customer satisfaction.

Specific platforms allow for automation, giving marketers the ability to set up automated workflows triggered by user actions. This means marketers can deliver timely, relevant messages at scale and nurture leads, guiding customers efficiently through the sales funnel.

Emails are also an excellent way to build customer relationships by nurturing them over time. By consistently delivering valuable content, exclusive offers, and personalised recommendations, businesses can strengthen the bond with their audiences and increase brand loyalty. Email provides two-way communication, allowing customers to send feedback, ask questions, and engage with a brand directly.

They are also a great way to drive traffic to your website, blog, social media, or any other digital channel connected to your business. By including compelling calls-to-action (CTAs) and relevant content, you can encourage subscribers to take action, such as making a purchase, signing up for a webinar, or downloading a resource, which in turn drives conversions and revenue for your business.

Email platforms offer substantial analytics and reporting functions that enable marketers to track the performance of their campaigns in real time. Monitoring key metrics such as open rates, click-through rates, conversion rates, and revenue generated allows marketers to measure the effectiveness of their campaigns and make data-driven decisions to optimise and plan future activity.
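As an illustration, the key metrics above can be derived from raw campaign counts. Exact definitions vary by platform (some compute click-through against opens rather than sends), and the figures below are hypothetical; this minimal Python sketch uses one common convention:

```python
def campaign_metrics(sent, opened, clicked, converted):
    """Compute standard email KPIs from raw campaign counts.

    Rates here follow one common convention: open and click-through
    rates are fractions of emails sent; click-to-open and conversion
    rates are fractions of the preceding funnel step.
    """
    return {
        "open_rate": opened / sent,
        "click_through_rate": clicked / sent,
        "click_to_open_rate": clicked / opened if opened else 0.0,
        "conversion_rate": converted / clicked if clicked else 0.0,
    }

# Hypothetical campaign: 10,000 sends, 2,500 opens, 500 clicks, 50 purchases.
m = campaign_metrics(sent=10_000, opened=2_500, clicked=500, converted=50)
# m["open_rate"] == 0.25 and m["click_through_rate"] == 0.05
```

Tracking these ratios per segment over time is what turns the platform’s raw reporting into the data-driven optimisation the article describes.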

Overall, email is an integral component of any digital marketing strategy, and by leveraging it effectively, businesses can engage their audience, nurture leads, drive sales, and ultimately grow.

Conflicting with compliance: How the finance sector is struggling to implement GenAI

By James Sherlow, Systems Engineering Director, EMEA, for Cequence Security

Generative AI has multiple applications in the finance sector, from product development to customer relations to marketing and sales. In fact, McKinsey estimates that GenAI has the potential to improve operating profits in the finance sector by 9 to 15%, while in banking, productivity gains could be worth 3 to 5% of annual revenues. It suggests AI tools could be used to boost customer liaison, with AI integrated through APIs to give real-time recommendations either autonomously or via CSRs; to inform decision-making and expedite day-to-day tasks for employees; and to decrease risk by monitoring for fraud or elevated instances of risk.

However, McKinsey also warns of inhibitors to adoption in the sector. These include the level of regulation applicable to different processes, which is fairly low for customer relations but high for credit risk scoring, for example, and the data used, some of which is in the public domain but some of which comprises highly sensitive personally identifiable information (PII). If these issues can be overcome, the analyst estimates GenAI could more than double the application of expertise to decision-making, planning and creative tasks, from 25% without it to 56%.

Hamstrung by regulations

Clearly the business use cases are there, but unlike other sectors, finance is currently being hamstrung by regulations that have yet to catch up with the AI revolution. Unlike the EU, which approved the AI Act in March, the UK has no plans to regulate the technology; instead, it intends to promote guidelines. The UK Financial Authorities, comprising the Bank of England, the PRA, and the FCA, have been canvassing the market on what these should look like since October 2022, publishing the results (FS2/23 – AI and Machine Learning) a year later. The responses showed strong demand for harmonisation with the likes of the AI Act as well as NIST’s AI Risk Management Framework.

Right now, this means financial providers find themselves in regulatory limbo. In cyber security, for instance, firms are being presented with GenAI-enabled solutions that can assist with incident detection and response, but they’re unable to use that functionality because it contravenes compliance requirements. Decision-making processes are a key example: these must be made by a human, tracked and audited, and while the decision-making capabilities of GenAI may be on a par, accountability remains a grey area. Consequently, many firms are erring on the side of caution and choosing to deactivate AI functionality within their security solutions.

In fact, a recent EY report found one in five financial services leaders did not think their organisation was well positioned to take advantage of the potential benefits. Much will depend on how easily the technology can be integrated into existing frameworks, although the Banking on AI: Financial Services Harnesses Generative AI for Security and Service report cautions this may take three to five years. That’s a long time in the world of GenAI, which has already come a long way since it burst onto the market 18 months ago.

Malicious AI

The danger is that while the sector drags its heels, threat actors will show no such qualms and will be quick to capitalise on the technology to launch attacks. FS2/23 makes the point that GenAI could drive an increase in money laundering and fraud, through the use of deepfakes, for instance, and sophisticated phishing campaigns. We’re still in the learning phase, but the expectation is that high-volume, self-learning attacks will arrive by the end of the year. These will be on an unprecedented scale, because GenAI will lower the technological barrier to entry, enabling new threat actors to enter the fray.

Simply blocking attacks will no longer be a sufficient defence, because attackers using GenAI can regroup or pivot automatically without employing additional resources. Looking at how APIs, which are intrinsic to customer services and open banking, are currently protected, the emphasis has been on detection and blocking; going forward, we can expect deceptive response to play a far greater role. Deception frustrates and exhausts the attacker’s resources, making attacks cost-prohibitive to sustain.

So how should the sector look to embrace AI given the current state of regulatory flux? As with any digital transformation project, there needs to be oversight of how AI will be used within the business, with a working group tasked with developing an AI framework. In addition to NIST, a number of standards can help here, such as ISO/IEC 22989, ISO/IEC 23053, ISO/IEC 23894 and ISO/IEC 42001, along with the oversight framework set out in DORA (the Digital Operational Resilience Act) for third-party providers. The framework should cover the tools the firm has with AI functionality, their possible applications and use cases, the risks associated with them, and how the firm will mitigate any areas of high risk.

Taking a proactive approach makes far more sense than suspending the use of AI, which effectively places firms at the mercy of adversaries who will be quick to take advantage of the technology. These are tumultuous times, and we can certainly expect AI to rewrite the rulebook when it comes to attack and defence. But firms must get to grips with how to integrate the technology rather than electing to switch it off and carry on as usual.

Copyright © 2021 Futures Parity.