Designing Customer Experiences for the Metaverse
By Kaj van de Loo, CTO of UserTesting
It’s clear that people find change difficult, and it never ceases to amaze me how resistant we are to new technologies. The entire concept of the internet was derided as a passing fad and smartphones were expected to crash and burn. What once seemed outlandish is now ubiquitous. So what’s next?
The metaverse is the latest innovation set to change the face of modern life. These 3D worlds, powered by virtual reality (VR) headsets, offer untold potential–from the opportunity to connect remote working teams in immersive meetings that mirror real life, to the ability to practise surgical techniques with real hand movements, without any risk to real patients. Think Habbo Hotel with major tech updates.
While most of us didn’t expect Habbo Hotel to make a comeback, this time virtual worlds are here to stay, thanks to major investments from Facebook (now Meta), Microsoft and Google, among many others.
As the virtual world gains traction, brands are beginning to consider how to become early adopters of this next digital frontier. But amidst the hype, brands must stop to consider how they can create the best possible customer experience (CX)–and how they can avoid making predictable mistakes.
A common mistake many companies make is taking the current experience they provide and simply replicating it on a new channel. Most experiences are designed and optimised for a specific channel and developed to meet the audience’s needs. This ‘lift and shift’ strategy does not factor in the inherent differences between channels–not to mention the fact that subsegments of audiences gravitate towards different channels.
In this case, this technique is particularly dangerous as the immersive, virtual nature of the metaverse is vastly different from existing experiences, such as in-store shopping and smartphone apps. In addition, the metaverse is currently cutting-edge technology, so is not widely used by everyday consumers, meaning audiences found in virtual reality most likely are significantly different to a brand’s core audience.
It follows that the brands that will see the most success in the metaverse in these early stages are those whose customers are already using virtual reality. Companies targeting younger, tech-savvy consumers have a considerable advantage. On the other hand, those whose core market is pensioners will struggle to gain traction in the metaverse at this stage–it doesn’t matter how good the experience is if the customers aren’t there.
Not only do metaverse audiences look different from core audiences, they also expect a different experience. It’s important for companies to consider the edge the metaverse can provide. For example, a travel firm stands to benefit by offering immersive virtual tours of destinations and hotels. Meanwhile, in the finance sector, it’s difficult to envision how the metaverse can enhance the experience offered by existing online and app banking facilities, aside from helping those in extended reality worlds claim or represent ownership of digital items like non-fungible tokens (NFTs).
The retail industry has already undergone significant digitalisation with the advent of online shopping. Customers are being converted, thanks to the undeniable benefits like the ease of browsing multiple brands at once and the ability to use highly refinable search functions, not to mention shopping from the comfort of the home. However, it can be a challenge to really “see” a product online, leaving many customers frustrated with perceived (or real) discrepancies in size, texture, colour and quality–hence the popularity of ‘internet shopping fails’ videos. The metaverse has the potential to solve this problem by allowing customers to examine products virtually, giving a better, more accurate indication of the product before purchase.
While it is hard to see the applications of virtual reality technologies for some industries, it’s clear the metaverse offers significant potential for others. However, brands should proceed with caution. Rather than ‘lifting and shifting’, companies should design experiences to take advantage of the platform’s capabilities. For some sectors, this may mean creating a brand new experience. Any company which simply moves an existing experience into a new channel will fail to build customer empathy.
Brands should also test early and test often. To build an excellent experience, companies really need to understand their target audience. By testing with and talking to the right audiences, brands can tap into valuable insights that help cultivate and optimise the customer experience. Video-based feedback platforms like UserTesting capture the perspectives and experiences of an individual in narrative form to help companies build greater customer empathy and a deeper understanding of their audience. They can get feedback on everything from early ideas to the actual experience–which allows teams to gather the insight needed to customise experiences that overcome specific pain points, creating a truly excellent customer experience.
In just a few years, the metaverse has transitioned from the stuff of futuristic sci-fi fantasy to legitimate technology that is already more widespread than we think–for example, many schools are already incorporating ‘VR goggles’ into learning experiences. With another few years under its belt, the metaverse could be a part of our everyday lives. So it’s important brands start considering future opportunities for incorporating the channel into their marketing mix and keep their finger on the pulse. But it won’t be easy: success in the metaverse will rely on building customer empathy into the core of any offering.
Enhancing cybersecurity in investment firms as new regulations come into force
Source: Finance Derivative
Christian Scott, COO/CISO at Gotham Security, an Abacus Group Company
The alternative investment industry is a prime target for cyber breaches. February’s ransomware attack on global financial software firm ION Group was a warning to the wider sector. Russia-linked LockBit Ransomware-as-a-Service (RaaS) affiliate hackers disrupted trading activities in international markets, with firms forced to fall back on expensive, inefficient, and potentially non-compliant manual reporting methods. Not only do attacks like these put critical business operations under threat, but firms also risk falling foul of regulations if they lack a sufficient incident response plan.
To ensure that firms protect client assets and keep pace with evolving challenges, the Securities and Exchange Commission (SEC) has proposed new cybersecurity requirements for registered advisors and funds. Codifying previous guidance into non-negotiable rules, these requirements will cover every aspect of the security lifecycle and the specific processes a firm implements, encompassing written policies and procedures, transparent governance records, and the timely disclosure of all material cybersecurity incidents to regulators and investors. Failure to comply with the rules could carry significant financial, legal, and national security implications.
The proposed SEC rules are expected to come into force in the coming months, following a notice and comment period. However, businesses should not drag their feet in making the necessary adjustments – the SEC has also introduced an extensive lookback period preceding the implementation of the rules, meaning that organisations should already be proving they are meeting these heightened demands.
For investment firms, regulatory developments such as these will help boost cyber resilience and client confidence in the safety of investments. However, with a clear expectation that firms should be well aligned to the requirements already, many will need to proactively step up their security oversight and strengthen their technologies, policies, end-user education, and incident response procedures. So, how can organisations prepare for enforcement and maintain compliance in a shifting regulatory landscape?
In today’s complex, fast-changing, and interconnected business environment, the alternative investment sector must continually take account of its evolving risk profile. Additionally, as organisations shift towards more distributed and flexible ways of working, traditional protection perimeters are dissolving, rendering firms more vulnerable to cyber-attack.
As such, the new SEC rules provide firms with additional instruction around specific, prescriptive requirements. Organisations need to implement and maintain robust written policies and procedures that closely align with ground-level security issues and industry best practices, such as the NIST Cybersecurity Framework. Firms must also be ready to gather and present evidence that proves they are following these watertight policies and procedures on a day-to-day basis. With much less room for ambiguity or assumption, the SEC will scrutinise security policies for detail on how a firm is dealing with cyber risks. Documentation must therefore include comprehensive coverage of business continuity planning and incident response.
As cyber risk management comes increasingly under the spotlight, firms need to ensure it is fully incorporated as a ‘business as usual’ process. This involves the continual tracking and categorisation of evolving vulnerabilities – not just from a technology perspective, but also from an administrative and physical standpoint. Regular risk assessments must include real-time threat and vulnerability management to detect, mitigate, and remediate cybersecurity risks.
Another crucial aspect of the new rules is the need to report any ‘material’ cybersecurity incidents to investors and regulators within a 48-hour timeframe – a small window for busy investment firms. Meeting this tight deadline will require firms to quickly pull data from many different sources, as the SEC will demand to know what happened, how the incident was addressed, and its specific impacts. Teams will need to be assembled well in advance, working together seamlessly to record, process, summarise, and report key information in a squeezed timeframe.
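Meeting a 48-hour clock is far easier if the relevant facts are captured in one place from the moment of detection. As a minimal sketch (the field names, dates, and incident details below are illustrative assumptions, not an SEC-mandated schema), an incident record might aggregate what happened, how it was addressed, and its impacts, and track the disclosure deadline like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Assumed disclosure window per the proposed rules: 48 hours
REPORTING_WINDOW = timedelta(hours=48)

@dataclass
class IncidentReport:
    """Collects what regulators will ask for: what happened,
    how it was addressed, and its specific impacts."""
    detected_at: datetime
    description: str
    remediation: str
    impacts: list = field(default_factory=list)

    def reporting_deadline(self) -> datetime:
        return self.detected_at + REPORTING_WINDOW

    def hours_remaining(self, now: datetime) -> float:
        return (self.reporting_deadline() - now).total_seconds() / 3600

# Hypothetical incident for illustration
incident = IncidentReport(
    detected_at=datetime(2023, 5, 1, 9, 0),
    description="Ransomware detected on reporting server",
    remediation="Host isolated; restored from clean backup",
    impacts=["Trade reporting delayed by 6 hours"],
)
print(incident.hours_remaining(datetime(2023, 5, 2, 9, 0)))  # 24.0
```

Keeping the record structured from the outset means the team assembled in advance can summarise and report without first hunting for data.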
Funds and advisors will also need to provide prospective and current investors with updated disclosures on previously disclosed cybersecurity incidents over the past two fiscal years. With security leaders increasingly being held to account over lack of disclosure, failure to report incidents at board level could even be considered an act of fraud.
Organisations must now take proactive steps to prepare and respond effectively to these upcoming regulatory changes. Cybersecurity policies, incident response, and continuity plans need to be written up and closely aligned with business objectives. These policies and procedures should be backed up with robust evidence that shows organisations are actually following the documentation – firms need to prove it, not just say it. Carefully thought-out policies will also provide the foundation for organisations to evolve their posture as cyber threats escalate and regulatory demands change.
Robust cybersecurity risk assessments and continuous vulnerability management must also be in place. The first stage of mitigating a cyber risk is understanding the threat – and this requires in-depth real-time insights on how the attack surface is changing. Internal and external systems should be regularly scanned, and firms must integrate third-party and vendor risk assessments to identify any potential supply chain weaknesses.
Network and cloud penetration testing is another key tenet of compliance. By imitating how an attacker would exploit a vantage point, organisations can check for any weak spots in their strategy before malicious actors attempt to gain an advantage. Due to the rise of ransomware, phishing, and other sophisticated cyber threats, social engineering testing should be conducted alongside conventional penetration testing to cover every attack vector.
It must also be remembered that security and compliance are the responsibility of every person in the organisation. End-user education is a necessity as regulations evolve, as are multi-layered training exercises. This means bringing in immersive simulations, tabletop exercises, and real-world examples of security incidents to inform employees of the potential risks and the role they play in protecting the company.
To successfully navigate the SEC cybersecurity rules – and prepare for future regulatory changes – alternative investment firms must ensure that security is woven into every part of the business. They can do this by establishing robust written policies and adhering to them, conducting regular penetration testing and vulnerability scanning, and ensuring the ongoing education and training of employees.
Gearing up for growth amid economic pressure: 10 top tips for maintaining control of IT costs
Source: Finance Derivative
By Dirk Martin, CEO and Founder of Serviceware
Three years on from the pandemic, economic pressure continues to mount. With the ongoing threat of a global recession looming, inflation rising, and supply chain disruption continuing to take its toll, cutting costs and optimizing budgets remains a top priority amongst the C-suite. Amid such turbulence, the Chief Financial Officer (CFO) and Chief Information Officer (CIO) stand firmly at the business’s helm, not only to steady the ship but to steer it into safer, more profitable waters. These vital roles have been pulled into the spotlight in recent years, with new hurdles and challenges constantly thrown their way. This spring, for example, experts expect British businesses to face an energy-cost cliff edge as the winter support package set out by the government is replaced.
Whilst purse strings are being drawn ever tighter to overcome these obstacles, there is no denying that the digitalization and innovation spurred on by the pandemic are still gaining momentum. In fact, according to Gartner, four out of five CEOs are increasing digital technology investments to counter current economic pressures. Investing in a digital future driven by technologies such as the cloud, artificial intelligence (AI), blockchain, and the Internet of Things (IoT), however, comes at a cost, and to afford it, funds must be released through effective optimization of existing assets.
With that in mind, and with the deluge of cost and vendor data descending on businesses that adopt these technologies, never has it been more important for CIOs and CFOs to have a complete, detailed, and transparent view of all IT costs. In doing so, business leaders can not only identify the right investment areas but also increase the performance of existing systems and technology to tackle the impact of spiralling running costs.
Follow the below 10 steps to gain a comprehensive, detailed and transparent overview of all IT costs to boost business performance and enable your IT to reach the next level.
1: Develop an extensive IT service and product catalogue
The development of an IT service and product catalogue is the most effective way to kick-start your cost-optimization journey. This catalogue should act as a precise overview of all individual IT services and what they entail, directly linking IT service costs to IT service performance and value. By offering a clear set of standards as to what services are available and what they comprise, consumers can gain an understanding of the costs and value of the IT services they deploy.
2: Monitor IT costs closely
By mastering the value chain – a concept that visualises the flow of IT costs from their most basic units through to realised business units and capabilities – businesses can keep track of where IT costs stem from. With the help of service catalogues, benchmarks, and a digital-value cost model in IT Financial Management (ITFM) solutions – often referred to as Technology Business Management (TBM) solutions – comprehensive access to this data can be guaranteed, creating a ‘cost-to-service flow’ that identifies and controls IT costs.
3: Determine IT budget management
Knowledge of IT cost allocation is a vital factor when making informed spending decisions and adjustments to existing budgets. There are, however, different approaches that can be taken to this, including centralized, decentralized, and iterative. A centralized approach means that the budget is determined in advance and distributed to operating cost centres and projects in a top-down process, allowing for easy, tight budget allocation. A decentralized approach reverses this process: operating costs are precisely calculated before budgets and projects are determined. Both approaches come with their own risks: a centralized approach may overlook projects that offer potential growth opportunities, while a decentralized approach may produce budget demands that exceed available resources.
The iterative approach tries to unify both methods. Although the most lucrative approach, it also requires the most resources. So, the chosen approach is very much dependent on the available resources, and the enterprise’s structural organization.
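The difference between the two directions can be sketched in a few lines (the cost-centre names, weights, and figures here are hypothetical):

```python
# Centralized: a fixed total is distributed top-down by agreed weights.
def centralized_allocation(total_budget, weights):
    weight_sum = sum(weights.values())
    return {cc: total_budget * w / weight_sum for cc, w in weights.items()}

# Decentralized: the budget is whatever the bottom-up cost estimates
# add up to, which may exceed the funds actually available.
def decentralized_budget(cost_estimates):
    return sum(cost_estimates.values())

top_down = centralized_allocation(
    900_000, {"Infrastructure": 5, "Apps": 3, "Support": 1}
)
# {'Infrastructure': 500000.0, 'Apps': 300000.0, 'Support': 100000.0}

bottom_up = decentralized_budget(
    {"Infrastructure": 520_000, "Apps": 340_000, "Support": 110_000}
)
# 970000 – more than the 900,000 available, illustrating the decentralized risk
```

An iterative approach would alternate between the two: distribute a provisional total top-down, compare it with the bottom-up estimates, and reconcile the gaps.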
4: Defining ‘run’ vs ‘grow’ costs
Before IT budget can be allocated, costs should be split into two distinct categories: running costs (i.e. operating costs) and costs for growing the business (i.e. products or services used to transform or grow the business). Once these categories have been defined, decisions should be made on how the budget should be split between them. A 70% run/30% grow split is fairly typical across most enterprises, but there is no one-size-fits-all approach, and this decision should be centred around the businesses’ overall strategies and end goals.
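As a simple sketch of the split (the 70/30 ratio is the typical figure above; the total is illustrative):

```python
def split_budget(total, run_share=0.70):
    """Divide an IT budget into 'run' (keep the lights on)
    and 'grow' (transform the business) portions."""
    run = total * run_share
    grow = total - run
    return run, grow

run, grow = split_budget(2_000_000)  # typical 70/30 split
# run = 1,400,000.0 and grow = 600,000.0
```

Adjusting `run_share` makes the strategic choice explicit: a growth-focused business might deliberately push the ratio towards 60/40.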
5: Ensuring investments result in a profit
By carrying out the aforementioned steps, complete transparency can be achieved over which products and services are offered, where IT costs stem from, and where budgets are allocated. From here, organizations can review how much of the IT budget is being used and where costs lead to profits and losses. If the profit margin is positive, the controlling processes can be further optimized; if it is negative, appropriate and timely corrective measures can be initiated.
6: Staying on top of regulation
For a company that operates internationally (e.g. one that markets IT products and services abroad), it is extremely important to stay on top of country-specific compliance and adhere to varying international tax rules. To do so correctly, it is necessary to provide correct transfer price documentation. This requires three factors:
- Transparent analysis and calculation of IT services based on the value chain
- Evaluation of the services used and the associated billing processes
- Access to the management of service contracts between providers and consumers as the legal basis for IT services.
7: Stay competitive
Closely linked to the profit margin mentioned in step five is the question of how to price IT services in order to stay competitive whilst avoiding losses. This begins with benchmark data, which can be researched or determined using existing ITFM solutions that automatically extract it from different, interconnected databases. From there, a unit cost calculation can be used to define exactly what individual IT services – and their preliminary products – cost. This allows organizations to easily compare internal unit cost calculations with benchmarks and competitor prices before making pricing decisions.
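The comparison itself is straightforward arithmetic. A minimal sketch (the service, volumes, and benchmark price are hypothetical):

```python
def unit_cost(total_service_cost, units_delivered):
    """Cost per delivered unit of an IT service (e.g. per managed desktop)."""
    return total_service_cost / units_delivered

def benchmark_delta(internal_unit_cost, benchmark_unit_cost):
    """Relative gap to the market benchmark: positive means more expensive."""
    return (internal_unit_cost - benchmark_unit_cost) / benchmark_unit_cost

cost = unit_cost(120_000, 400)       # 300.0 per unit
delta = benchmark_delta(cost, 250)   # 0.2 -> 20% above the benchmark price
```

A positive delta signals either a pricing problem or a cost-driver problem, which is exactly what the next step addresses.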
8: Identify and manage key cost drivers
Another aspect of IT cost control that is streamlined via a comprehensive assessment of the cost-to-service flow is the identification and management of the main IT cost drivers. A properly modelled value chain makes it clear which IT services, associated preliminary products, and cost centres incur the greatest costs and why. This analysis allows for precise adjustments to expenditure and helps to avoid misunderstandings about cost drivers. Using this as a basis, strategies can be developed to reduce IT costs effectively and make better use of expensive resources.
9: Showback/Chargeback IT costs
By controlling IT costs using the value chain, efficient usage-based billing and invoicing of IT services and products can be achieved. If IT costs are visualized transparently, they can easily be assigned to IT customers, therefore increasing the clarity of the billing process, and providing opportunities to analyze the value of IT in more detail. When informing managers and users about their consumption there are two options: either through the ‘showback’ process – highlighting the costs generated and how they are incurred – or through the ‘chargeback’ process, in which costs incurred are sent directly to customers and subcontractors.
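Under either model, the underlying allocation is usage-proportional. A minimal sketch (the business units and consumption figures are hypothetical):

```python
def allocate_costs(total_cost, usage_by_consumer):
    """Usage-based showback/chargeback: each consumer is assigned
    a share of the cost proportional to its measured consumption."""
    total_usage = sum(usage_by_consumer.values())
    return {
        consumer: total_cost * usage / total_usage
        for consumer, usage in usage_by_consumer.items()
    }

# Illustrative consumption (e.g. compute hours) per business unit
bill = allocate_costs(90_000, {"Trading": 600, "Operations": 300, "HR": 100})
# {'Trading': 54000.0, 'Operations': 27000.0, 'HR': 9000.0}
```

In a showback process the resulting figures are simply reported to each unit; in a chargeback process they become actual internal invoices.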
10: Analyse supply vs. demand
By following the processes above, transparency regarding IT cost control is further extended and discussions around the value of IT services are made possible across the organization. A more holistic analysis of IT service consumption allows conclusions to be drawn promptly to enable the optimization of supply and demand for IT services in various business areas. This, in turn, will enable a more comprehensive value analysis and optimization of IT service utilization.
Following these 10 cost management steps, a secure, transparent, and sustainable IT cost control environment can be developed, resulting in fully optimized budgets and, in turn, significant cost savings. Cost-cutting aside, automating the financial management process in such an environment can boost productivity substantially, freeing up time to focus on valuable work and leading to overall business growth.
The business and economic landscape is full of uncertainty right now, but business leaders can regain control via cost management, not only to weather current storms but to set themselves up for success beyond today’s turbulence.
Mortgage digitalization: How mortgage lenders are automating the lending process
Source: Finance Derivative
By Fernando Zandona, Chief Product and Technology Officer at Mambu
The mortgage market has a long history, but its future is digital. As tech capabilities grow and consumer expectations evolve, mortgage providers are increasingly turning to digital solutions to attract and retain customers and streamline the lending process. According to research from the 2022 Celent Origination Study, over half of banks and 75% of building societies expect to make significant changes to their mortgage origination systems within 24 months. So, how is the mortgage industry transforming and what must lenders do to future-proof their business?
The acceleration of digitalisation in mortgage lending
There are several factors that have accelerated the digitalisation of mortgage lending. One is changes to consumer behaviour: customers have come to expect smooth digital experiences across all areas of their life (accelerated by the pandemic). As such, they seek similar ease, speed and efficiency when it comes to home buying.
Then there’s the arrival of fintechs. Newer fintechs are beginning to enter the mortgage sector – often through acquisitions, such as Starling Bank’s acquisition of Fleet Mortgages or Zoopla acquiring YourKeys. They are also bringing with them innovative digital solutions, which raise the bar for the whole industry. At the same time, regulatory changes are helping accelerate and facilitate digitalisation, such as the Bank of England’s decision to withdraw its affordability test recommendation and cut some of the red tape around mortgage lending, and HM Land Registry’s acceptance of electronic signatures. The combination of these forces has played a significant role in accelerating the lending process and making it more efficient.
Today’s financial institutions are offering a wide range of digital options, through online and mobile platforms, to their mortgage customers. Services include easier ways for customers to access and manage their mortgages, schedule a session with a mortgage advisor, find personalised recommendations, and access improved security measures to protect sensitive customer information.
Add to that the embrace of open banking, which has enabled seamless integration of customer data into the lending process. This innovation is helping reduce the number of steps needed to collect data, resulting in faster processing times, less rekeying of information, and lower origination costs. Offering faster, cheaper loan decisions is a crucial advantage in an increasingly crowded mortgage market, and automated processes reduce teams’ manual work and eliminate costly human errors.
Digitalising in the right way
The success of these new products and processes relies on the way mortgage lenders introduce and configure them. Agility is key – lenders need to prioritise configurability and scalability when building new products and choosing technology partners, as they must be able to quickly launch new features or make adjustments, in line with evolving customer expectations, emerging trends and changing industry regulations. The use of software-as-a-service (SaaS) platforms and application programming interface (API) integrations helps with this, allowing for faster feature launches and less internal friction.
APIs are just part of future-proofing the mortgage market. According to Forbes, 55% of senior executives in the US mortgage industry think that AI will make their firm, and the industry overall, more competitive. AI and machine learning can assist lenders in analysing data more quickly, leading to more efficient decision-making and forecasting, although as with all AI applications, providers must be vigilant about encoded bias that can radically increase discrimination.
The mortgage landscape is transforming through digitalisation, and this is bound to continue. Lenders who want to keep pace with this change – and reap the benefits of faster, smoother processes as well as satisfied, loyal customers – will be future-proofing their processes through lending automation and putting customer ease at the centre of their offering.