Technology
Goodbye Excel spreadsheets, hello performance management tools

By Harald Matzke, Executive Adviser at Serviceware Performance
Whether it’s the implementation of new software or the adoption of innovative technologies such as robotic process automation (RPA), the opportunities and challenges of digitalisation are manifold. Many companies struggle with the complexity of converting systems and processes: high costs, heavy investment effort, and a lack of time and resources. The project landscape is becoming increasingly confusing for many companies, which repeatedly leads to errors in project management: resources are planned twice, schedules and deadlines slip, budgets overrun or, in the worst case, the project fails completely. Unsuccessful IT projects are not uncommon, especially with large-scale undertakings such as digital transformation initiatives. Despite this, research from Citrix has revealed that three in four IT leaders (77%) see opportunities for success in past digital transformation failures.
IT projects fail – but why?
There are many reasons why IT projects fail, and errors occur both before and during the project phase. Often the scope of work was inadequately defined in advance. Companies also repeatedly underestimate the scale and impact that IT projects have on the entire business, and quite often they plan too little time, so that important targets are already difficult to meet at the outset.
So how should companies proceed? First of all, they should ask themselves two questions:
- Are we running the right projects?
- Is our project implementation result-oriented?
The intersection of these two core questions is the project portfolio, which maps both the projects already underway and those awaiting a decision. Good portfolio management should actively add and remove projects in order to achieve the intended transformation goal, and it is an important basis for comparing resource supply and demand and making both transparent for all stakeholders. In project scoring, defined criteria can be used to compare different project alternatives as objectively as possible. Especially when both hard and soft factors have to be taken into account, project scoring provides valuable support across a wide range of investment scenarios.
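Project scoring of this kind reduces to a weighted sum over defined criteria. The sketch below is a minimal illustration; the criteria, weights and per-project scores are invented assumptions, not values from any real portfolio tool:

```python
# Hypothetical weighted project-scoring sketch: the criteria, weights and
# per-project scores below are illustrative assumptions.
WEIGHTS = {"strategic_fit": 0.40, "expected_roi": 0.35, "risk": 0.25}

projects = {
    "CRM migration":  {"strategic_fit": 8, "expected_roi": 6, "risk": 7},
    "RPA pilot":      {"strategic_fit": 6, "expected_roi": 9, "risk": 5},
    "Data warehouse": {"strategic_fit": 9, "expected_roi": 5, "risk": 4},
}

def score(criteria):
    """Weighted sum of 1-10 criterion scores (higher is better)."""
    return sum(WEIGHTS[name] * value for name, value in criteria.items())

# Rank the project alternatives, best first.
ranking = sorted(projects, key=lambda p: score(projects[p]), reverse=True)
```

Hard facts (cost, ROI) and soft facts (strategic fit, risk appetite) sit on the same scale once scored, which is what makes the comparison across investment scenarios objective.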
The goal at the beginning of planning is to find a project or product portfolio that is as balanced as possible in terms of opportunities and risks whilst also promising long-term success. Project costs must also be calculated here. In addition to classic cost types such as personnel, travel or material costs, these include items of special significance in the project context, for example external consulting services. Project cost management covers planning, plan-versus-actual comparison and regular revision during the project, so that a clear picture of the costs incurred is available at all times. Adjustments only work if there is continuous and complete project reporting, which tracks the achievement of project goals to ensure the business is attaining the desired outcome.
The importance of keeping an optimal overview of projects
To get a clear overview of the status and development of projects, companies often use a number of different tools and applications such as Excel or PowerPoint. In principle, both are solid tools for calculating projects and creating reports. However, they quickly reach their limits as soon as the requirements increase. Modern tools are therefore essential, especially when managing complex IT project portfolios. If a tool from the performance management space is chosen, non-financial indicators can also be taken into account and serve as a basis for business decisions. Parameters such as “service level performance” not only indicate the pure cost aspects of a new project, but also take into account the scope and quality of the service provided.
But what should performance tools do in order to make the described planning steps more efficient? First of all, the most important requirement is integration into existing systems, making sure that it meets the needs and requirements of the company and the respective projects. Often, individual systems (product data management, enterprise resource planning or operational project management) already exist in the company and the data only needs to be merged and prepared.
The chosen solution should also provide a transparent view of the entire project portfolio in relation to the resource and capacity situation. Information should be stored “multidimensionally” (project view, organizational view, time, data types in forecast versions) and analyzed using standard reports and ad hoc evaluations. The forecast view also helps to simulate potential future portfolios and predict their impact on the future cost situation and resource utilization.
Furthermore, the tool should offer the possibility to develop business cases that can serve as a basis for comparison for later versions of the project. By filing them in a central database, the assumptions in the business case can be continuously refined over time and supplemented with facts; key figures such as net present value (NPV), payback period or internal rate of return (IRR) can then be calculated. Organizations should also take care not to use business cases only as an initial means of defining the project scope and evaluating economic viability, but to keep an eye on them on an ongoing basis. Unfortunately, experience shows that few organizations reopen the initial business case at milestones or, in particular, review the initial assumptions and objectives after the project has been completed. Doing so would make it possible to see when projects are no longer goal-oriented and might contribute more to success if they were stopped, freeing up the resources used for other projects and tasks.
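The key figures mentioned here all derive from a business case’s projected cash flows. A minimal sketch, with made-up cash flows and a simple bisection search for the IRR (not the implementation of any particular tool):

```python
# Illustrative business-case arithmetic; the cash flows are made-up
# assumptions (year 0 is the upfront investment, negative).
def npv(rate, cashflows):
    """Net present value of a cash-flow series at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """First year in which the cumulative cash flow turns non-negative."""
    total = 0.0
    for year, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return year
    return None  # investment never recovered within the horizon

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return, found by bisection on npv(rate) = 0.

    Assumes a conventional series (one sign change), so npv is
    monotonically decreasing in the rate over the bracket.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

case = [-100_000, 30_000, 40_000, 50_000, 20_000]
```

Recomputing these figures at each milestone, against actuals rather than the original assumptions, is exactly the ongoing review the paragraph above argues for.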
Two sides of the same coin: project and people
With performance management tools, a close link between business strategy and operational planning and budgeting can be achieved, bringing the transparency needed to react in time to rapidly changing developments. Besides all the technical possibilities that can be used to implement IT projects, however, the human factor must not be forgotten. Changes and transformation are usually unpopular because they often trigger concerns about being replaceable or having to give up privileges and routines. Involving the affected groups of people and communicating openly about the introduction of new software that will impact the company and its work is crucial. Managers should always deal honestly and openly with employees’ concerns and wishes and communicate changes promptly over the course of the project. Then nothing will stand in the way of project portfolio success!
Business
Enhancing cybersecurity in investment firms as new regulations come into force

Source: Finance Derivative
Christian Scott, COO/CISO at Gotham Security, an Abacus Group Company
The alternative investment industry is a prime target for cyber breaches. February’s ransomware attack on global financial software firm ION Group was a warning to the wider sector. Russia-linked LockBit Ransomware-as-a-Service (RaaS) affiliate hackers disrupted trading activities in international markets, with firms forced to fall back on expensive, inefficient, and potentially non-compliant manual reporting methods. Not only do attacks like these put critical business operations under threat, but firms also risk falling foul of regulations if they lack a sufficient incident response plan.
To ensure that firms protect client assets and keep pace with evolving challenges, the Securities and Exchange Commission (SEC) has proposed new cybersecurity requirements for registered advisors and funds. Codifying previous guidance into non-negotiable rules, these requirements will cover every aspect of the security lifecycle and the specific processes a firm implements, encompassing written policies and procedures, transparent governance records, and the timely disclosure of all material cybersecurity incidents to regulators and investors. Failure to comply with the rules could carry significant financial, legal, and national security implications.
The proposed SEC rules are expected to come into force in the coming months, following a notice and comment period. However, businesses should not drag their feet in making the necessary adjustments – the SEC has also introduced an extensive lookback period preceding the implementation of the rules, meaning that organisations should already be proving they are meeting these heightened demands.
For investment firms, regulatory developments such as these will help boost cyber resilience and client confidence in the safety of investments. However, with a clear expectation that firms should be well aligned to the requirements already, many will need to proactively step up their security oversight and strengthen their technologies, policies, end-user education, and incident response procedures. So, how can organisations prepare for enforcement and maintain compliance in a shifting regulatory landscape?
Changing demands
In today’s complex, fast-changing, and interconnected business environment, the alternative investment sector must continually take account of its evolving risk profile. Additionally, as organisations shift towards more distributed and flexible ways of working, traditional protection perimeters are dissolving, leaving firms more vulnerable to cyber-attack.
As such, the new SEC rules provide firms with additional instruction around very specific prescriptive requirements. Organisations need to implement and maintain robust written policies and procedures that closely align with ground-level security issues and industry best practices, such as the NIST Cybersecurity Framework. Firms must also be ready to gather and present evidence that proves they are following these watertight policies and procedures on a day-to-day basis. With much less room for ambiguity or assumption, the SEC will scrutinise security policies for detail on how a firm is dealing with cyber risks. Documentation must therefore include comprehensive coverage for business continuity planning and incident response.
As cyber risk management comes increasingly under the spotlight, firms need to ensure it is fully incorporated as a ‘business as usual’ process. This involves the continual tracking and categorisation of evolving vulnerabilities – not just from a technology perspective, but also from an administrative and physical standpoint. Regular risk assessments must include real-time threat and vulnerability management to detect, mitigate, and remediate cybersecurity risks.
Another crucial aspect of the new rules is the need to report any ‘material’ cybersecurity incidents to investors and regulators within a 48-hour timeframe – a small window for busy investment firms. Meeting this tight deadline will require firms to quickly pull data from many different sources, as the SEC will demand to know what happened, how the incident was addressed, and its specific impacts. Teams will need to be assembled well in advance, working together seamlessly to record, process, summarise, and report key information in a squeezed timeframe.
Funds and advisors will also need to provide prospective and current investors with updated disclosures on previously disclosed cybersecurity incidents over the past two fiscal years. With security leaders increasingly being held to account over lack of disclosure, failure to report incidents at board level could even be considered an act of fraud.
Keeping pace
Organisations must now take proactive steps to prepare and respond effectively to these upcoming regulatory changes. Cybersecurity policies, incident response, and continuity plans need to be written up and closely aligned with business objectives. These policies and procedures should be backed up with robust evidence that shows organisations are actually following the documentation – firms need to prove it, not just say it. Carefully thought-out policies will also provide the foundation for organisations to evolve their posture as cyber threats escalate and regulatory demands change.
Robust cybersecurity risk assessments and continuous vulnerability management must also be in place. The first stage of mitigating a cyber risk is understanding the threat – and this requires in-depth real-time insights on how the attack surface is changing. Internal and external systems should be regularly scanned, and firms must integrate third-party and vendor risk assessments to identify any potential supply chain weaknesses.
Network and cloud penetration testing is another key tenet of compliance. By imitating how an attacker would exploit a vantage point, organisations can check for any weak spots in their strategy before malicious actors attempt to gain an advantage. Due to the rise of ransomware, phishing, and other sophisticated cyber threats, social engineering testing should be conducted alongside conventional penetration testing to cover every attack vector.
It must also be remembered that security and compliance are the responsibility of every person in the organisation. End-user education is a necessity as regulations evolve, as are multi-layered training exercises. This means bringing in immersive simulations, tabletop exercises and real-world examples of security incidents to inform employees of the potential risks and the role they play in protecting the company.
To successfully navigate the SEC cybersecurity rules – and prepare for future regulatory changes – alternative investment firms must ensure that security is woven into every part of the business. They can do this by establishing robust written policies and adhering to them, conducting regular penetration testing and vulnerability scanning, and ensuring the ongoing education and training of employees.
Business
Gearing up for growth amid economic pressure: 10 top tips for maintaining control of IT costs

Source: Finance Derivative
By Dirk Martin, CEO and Founder of Serviceware
Three years on from the pandemic, economic pressure continues to mount. With the ongoing threat of a global recession looming, inflation rising, and supply chain disruption continuing to take its toll, cutting costs and optimizing budgets remains a top priority for the C-suite. Amid such turbulence, the Chief Financial Officer (CFO) and Chief Information Officer (CIO) stand firmly at the business’s helm, not only to steady the ship but to steer it into safer, more profitable waters. These vital roles have been pulled into the spotlight in recent years, with new hurdles and challenges constantly being thrown their way. This spring, for example, experts expect British businesses to face an energy-cost cliff edge as the winter support package set out by the government is replaced.
Whilst purse strings are being drawn ever tighter to overcome these obstacles, there is no denying that the digitalization and innovation spurred on by the pandemic are still gaining momentum. In fact, according to Gartner, four out of five CEOs are increasing digital technology investments to counter current economic pressures. Investing in a digital future, driven by technologies such as the cloud, artificial intelligence (AI), blockchain and the Internet of Things (IoT), however, comes at a cost, and to afford it, funds must be released through effective optimization of existing assets.
With that in mind, and with the deluge of cost and vendor data descending on businesses who adopt these technologies, never has it been more important for CIOs and CFOs to have a complete, detailed and transparent view of all IT costs. In doing so, business leaders can not only identify the right investment areas but increase the performance of existing systems and technology to tackle the impact of spiralling running costs.
Follow the below 10 steps to gain a comprehensive, detailed and transparent overview of all IT costs to boost business performance and enable your IT to reach the next level.
1: Develop an extensive IT service and product catalogue
The development of an IT service and product catalogue is the most effective way to kick-start your cost-optimization journey. This catalogue should act as a precise overview of all individual IT services and what they entail to directly link IT service costs to IT service performance and value. By offering a clear set of standards as to what services are available and comprised of, consumers can gain an understanding of the costs and values of the IT services they deploy.
2: Monitor IT costs closely
By mastering the value chain, a concept that visualises the flow of IT costs from their most basic singular units through to realised business units and capabilities, businesses can keep track of where IT costs stem from. With the help of service catalogues, benchmarks and a cost model focused on digital value in IT Financial Management (ITFM), often referred to as Technology Business Management (TBM), solutions can guarantee comprehensive access to this data, creating a ‘cost-to-service flow’ that identifies and controls the availability of IT costs.
3: Determine IT budget management
Knowledge of IT cost allocation is a vital factor when making informed spending decisions and adjustments to existing budgets. There are, however, different approaches to this, including centralized, decentralized and iterative. A centralized approach means that the budget is determined in advance and distributed to operating cost centres and projects in a top-down process, allowing for easy, tight budget allocation. A decentralized approach reverses this: operating costs are precisely calculated before budgets and projects are determined. Both approaches come with their own risks: a centralized process may overlook projects that offer potential growth opportunities, while a decentralized process may produce budget demands that exceed available resources.
The iterative approach tries to unify both methods. Although the most lucrative approach, it also requires the most resources. So, the chosen approach is very much dependent on the available resources, and the enterprise’s structural organization.
4: Defining ‘run’ vs ‘grow’ costs
Before IT budget can be allocated, costs should be split into two distinct categories: running costs (i.e. operating costs) and costs for growing the business (i.e. products or services used to transform or grow the business). Once these categories have been defined, decisions should be made on how the budget should be split between them. A 70% run/30% grow split is fairly typical across most enterprises, but there is no one-size-fits-all approach, and this decision should be centred around the businesses’ overall strategies and end goals.
5: Ensuring investments result in a profit
By carrying out the aforementioned steps, complete transparency can be achieved over which products and services are offered, where IT costs stem from, and where budgets are allocated. From here, organizations can review how much of the IT budget is being used and where costs lead to profits and losses. If the profit margin is positive, the controlling processes can be further optimized; if it is negative, appropriate and timely corrective measures can be initiated.
6: Staying on top of regulation
For a company that operates internationally (e.g. it markets IT products and services abroad), it is extremely important to stay on top of country-specific compliance and adhere to varying international tax rules. Doing so correctly requires correct transfer price documentation, which depends on three factors:
- Transparent analysis and calculation of IT services based on the value chain
- Evaluation of the services used and the associated billing processes
- Access to the management of service contracts between providers and consumers as the legal basis for IT services.
7: Stay competitive
Closely linked to the profit mentioned in step five is the question of how to price IT services in order to stay competitive whilst avoiding losses. This begins with benchmark data which can be researched or determined using existing ITFM solutions that can automatically extract them from different – interconnected – databases. From there, a unit cost calculation can be used to define exactly and effectively what individual IT services – and their preliminary products – cost. This allows organizations to easily compare internal unit cost calculations with the benchmarks and competitor prices, before making pricing decisions.
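The unit cost calculation described here is essentially a service’s total cost divided by the units delivered, set against an external benchmark. A hedged sketch; all services and figures below are invented examples, not benchmarks from any ITFM solution:

```python
# Hedged unit-cost comparison; all figures are invented examples, not
# benchmarks extracted from any ITFM solution.
def unit_cost(total_cost, units):
    """Cost per delivered unit of an IT service."""
    return total_cost / units

services = {
    # service: (annual cost, units delivered, external benchmark per unit)
    "Virtual server":      (480_000, 800, 650.0),
    "Service desk ticket": (900_000, 60_000, 14.0),
}

for name, (cost, units, benchmark) in services.items():
    internal = unit_cost(cost, units)
    delta = (internal - benchmark) / benchmark * 100
    print(f"{name}: {internal:.2f} vs benchmark {benchmark:.2f} ({delta:+.1f}%)")
```

A positive delta flags a service priced above the market, a candidate for cost reduction or repricing before it erodes competitiveness.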
8: Identify and maintain key cost drivers
Another aspect of IT cost control that is streamlined via the comprehensive assessment of the cost-to-service flow is the identification and management of main IT cost drivers. A properly modelled value chain makes it clear which IT services or associated preliminary products and cost centres incur the greatest costs and why. This analysis allows for concise adjustment to expenditure and helps to avoid misunderstandings about cost drivers. Using this as a basis, strategies can be developed to reduce IT costs effectively and determine a better use of expensive resources.
9: Showback/Chargeback IT costs
By controlling IT costs using the value chain, efficient usage-based billing and invoicing of IT services and products can be achieved. If IT costs are visualized transparently, they can easily be assigned to IT customers, therefore increasing the clarity of the billing process, and providing opportunities to analyze the value of IT in more detail. When informing managers and users about their consumption there are two options: either through the ‘showback’ process – highlighting the costs generated and how they are incurred – or through the ‘chargeback’ process, in which costs incurred are sent directly to customers and subcontractors.
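The chargeback mechanism amounts to allocating a shared cost pool in proportion to measured consumption. A minimal sketch, with hypothetical business units and usage figures:

```python
# Minimal showback/chargeback sketch: a shared cost pool is allocated to
# consuming units in proportion to measured usage. The business units and
# figures are illustrative assumptions.
def chargeback(total_cost, usage):
    """Allocate total_cost proportionally to each consumer's usage."""
    total_usage = sum(usage.values())
    return {unit: total_cost * used / total_usage
            for unit, used in usage.items()}

storage_usage_tb = {"Finance": 40, "Sales": 25, "Operations": 35}
bill = chargeback(120_000, storage_usage_tb)  # hypothetical annual storage cost
```

In a showback process the resulting figures are merely reported to each unit; in a chargeback process the same figures are invoiced, so the allocation logic is identical either way.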
10: Analyse supply vs. demand
By following the processes above, transparency regarding IT cost control is further extended and discussions around the value of IT services are made possible across the organization. A more holistic analysis of IT service consumption allows conclusions to be drawn promptly to enable the optimization of supply and demand for IT services in various business areas. This, in turn, will enable a more comprehensive value analysis and optimization of IT service utilization.
By following these 10 cost management steps, a secure, transparent, and sustainable IT cost control environment can be developed, resulting in fully optimized budgets and, in turn, significant cost savings. Cost-cutting aside, automating the financial management process in such an environment can boost productivity substantially, freeing up time to focus on valuable work and leading to overall business growth.
The business and economic landscape is full of uncertainty right now, but business leaders can regain control via cost management, not only to weather current storms but to set themselves up for success beyond today’s turbulence.
Business
Mortgage digitalization: How mortgage lenders are automating the lending process

Source: Finance Derivative
By Fernando Zandona, Chief Product and Technology Officer at Mambu
The mortgage market has a long history, but its future is digital. As tech capabilities grow and consumer expectations evolve, mortgage providers are increasingly turning to digital solutions to attract and retain customers and streamline the lending process. According to research from the 2022 Celent Origination Study, over half of banks and 75% of building societies expect to make significant changes to their mortgage origination systems within 24 months. So, how is the mortgage industry transforming and what must lenders do to future-proof their business?
The acceleration of digitalisation in mortgage lending
There are several factors that have accelerated the digitalisation of mortgage lending. One is changes to consumer behaviour: customers have come to expect smooth digital experiences across all areas of their life (accelerated by the pandemic). As such, they seek similar ease, speed and efficiency when it comes to home buying.
Then there’s the arrival of fintechs. Newer fintechs are beginning to enter the mortgage sector – often through acquisitions, such as Starling Bank’s acquisition of Fleet Mortgage or Zoopla acquiring YourKeys. They are also bringing with them innovative digital solutions, which raise the bar for the whole industry. At the same time, regulatory changes are helping accelerate and facilitate digitalisation, such as the Bank of England’s decision to withdraw its affordability test recommendation and cut some of the red tape around mortgage lending, and HM Land Registry’s acceptance of electronic signatures. The combination of these forces have played a significant role in accelerating the lending process and making it more efficient.
Today’s financial institutions are offering a wide range of digital options, through online and mobile platforms, to their mortgage customers. Services include easier ways for customers to access and manage their mortgages, schedule a session with a mortgage advisor, find personalised recommendations, and access improved security measures to protect sensitive customer information.
Then there is open banking, whose embrace has enabled the seamless integration of customer data into the lending process. This innovation is helping to reduce the number of steps needed to collect data, resulting in faster processing times, less rekeying of information and lower origination costs. Offering faster, cheaper loan decisions is a crucial advantage in an increasingly crowded mortgage market, and automated processes reduce teams’ manual work and eliminate costly human errors.
Digitalising in the right way
The success of these new products and processes relies on the way mortgage lenders introduce and configure them. Agility is key – lenders need to prioritise configurability and scalability when building new products and choosing technology partners, as they must be able to quickly launch new features or make adjustments, in line with evolving customer expectations, emerging trends and changing industry regulations. The use of software-as-a-service (SaaS) platforms and application programming interface (API) integrations helps with this, allowing for faster feature launches and less internal friction.
APIs are just part of future-proofing the mortgage market. According to Forbes, 55% of senior executives in the US mortgage industry think that AI will make their firm, and the industry overall, more competitive. AI and machine learning can assist lenders in analysing data more quickly, leading to more efficient decision-making and forecasting, although as with all AI applications, providers must be vigilant about encoded bias that can radically increase discrimination.
The mortgage landscape is transforming through digitalisation, and this is bound to continue. Lenders who want to keep pace with this change – and reap the benefits of faster, smoother processes as well as satisfied, loyal customers – will be future-proofing their operations through lending automation and putting customer ease at the centre of their offering.