SYDNEY/BOSTON, Oct 11 (Reuters) – As major companies look at drastic ways to cut carbon emissions from corporate travel, airlines are bracing for a major hit to business-class travel, a key revenue driver, industry executives and experts say.
Some are considering a “carbon budget” as they come under growing pressure from environmental advocates and investors to reduce indirect emissions that contribute to climate change.
Flights account for about 90% of business travel emissions, making them the lowest-hanging fruit for companies setting reduction targets.
The airline industry last week committed at a meeting in Boston to reach “net zero” emissions by 2050, decades later than the emissions targets many companies have set for their corporate travel.
“It’s going to be hard on airlines and they’re going to need to adapt,” said Kit Brennan, co-founder of London-based Thrust Carbon, which is advising S&P and other clients on setting up carbon budgets.
“I think what we’re going to see, funnily enough, is more of an unbundling of business class where you might get all the perks of business class without the seat,” he said, referring to airport lounges and nicer meals. “Because ultimately it all comes down to the area on the aircraft that it takes up.”
Flying business class emits about three times as much carbon as economy class because the seats take up more room and more of them are empty, according to a World Bank study.
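The arithmetic behind that multiple can be sketched in a few lines. Assuming, purely for illustration, that a flight’s emissions are allocated to passengers in proportion to cabin floor area and how full each cabin flies (the numbers here are hypothetical, not the World Bank’s):

```python
# Illustrative allocation of flight emissions by cabin floor area and load
# factor (hypothetical numbers, not the World Bank methodology).
def cabin_multiplier(area_per_seat: float, load_factor: float,
                     base_area: float = 1.0, base_load: float = 0.85) -> float:
    """Emissions multiplier relative to an economy seat of unit floor area
    in a cabin flying at the baseline load factor."""
    return (area_per_seat / base_area) * (base_load / load_factor)

# A business seat with ~2x the floor area in a cabin flying ~60% full:
business = cabin_multiplier(area_per_seat=2.0, load_factor=0.60)
```

With roughly double the floor area and an emptier cabin, the multiplier lands near the threefold figure cited above.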
CHANGE ALREADY UNDER WAY
Pre-pandemic, about 5% of international passengers globally flew in premium classes, accounting for 30% of international revenue, according to airline group IATA.
The pandemic-related drop in travel and a switch to more virtual meetings have led many companies to save money by resetting travel policies.
Sam Israelit, chief sustainability officer at consulting firm Bain, said his company was evaluating carbon budgets for offices or practice areas to help cut travel emissions per employee by 35% over the next five years.
“I think more broadly, it’s something that companies really will need to start to do if they’re going to be successful in meeting the aggressive targets that everyone’s putting out,” he said.
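As a back-of-envelope check on what such a target implies (a sketch of the arithmetic only, not Bain’s methodology), a 35% cut over five years corresponds to a compound annual reduction of roughly 8%:

```python
# A 35% reduction over 5 years, compounded: solve (1 - r)^5 = 0.65 for r.
target_cut = 0.35
years = 5
annual = 1 - (1 - target_cut) ** (1 / years)  # compound annual reduction rate
```

In other words, per-employee travel emissions would need to fall by a little over 8% every year to stay on track.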
Companies and corporate travel agencies are also investing heavily in tools to measure flight emissions based on factors such as the type of plane, the routing and the class of service.
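A minimal sketch of what such a tool computes: per-passenger emissions as a share of the aircraft’s total burn over a route, weighted by class of service. Every number below (aircraft factors, class weights, seat counts) is a made-up placeholder for illustration, not a published coefficient.

```python
# Hypothetical flight-emissions estimator of the kind described above.
# All factors are illustrative placeholders, not published coefficients.
AIRCRAFT_KG_CO2_PER_KM = {"A320neo": 68.0, "B777-300ER": 190.0}  # whole aircraft
CLASS_WEIGHT = {"economy": 1.0, "premium_economy": 1.5, "business": 3.0}

def estimate_kg_co2(aircraft: str, distance_km: float, cabin: str,
                    seats: int, load_factor: float = 0.8) -> float:
    """Per-passenger estimate: share of total burn, weighted by class."""
    total = AIRCRAFT_KG_CO2_PER_KM[aircraft] * distance_km
    per_passenger = total / (seats * load_factor)
    return per_passenger * CLASS_WEIGHT[cabin]

# Same aircraft and routing, two classes of service:
eco = estimate_kg_co2("B777-300ER", 5500, "economy", seats=360)
biz = estimate_kg_co2("B777-300ER", 5500, "business", seats=360)
```

Real tools refine factors like these with published datasets, actual load factors and routing detail, but the shape of the calculation is the same.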
“We’re not seeing a lot of companies take a very draconian approach like simply cut travel because that impacts their bottom line,” said Nora Lovell Marchant, vice president of sustainability at American Express Global Business Travel. “But we are seeing an increased ask for transparency so those travellers can make decisions.”
Global ratings agency S&P, which plans to reduce travel emissions by 25% by 2025, found that 42% of its business class use was for internal meetings, its global corporate travel leader, Ann Dery, said at a CAPA Centre for Aviation event last month.
AIRLINES GOING GREEN
U.S. carrier JetBlue (JBLU.O) plans for about 30% of its jet fuel for flights in and out of New York to be sustainable within two to three years.
“Businesses, of course, are going to want to address this climate change issue aggressively,” JetBlue Chief Executive Robin Hayes said on the sidelines of the Boston meeting. “But we think they’re going to be able to do it in a way that still enables business travel to take place.”
The emissions target airlines set last week relies on boosting use of sustainable aviation fuel from less than 0.1% today to 65% by 2050 as well as new engine technologies.
“If we are getting to net zero carbon emissions by 2050 everybody has got to play their part here,” said Air New Zealand (AIR.NZ) Chief Executive Greg Foran. “It is not just the airlines. It is going to be fuel providers, it is going to be governments. And ultimately customers are going to have to buy into this as well.”
Reporting by Jamie Freed in Sydney and Rajesh Kumar Singh in Boston; Editing by Miyoung Kim and Gerry Doyle
How to improve the accuracy of your ESG reporting
Source: Finance Derivative
Rajesh Gharpure, Global Head – ESG, Larsen & Toubro Infotech (LTI)
ESG initiatives have become increasingly significant in the business world, with organisations integrating sustainability into their core business strategy and using them as drivers to strengthen resiliency and create long-term value. As a result of this elevated focus, investors now require greater transparency into ESG performance. This means organisations need to make public commitments towards sustainability and provide robust, relevant, and routine updates to their strategies, goals and metrics. They also need to collate accurate ESG data which is critical for making the right capital allocation and investment decisions, and for investors to understand their investment risk and value preservation.
But organisations often struggle to report ESG data effectively: more than half of organisational leaders today report challenges with data availability and data quality. With upcoming regulatory changes such as the European Commission’s Sustainable Finance Disclosure Regulation (SFDR), investment firms need organisations to provide the right data for accurate reporting to support the green claims in their ESG-labelled investment funds.
While organisations focus on robust and accurate ESG reporting, it’s not just about trying to report everything, but more about knowing what should be reported. This can be achieved through the focused implementation of the Selection – Innovation – Assurance (S-I-A) approach:
It’s important for organisations to continually reassess their ESG journey and establish the correct boundaries through a precise selection process. Selection criteria should categorically define the areas for reporting. A well-planned and executed materiality assessment can help identify the areas most impactful for the business as well as its stakeholders. Drawing on the 17 UN Sustainable Development Goals, alignment with organisational objectives, jurisdiction, peer benchmarking and geography-specific requirements, organisations should limit their reporting purview to the most significant topics. A structured, programmatic approach ensures reliable choices are made about organisational boundaries, programmes and KPIs, all of which are crucial for long-term sustainability – while taking into account the bandwidth and resources needed for proper governance and accurate reporting.
Reporting as a means to precisely monitor and govern creates a multitude of challenges, as data needs are continually expanding. Large organisations grow both organically and inorganically and, in the process, end up with a wide digital landscape, most of it focussed on primary operational needs rather than an integrated digital fabric. Organisations often struggle to source the right data because much of the non-operational information is recorded and maintained manually in Excel spreadsheets. It is also a question of how to track all the relevant data and compile it in a way that is meaningful and ingestible for generating an ESG report. As a result, a typical ESG reporting cycle for a mid-to-large organisation can take four to five months to complete, significantly delaying its ability to monitor and adjust course. The myriad frameworks and reporting programmes used globally only multiply the problem. Manual data collection also carries the risk of errors and inaccuracies, and makes it difficult to scale ESG initiatives and examine transactional data for disclosure purposes.
Digital intervention can systemically reduce this timeline, enabling organisations to quickly adapt and evolve their sustainability programmes and metrics. Using digital tools to capture data and perform the extract, transform, load (ETL) work before reporting serves the triple purpose of saving time and effort, validating data and improving reporting accuracy. A productised IT and data approach focussed on ESG data can help catalogue data from organisational systems and subsystems into the formats needed for reporting and disclosures, reducing the risk of poor data quality. IoT technologies can track and measure the performance of each individual asset in an operation, and data from these platforms can be pulled directly into such catalogues for ESG reporting and auditing.
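To make the ETL step concrete, here is a minimal sketch of cataloguing raw ESG rows from mixed sources (spreadsheet exports, meter readings) into one validated format; the field names and units are assumptions for the example:

```python
# Minimal ESG ETL sketch: validate and normalise heterogeneous raw rows
# into one catalogue ready for reporting. Field names/units are assumed.
from dataclasses import dataclass

@dataclass
class EmissionRecord:
    site: str
    period: str        # e.g. "2023-Q1"
    scope: int         # 1, 2 or 3
    tonnes_co2e: float

def transform(raw: dict) -> EmissionRecord:
    """Validate one raw row; convert kilograms to tonnes if needed."""
    value = float(raw["value"])
    if raw.get("unit", "t").lower() == "kg":
        value /= 1000.0
    scope = int(raw["scope"])
    if scope not in (1, 2, 3):
        raise ValueError(f"invalid scope: {scope}")
    return EmissionRecord(raw["site"], raw["period"], scope, value)

def load(raws: list[dict]) -> list[EmissionRecord]:
    return [transform(r) for r in raws]

rows = [
    {"site": "Plant-A", "period": "2023-Q1", "scope": 1, "value": "1250", "unit": "kg"},
    {"site": "HQ", "period": "2023-Q1", "scope": 2, "value": "3.4", "unit": "t"},
]
catalogue = load(rows)
```

The point of the validation step is that errors surface at ingestion time, months before an auditor would find them in the report.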
Assurance (internal, followed by external) provides the evidence to support the accuracy of an organisation’s data and adds the element of truth and trust to its ESG report. A 2021 survey by the International Federation of Accountants (IFAC) showed that only 51% of organisations that shared ESG information provided assurance with their disclosures, and most of those were limited to certain facets of the total report. An ESG report validated by a recognised auditing organisation and verified against an accredited standard, however, provides an impartial overview of performance and compares findings against best practice. This ensures compliance with policies, gives confidence and security to investors, and reduces the fear of greenwashing. International Standard on Assurance Engagements 3000 (ISAE 3000) and AccountAbility’s AA1000 Assurance Standard are the dominant standards in the ESG sphere.
However, before seeking external assurance, leadership needs to review the quality and coverage of the data being reported. An internal review adds value by helping to establish a functional ESG control environment and by testing the effectiveness of ESG risk assessments and controls. It also helps level up an organisation’s performance in sustainability-related ratings and benchmarks.
Accurate reporting supports decision-making by investors and taxpayers at one end, and leadership and governing bodies at the other. Organisations should apply the same rigour and checks to ESG reporting as they do to financial reporting. This can increase stakeholder and investor confidence, business value and the effectiveness of capital markets.
Solving the Future of Decarbonisation in Real-Time
Source: Finance Derivative
Jamil Ahmed, Distinguished Engineer at Solace
The energy sector has faced many disruptions and challenges in recent years, from pipeline disruption to the growing demand for hydrogen. However, the most significant of all of these is the global desire to decarbonise. The growing concern over fossil fuels has created intense pressure for businesses to transition towards renewable energy sources and cut carbon emissions. Governing bodies have begun to impose regulations on organisations to force them to cut emissions by 3.4 gigatons of carbon dioxide equivalent (GtCO2e) a year by 2050, which amounts to a 90 per cent reduction in current emissions.
The constant development of markets and digital transformation will only increase demand for energy across all industries. Reducing emissions is therefore no small feat, however ambitious the targets may be. To make decarbonisation a reality in the near term, businesses must adopt an inward-looking strategy and reduce emissions from their own operations. These are termed Scope 1 emissions: those released as a direct result of an organisation’s own current operations. Achieving this requires companies to streamline their operations and improve internal visibility so they can measure and track energy consumption.
The major challenge companies face in accurately measuring their energy consumption lies in overcoming the mass of siloed data within their systems. These data silos not only diminish productivity but also bury useful insights in a mountain of data that is hard to identify and analyse. Ultimately, data silos are the product of organisational infrastructure built for a previous era – one with limited technological adoption and limited pathways for data flows – and over time they have created complex organisational barriers.
This lack of data transparency in organisational infrastructure severely undermines businesses’ ability to gain insight from their existing data, and hampers their ability to share data with external partners in search of meaningful decarbonisation solutions. The value of data sharing cannot be overstated when searching for innovative solutions: a recent study shows that 45% of businesses in the energy sector see analytics and innovation as critical tools. With the entire sector’s ability to decarbonise effectively hinging on data sharing to drive innovation, gaining greater data insight is non-negotiable.
Another major consideration in decarbonisation is planning for power reliability when transitioning to renewable energy sources. Solar and wind rely on changeable weather for operability, and the varying levels of power readiness of these sources make them difficult to integrate into the national grid. Reliability planning is therefore an increasingly complex and important part of the decarbonisation journey, as the sector must test for long-term stability and the potential for energy transfers and storage. A solution is needed that can address these concerns in real time.
Reliability in Real-time
Real-time data is information delivered immediately after it is collected, enabling businesses to respond at speed. It has a host of uses in the energy sector, from flagging major weather changes that may affect power reliability to detecting overheating or electrical wastage in appliances. Each such information transfer is known as an ‘event’ that requires further action or response.
Real-time capabilities play a major role in overcoming the sector’s data transparency issues: by connecting interactions across systems and processes, they can enable energy providers to identify opportunities to reduce energy wastage.
Enter event-driven architecture (EDA), the structure that underpins an organisation’s ability to view the series of events occurring in its systems. EDA decouples events from the systems that produce them so they can be processed and delivered in real time as a useful information resource, which resource companies can then analyse to optimise decarbonisation initiatives.
The strength of EDA is its scalable integration platform, as this allows companies to manage enormous quantities of data traffic coming from multiple data streams and energy sources. From this, energy companies can develop durable systems by aggregating information. This can then be sent to control systems to identify power outages or extreme weather events and conditions.
To achieve this, an architectural layer known as an event mesh is required. An event mesh enables EDA to break down data silos and facilitates the real-time integration of people, processes and systems across geographical boundaries. Implementing an event mesh also upgrades and streamlines existing systems and processes, enabling better transparency in real-time data sharing. Given the benefits of EDA in scalability, durability and agility, it is unsurprising that a recent study found 85% of organisations surveyed view EDA as a critical component of their digital transformation efforts.
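The publish/subscribe pattern underlying EDA can be sketched in a few lines: producers publish events to topics, and a broker (standing in here for the event mesh) fans them out to decoupled subscribers. Topic names and payloads are illustrative assumptions:

```python
# Toy event-driven sketch: the broker decouples producers from consumers,
# so neither side knows about the other. Topics/payloads are illustrative.
from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Producers never call consumers directly; the broker fans out.
        for handler in self._subs[topic]:
            handler(event)

alerts: list[dict] = []
broker = Broker()
# A subscriber that captures high-wind events affecting grid reliability:
broker.subscribe("grid/weather",
                 lambda e: alerts.append(e) if e["wind_ms"] > 25 else None)
broker.publish("grid/weather", {"site": "offshore-1", "wind_ms": 31})
broker.publish("grid/weather", {"site": "offshore-2", "wind_ms": 8})
```

Because subscribers are registered by topic rather than wired to producers, new consumers (analytics, control systems, audit) can be added without touching the systems emitting the events.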
Decarbonising for the future
Regulations on the energy sector are rapidly increasing: most recently, the US Senate passed the Inflation Reduction Act (IRA) on August 7th of this year. The Act signals the intense pressure on the energy sector to undertake significant decarbonisation initiatives immediately, and is designed to accelerate the production of greener, more renewable energy sources such as wind and solar. Once nations like the US ramp up production of the technology that can harness these sources, others will follow suit. Large-scale adoption of renewable energy will only happen if businesses build real-time capabilities and become event-driven; only then can the transition to decarbonisation, and net zero, become a reality.
A zero trust environment is critical for financial services
Source: Finance Derivative
Boris Bialek, Managing Director of Industry Solutions at MongoDB
Not long ago, security professionals were still protecting their IT much as medieval guards protected a walled city: concentrating on making it as difficult as possible to get inside. Once past this perimeter, though, access to what was within was endless. For financial services, that means access to everything from personally identifiable information (PII) – credit card numbers, names, social security information – to other ‘marketable data’. Unfortunately, there are many examples of how this type of security fails: the castle gets stormed and the data isn’t protected. The most famous is still the Equifax incident, where a single breach led to years of unhappy customers.
Thankfully, that mindset has shifted, spurred on by the proliferation of networks and applications across geographies, devices and cloud platforms, which has made classic point-to-point security obsolete. The perimeter has changed and become fluid, so reliance on a wall for protection also has to change.
Zero trust presents a new paradigm for cybersecurity. It assumes the perimeter is already breached, no users are trusted, and trust cannot be gained simply through physical or network location. Every user, device and connection must be continually verified and audited.
It might seem obvious, but it bears repeating: with the amount of confidential customer and client data that financial institutions hold – not to mention the regulations they face – zero trust should be an even bigger priority for them. The perceived value of this data also makes financial services organisations a primary target for data breaches.
But how do you create a zero trust environment?
Keeping the data secure
While securing access to banking apps and online services is vital, it is actually the database at the backend of these applications that is a key part of creating a zero trust environment. The database contains much of an organisation’s sensitive and regulated information, as well as data that may not be sensitive but is critical to keeping the organisation running. That is why it is imperative that the database is ready and able to work in a zero trust environment.
As more databases become cloud-based services, a big part of this is ensuring that the database is secure by default – that is, secure out of the box. This takes some of the responsibility for security out of administrators’ hands, because the highest levels of security are in place from the start without requiring attention from users or administrators. To allow access, users and administrators must proactively make changes – nothing is granted automatically.
As more financial institutions embrace the cloud, this can get more complicated. Security responsibilities are divided between the client’s own organisation, the cloud providers and the vendors of the cloud services being used – the shared responsibility model. This moves away from the classic model in which IT hardens the servers, then the software on top – say, the database software – and then the actual application code. In the shared model, the hardware (CPU, network, storage) is solely the realm of the cloud provider that provisions those systems. The service provider in a data-as-a-service model then delivers the database to the client hardened, with a designated endpoint. Only then do the client’s own application developers and DevOps team come into play for the actual “solution”.
Security and resilience in the cloud are only possible when everyone is clear on their roles and responsibilities. Shared responsibility recognizes that cloud vendors ensure that their products are secure by default, while still available, but also that organisations take appropriate steps to continue to protect the data they keep in the cloud.
In banks and finance organisations there is always a lot of focus on customer authentication – making sure that access to funds is as secure as possible. But it is just as important to secure access to the database at the other end. An IT organisation can use any number of methods to let users authenticate themselves to a database, most often a username and password. Given financial services organisations’ heightened need to keep confidential customer information private, though, this should be viewed only as a base layer.
At the database layer, it is important to have transport layer security (TLS) and SCRAM authentication, which together ensure that traffic from clients to the database is authenticated and encrypted in transit.
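As a sketch of what this looks like in practice, the snippet below builds a connection string in the MongoDB URI format that requests TLS and SCRAM-SHA-256. The host, user and database names are placeholders; in a real deployment the URI would simply be handed to the driver (e.g. pymongo’s `MongoClient`).

```python
# Sketch: a connection URI that enforces encrypted, authenticated access.
# Host/user/db are placeholders; options follow the MongoDB URI format.
from urllib.parse import quote_plus

def secure_uri(user: str, password: str, host: str, db: str) -> str:
    return (
        f"mongodb://{quote_plus(user)}:{quote_plus(password)}@{host}/{db}"
        "?tls=true"                      # encrypt traffic in transit
        "&authMechanism=SCRAM-SHA-256"   # challenge-response authentication
    )

uri = secure_uri("svc_payments", "s3cret/!", "db.internal.example:27017", "ledger")
```

Note that `quote_plus` percent-encodes special characters, so credentials never appear verbatim in the URI.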
Passwordless authentication is also something to consider – not just for customers, but for internal teams as well. With the database this can be done in multiple ways: auto-generated certificates required to access the database, or advanced options for organisations that already use X.509 certificates and have a certificate management infrastructure.
Tracking is a key component
In such a highly regulated industry, it is also important to monitor your zero trust environment to ensure that it remains in force and encompasses your database. The database should be able to log all actions, or have functionality to apply filters that capture only specific events, users or roles.
Role-based auditing lets you log and report activities by specific roles, such as userAdmin or dbAdmin, coupled with any roles inherited by each user, rather than having to extract activity for each individual administrator. This approach makes it easier for organisations to enforce end-to-end operational control and maintain the insight necessary for compliance and reporting.
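A toy sketch of what role-based filtering means in practice – capture events by role rather than by individual user (the event shapes here are assumptions; `userAdmin` and `dbAdmin` are the roles named above):

```python
# Role-based audit filter sketch: keep events performed under audited roles,
# rather than enumerating every individual administrator. Shapes are assumed.
AUDITED_ROLES = {"userAdmin", "dbAdmin"}

def audit_filter(events: list[dict]) -> list[dict]:
    """Keep events where any of the actor's roles is in the audited set."""
    return [e for e in events if AUDITED_ROLES & set(e.get("roles", []))]

log = [
    {"user": "alice", "roles": ["dbAdmin"], "action": "dropCollection"},
    {"user": "bob", "roles": ["readWrite"], "action": "find"},
]
captured = audit_filter(log)
```

Because the filter is keyed on roles, a new administrator inherits auditing automatically the moment the role is granted – no per-user configuration.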
Next level encryption
With large amounts of valuable data, financial institutions also need to make sure that they are embracing encryption – in flight, at rest and even in use. Securing data with client-side field-level encryption allows you to move to managed services in the cloud with greater confidence. The database only works with encrypted fields and organisations control their own encryption keys, rather than having the database provider manage them. This additional layer of security enforces an even more fine-grained separation of duties between those who use the database and those who administer and manage it.
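The pattern can be sketched as follows: sensitive fields are encrypted with a key the organisation holds before the document ever leaves the client, so the database only stores ciphertext. The toy keystream cipher below is for illustration only – a real deployment would use the driver’s built-in field-level encryption or another vetted cryptography library, never a homegrown cipher.

```python
# Client-side field-level encryption sketch. The hash-based keystream below
# is a TOY cipher for illustration only; use a vetted library in practice.
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: str) -> dict:
    nonce = secrets.token_bytes(16)
    data = plaintext.encode()
    ct = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    return {"nonce": nonce.hex(), "ct": ct.hex()}

def decrypt_field(key: bytes, blob: dict) -> str:
    nonce, ct = bytes.fromhex(blob["nonce"]), bytes.fromhex(blob["ct"])
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct)))).decode()

key = secrets.token_bytes(32)  # held by the organisation, never the DB provider
doc = {"name": "A. Customer", "ssn": encrypt_field(key, "078-05-1120")}
# The database stores `doc` as-is; only key holders can recover the SSN.
```

The separation of duties follows directly: database administrators can operate on `doc` without ever being able to read the encrypted field.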
Also, as more data is transmitted and stored in the cloud – including some highly sensitive workloads – additional technical options to control and limit access to confidential and regulated data are needed. This data still needs to be used, so making in-use data encryption part of your zero trust solution is vital. It enables organisations to store sensitive data confidently, meeting compliance requirements, while still allowing different parts of the business to access it and gain insights from it.
Securing data is only going to continue to become more important for all organisations, but for those in financial services the stakes can be even higher. Leaving the perimeter mentality to the history books and moving towards zero trust – especially as cloud and as-a-service infrastructure permeates the industry – is the only way to protect such valuable data.