WINNIPEG, Manitoba, Oct 22 (Reuters) – Canada’s oil producers face new pressure from Prime Minister Justin Trudeau to reduce emissions in just three years, a sudden acceleration of their plans that at least one major company said looks unrealistic.
Suncor Energy (SU.TO), the second-largest Canadian crude producer, says it remains focused on cutting emissions by 2030, not 2025 as the Canadian government will require.
“Honestly, 2025 is going to be tough,” Martha Hall Findlay, Suncor’s Chief Sustainability Officer, told Reuters. “That’s not a number we’ve used, it’s a number the feds have used.”
Trudeau’s advanced timetable for cuts to the oil sector’s total emissions by 2025, announced last month, comes as the oil sector has focused on longer-term targets, and on reducing emissions on a per-barrel basis.
“That is light speed for an oil sands company. That’s tomorrow,” said Kevin Birn, chief analyst of Canadian oil markets at consultancy IHS Markit, of Trudeau’s demand. “They’re a very hard ship to turn because they have so much emissions.”
Previously, Ottawa had a target of cutting national emissions by at least 40% by 2030, but it did not single out the oil sector. Canada’s crude industry generates some of the highest emissions per barrel worldwide.
Suncor is the only big producer that has laid out a plan – in May – to cut total emissions by 2030, depending heavily on carbon capture, greener power sources and energy efficiency.
But Trudeau’s 2025 demand came as a surprise.
“We had obviously been having conversations with the feds long before the budget came out last spring, long before the (election) campaign,” Hall Findlay said. “None of those discussions have mentioned 2025. At Suncor, we’re laser-focused on 2030.”
Cenovus intends to cut emissions on an absolute and per-barrel basis, said spokesman Reg Curren, but he would not say if cuts would occur by 2025.
Canadian Natural is working on “mid-term” targets connected to the Pathways carbon capture project with its peers, said spokesperson Julie Woo. She would not say if they would address Trudeau’s 2025 requirement.
Governments and business would need to spend C$60 billion annually to cut Canada’s emissions by 75% in the next 30 years, RBC Economics said.
Canadian producers are expected to report big quarterly profits in coming weeks as oil and gas prices have soared. The companies have prioritized repaying debt and returning cash to investors, but Trudeau wants producers to spend some profits on curbing emissions.
He plans to unveil his new cabinet on Tuesday, just ahead of the United Nations’ Climate Change Conference in Glasgow, Scotland.
Ottawa wants to ensure there are ambitious emission reductions from the oil and gas sector, making a meaningful contribution to Canada’s climate goals, said Joanna Sivasankaran, spokesperson for the Canadian environment department.
Trudeau’s 2025 goal is “ambitious for sure” and it would be more realistic to expect the sector to cut emissions sharply a decade later, said Steve MacDonald, CEO of Emissions Reduction Alberta, an arms-length corporation funded by the provincial government.
‘EASIER THAN ANYONE THINKS’
Some small conventional oil producers are already showing deep emissions cuts are possible, however, using methods that big producers Canadian Natural and Cenovus could widely apply. Both companies produce crude in the oil sands and by conventional methods.
Yangarra Resources (YGR.TO), which produces 10,000 barrels of oil equivalent per day, says it will cut total emissions by 47%, or 50,000 tonnes of carbon dioxide equivalent, by the end of 2022. Its plans involve powering 80 pumpjacks with electricity from the Alberta grid, instead of burning natural gas, and replacing older instruments that emit high amounts of methane.
“Cutting carbon in the oil patch is going to be a whole lot easier than anyone thinks,” said Yangarra CEO Jim Evaskevich. “All of the changes we are implementing make incredible economic sense.”
The moves are likely to generate substantial credits next year that Yangarra can sell to bigger emitters, although the monetary value has not yet been determined, Evaskevich said.
Cenovus, which generates 18% of its production from conventional operations, has cut its methane emissions by nearly half from 2015 levels, a spokesperson said. Canadian Natural has cut methane emissions by 28% since 2016, Woo said.
“They’re big, large operations, and they can’t pivot quite as quickly,” MacDonald said. “But that doesn’t mean they aren’t moving forward in the same areas.”
Emissions reductions are difficult for oil sands operations because of the energy they require, while conventional methane emissions are easier to tackle, said Keith Stewart, senior energy strategist at Greenpeace Canada.
Oil sands producers are counting on expanded carbon capture and sequestration facilities to cut emissions. But the economics require government funding, said Greg McNab, a partner at the Baker McKenzie law firm. Using renewable power to run oil sands facilities may be the quickest way to curb emissions, he said.
Reporting by Rod Nickel in Winnipeg; Editing by David Gregorio
How to improve the accuracy of your ESG reporting
Source: Finance Derivative
Rajesh Gharpure, Global Head- ESG, Larsen & Toubro Infotech (LTI)
ESG initiatives have become increasingly significant in the business world, with organisations integrating sustainability into their core business strategy and using it as a driver to strengthen resilience and create long-term value. As a result of this elevated focus, investors now require greater transparency into ESG performance. This means organisations need to make public commitments towards sustainability and provide robust, relevant, and routine updates to their strategies, goals and metrics. They also need to collate accurate ESG data, which is critical for making the right capital allocation and investment decisions, and for investors to understand their investment risk and value preservation.
But organisations often struggle to report ESG data effectively. In fact, more than half of leadership across organisations today experience challenges around data availability and data quality. However, with upcoming regulatory changes, such as the European Commission’s Sustainable Financial Disclosure Regulation (SFDR), investment firms need to ensure organisations provide the right data for accurate reporting to support green claims in their ESG-labelled investment funds.
While organisations focus on robust and accurate ESG reporting, the goal is not to report everything but to know what should be reported. This can be achieved through the focused implementation of the Selection – Innovation – Assurance (S-I-A) approach:
It’s important for organisations to continually reassess their ESG journey and establish the correct boundaries through a precise selection process. Selection criteria should categorically define the areas for reporting. A well-planned and executed materiality assessment can help identify the areas that are most impactful for the business as well as its stakeholders. Drawing on the 17 UN Sustainable Development Goals, alignment with corporate objectives, jurisdiction, peer benchmarking and geography-specific requirements, organisations need to limit their reporting purview to the most significant topics. A structured, programmatic approach ensures reliable choices are made regarding organisational boundaries, programmes and KPIs, all of which are crucial for an organisation’s long-term sustainability, while taking into consideration the bandwidth and resources needed to ensure proper governance and accurate reporting.
Reporting as a means to precisely monitor and govern creates a multitude of challenges as data needs continually expand. Large organisations grow both organically and inorganically and, in the process, end up with a wide digital landscape, most of which is focused on primary operational needs rather than an integrated digital fabric. Organisations often struggle to source the right data, as much of the non-operational information is recorded and maintained manually in Excel spreadsheets. There is also the question of how to track all the relevant data and compile it in a way that is meaningful and ingestible for generating an ESG report. As a result, a typical ESG reporting cycle for a mid-to-large organisation can take 4-5 months to complete, significantly delaying its ability to monitor and adjust course. The myriad frameworks and reporting programmes used globally only multiply the problem. Manual data collection also carries the risk of errors and inaccuracies, and makes it difficult to scale ESG initiatives and examine transactional data for disclosure purposes.
Through digital intervention, this timeline can be systemically reduced, enabling organisations to quickly adapt and evolve their sustainability programmes and metrics. Using digital tools to capture data and perform the extract, transform, load (ETL) work before reporting serves the triple purpose of saving effort and time, validating data and improving reporting accuracy. A productised approach to IT and data, focused on ESG, can help catalogue data from organisational systems and subsystems into the formats needed for reporting and disclosures, reducing the risk of poor data quality. IoT technologies can track and measure the performance of each individual asset in an operation, and data from these platforms can be pulled directly into such catalogues for ESG reporting and auditing.
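To make the ETL step described above concrete, here is a minimal sketch of such a pipeline in Python. The source systems, activity types and emission factors are invented purely for illustration; real factors would come from published inventories such as the GHG Protocol, and real sources would be operational databases and spreadsheets rather than in-memory records.

```python
from collections import defaultdict

# Hypothetical emission factors (tonnes CO2e per unit of activity).
# Real reporting would use published, jurisdiction-specific factors.
EMISSION_FACTORS = {
    "kwh_grid": 0.000233,     # tCO2e per kWh of grid electricity
    "litre_diesel": 0.00268,  # tCO2e per litre of diesel burned
}

def extract():
    """Stand-in for pulling raw records from operational systems and spreadsheets."""
    return [
        {"source": "facilities_db", "activity": "kwh_grid", "amount": 120_000, "scope": 2},
        {"source": "fleet_sheet", "activity": "litre_diesel", "amount": 4_500, "scope": 1},
        {"source": "facilities_db", "activity": "kwh_grid", "amount": 80_000, "scope": 2},
    ]

def transform(records):
    """Validate each record and convert it to tonnes CO2e."""
    out = []
    for r in records:
        factor = EMISSION_FACTORS.get(r["activity"])
        if factor is None:  # data-quality gate: reject unknown activity types
            raise ValueError(f"no emission factor for {r['activity']}")
        out.append({"scope": r["scope"], "tco2e": r["amount"] * factor})
    return out

def load(rows):
    """Aggregate into the per-scope totals a disclosure would report."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["scope"]] += row["tco2e"]
    return dict(totals)

totals = load(transform(extract()))
print(totals)
```

The point of the sketch is the shape, not the numbers: once extraction, unit conversion and validation are automated in one place, the same catalogue can feed multiple reporting frameworks instead of being rebuilt by hand each cycle.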
Assurance (internal, followed by external) provides the evidence to support the accuracy of an organisation’s data, and adds the element of truth and trust to your ESG report. A 2021 survey by the International Federation of Accountants (IFAC) showed that only 51% of organisations that shared ESG information provided assurance with their disclosures, and most of those were limited to certain facets of the total report. However, an ESG report validated by a recognised auditing organisation and verified against an accredited standard provides an impartial overview of performance and compares findings against best practice. This ensures compliance with policies, which gives confidence and security to investors and reduces the fear of greenwashing. International Standard on Assurance Engagements 3000 (ISAE 3000) and AccountAbility’s (AA) AA1000 Assurance Standard are the dominant standards in the ESG sphere.
However, before seeking external assurance, leadership needs to review the quality and coverage of the data being reported. An internal review adds value by helping to establish a functional ESG control environment and by assessing the effectiveness of ESG risk assessments and controls. It also has the benefit of levelling up an organisation’s performance in sustainability-related ratings and benchmarking.
Accurate reporting supports decision-making by both investors and taxpayers at one end, and leadership and governing bodies at the other. Organisations should apply the same rigour and checks to ESG reporting as they do to financial reporting. This can result in increased stakeholder and investor confidence, business value and effectiveness of capital markets.
Solving the Future of Decarbonisation in Real-Time
Source: Finance Derivative
Jamil Ahmed, Distinguished Engineer at Solace
The energy sector has faced many disruptions and challenges in recent years, from pipeline disruption to the growing demand for hydrogen. However, the most significant of all of these is the global desire to decarbonise. The growing concern over fossil fuels has created intense pressure for businesses to transition towards renewable energy sources and cut carbon emissions. Governing bodies have begun to impose regulations on organisations to force them to cut emissions by 3.4 gigatons of carbon dioxide equivalent (GtCO2e) a year by 2050, which amounts to a 90 per cent reduction in current emissions.
The constant development of markets and digital transformation will only increase the demand for energy across all industries. Reducing emissions is therefore no small feat, however ambitious the targets may be. To make decarbonisation a reality in the near term, businesses must adopt an inward-looking strategy to reduce emissions through their own operations. These are termed Scope 1 emissions: emissions released as a direct result of a company’s own operations. Achieving this requires companies to streamline their operations and improve their internal visibility to measure and track energy consumption.
The major challenge companies face in accurately measuring their energy consumption lies in overcoming the mass of siloed data within their systems. These data silos not only diminish productivity but also bury useful insights in a mountain of data that is hard to identify and analyse. Ultimately, data silos are the result of organisational infrastructure built for a previous era, one with limited technological adoption and limited pathways for data flows. Over time, these have created complex organisational barriers.
The lack of data transparency in organisational infrastructure is severely undermining businesses’ ability to gain insight from their existing data. It also impacts their ability to share data with external partners in search of meaningful solutions for decarbonisation. The value of data sharing cannot be overstated when searching for innovative solutions: a recent study shows that 45% of businesses in the energy sector see analytics and innovation as critical tools. With the entire energy sector’s ability to decarbonise effectively hinging on data sharing to drive innovation, greater data insight is non-negotiable.
Another major consideration in decarbonisation is planning for power reliability when transitioning to renewable energy sources. Solar and wind rely on changeable weather for operability, and the varying levels of power readiness of these sources make them difficult to integrate into the national grid. Reliability planning is therefore an increasingly complex and important part of the decarbonisation journey, as the sector must test for long-term stability and for the potential of energy transfers and storage. A solution must be found that can address these concerns in real time.
Reliability in Real-time
Real-time data is information delivered immediately after collection, enabling businesses to respond at lightning speed. It has a host of uses in the energy sector, from alerting on major weather changes that may impact power reliability to detecting overheating or electrical wastage in appliances. Each of these information transfers is known as an ‘event’ that requires further action or response.
Real-time capabilities play a major role in overcoming the sector’s data transparency issues: by connecting interactions across systems and processes, they can enable energy providers to identify opportunities to reduce energy wastage.
Enter event-driven architecture (EDA), the structure that underpins an organisation’s ability to view the series of events occurring in its systems. EDA decouples events from the system so they can be processed and then sent in real time as a useful information resource, which resource companies can analyse to help optimise decarbonisation initiatives.
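As a minimal illustration of this decoupling, the sketch below implements a toy in-memory event broker in Python. The topic names, payloads and alert rule are invented; a production system would use a dedicated broker or event mesh rather than an in-process dictionary.

```python
from collections import defaultdict

class EventBroker:
    """Toy in-memory broker: producers publish events to named topics
    without knowing which consumers, if any, will process them."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler registered on this topic.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
alerts = []

def on_power_reading(event):
    """Consumer interested in anomalous readings; the producer never references it."""
    if event["kw"] > 100:  # hypothetical wastage threshold
        alerts.append(event)

broker.subscribe("sensor.power", on_power_reading)

# Producers emit readings as they happen, unaware of downstream consumers.
broker.publish("sensor.power", {"site": "turbine-7", "kw": 140})
broker.publish("sensor.power", {"site": "turbine-8", "kw": 60})

print(alerts)
```

Because the producer only knows the topic, new consumers (billing, forecasting, outage detection) can be attached later without touching the producing system, which is the property that makes EDA suited to integrating many energy data streams.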
The strength of EDA is its scalable integration platform, as this allows companies to manage enormous quantities of data traffic coming from multiple data streams and energy sources. From this, energy companies can develop durable systems by aggregating information. This can then be sent to control systems to identify power outages or extreme weather events and conditions.
To achieve this, an architectural layer known as an event mesh is required. An event mesh enables EDA to break down data silos and facilitates the real-time integration of people, processes and systems across geographical boundaries. Implementing an event mesh also upgrades and streamlines existing systems and processes to enable better transparency in real-time data sharing. Given EDA’s benefits in scalability, durability and agility, it is unsurprising that a recent study found 85% of organisations surveyed view EDA as a critical component of their digital transformation efforts.
Decarbonising for the future
Regulations on the energy sector are rapidly increasing; most recently, the US Senate passed the Inflation Reduction Act (IRA) on August 7th of this year. The Act signals the intense pressure on the energy sector to undertake significant decarbonisation initiatives immediately, and it is designed to accelerate the production of greener, more renewable energy sources such as wind and solar. Once nations like the US ramp up production of the technology that can harness these energy sources, others will follow suit. Large-scale adoption of renewable energy will only occur if businesses build real-time capabilities and become event-driven. Only then can the transition to decarbonisation, and achieving net zero, become a reality.
A zero trust environment is critical for financial services
Source: Finance Derivative
Boris Bialek, Managing Director of Industry Solutions at MongoDB
Not long ago, security professionals were still focused on protecting their IT the way mediaeval guards protected a walled city: concentrating on making it as difficult as possible to get inside. Once past this perimeter, though, access to what was within was endless. For financial services, this meant access to everything from personally identifiable information (PII) such as credit card numbers, names and social security details to other ‘marketable data’. Unfortunately, we have many examples of how this type of security fails: the castle gets stormed and the data isn’t protected. The most famous is still the Equifax incident, where a single breach led to years of unhappy customers.
Thankfully, the mindset has shifted, spurred on by the proliferation of networks and applications across geographies, devices and cloud platforms. This has made classic point-to-point security obsolete. The perimeter has changed; it is fluid. So reliance on a wall for protection also has to change.
Zero trust presents a new paradigm for cybersecurity. In this context, it is assumed that the perimeter is already breached, no users are trusted, and trust cannot be gained simply by physical or network location. Every user, device and connection must be continually verified and audited.
It might seem obvious, but it bears repeating: with the amount of confidential customer and client data that financial institutions hold, not to mention the regulations that govern it, zero trust should be an even bigger priority for them. The perceived value of this data also makes financial services organisations a primary target for data breaches.
But how do you create a zero trust environment?
Keeping the data secure
While ensuring that access to banking apps and online services is vital, it is actually the database that is the backend of these applications that is a key part of creating a zero trust environment. The database contains so much of an organisation’s sensitive, and regulated, information, as well as data that may not be sensitive but is critical to keeping the organisation running. This is why it is imperative that a database is ready and able to work in a zero trust environment.
As more databases are becoming cloud based services, a big part of this is ensuring that the database is secure by default, meaning it is secure out of the box. This takes some of the responsibility for security out of the hands of administrators because the highest levels of security are in place from the start, without requiring attention from users or administrators. To allow access, users and administrators must proactively make changes – nothing is automatically granted.
As more financial institutions embrace the cloud, this can get more complicated. Security responsibilities are divided between the client’s own organisation, the cloud providers and the vendors of the cloud services being used. This is known as the shared responsibility model. It moves away from the classic model in which IT owns hardening the servers and security, then hardens the software on top, say the version of the database software, and then the actual application code. In the shared model, the hardware (CPU, network, storage) is solely in the realm of the cloud provider that provisions these systems. The service provider for a Database-as-a-Service model then delivers the database, hardened, to the client with a designated endpoint. Only then do the actual client team and their application developers and DevOps team come into play for the actual “solution”.
Security and resilience in the cloud are only possible when everyone is clear on their roles and responsibilities. Shared responsibility recognises that cloud vendors must ensure their products are secure by default, while still available, and that organisations must take appropriate steps to continue to protect the data they keep in the cloud.
In banks and finance organisations, there is always lots of focus on customer authentication, making sure that accessing funds is as secure as possible. But it is also important to make sure that access to the database on the other end is secure. An IT organisation can use any number of methods to allow users to authenticate themselves to a database. Most often that includes a username and password, but given the increased need to maintain the privacy of confidential customer information by financial services organisations this should only be viewed as a base layer.
At the database layer, it is important to have transport layer security (TLS) and SCRAM authentication, which together ensure that traffic from clients to the database is authenticated and encrypted in transit.
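As a concrete illustration, a MongoDB-style connection string can require both in one place; the host, user and password below are placeholders, not real endpoints:

```
mongodb://appUser:<password>@db.example.com:27017/?tls=true&authMechanism=SCRAM-SHA-256
```

Here `tls=true` encrypts traffic in transit, while `authMechanism=SCRAM-SHA-256` selects a challenge-response handshake so the password itself is never sent over the wire.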
Passwordless authentication is also something that should be considered, not just for customers but for internal teams as well. This can be done in multiple ways with the database: either with auto-generated certificates that are needed to access the database, or with advanced options for organisations that already use X.509 certificates and have a certificate management infrastructure.
Tracking is a key component
In such a highly regulated industry, it is also important to monitor your zero trust environment to ensure that it remains in force and encompasses your database. The database should be able to log all actions, or have functionality to apply filters to capture only specific events, users or roles.
Role-based auditing lets you log and report activities by specific roles, such as userAdmin or dbAdmin, coupled with any roles inherited by each user, rather than having to extract activity for each individual administrator. This approach makes it easier for organisations to enforce end-to-end operational control and maintain the insight necessary for compliance and reporting.
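In MongoDB’s auditing configuration, for example, a role-scoped filter looks roughly like this; the log path and the role being audited are illustrative choices, not requirements:

```yaml
auditLog:
  destination: file
  format: JSON
  path: /var/log/mongodb/audit.json
  # Capture only actions by users holding dbAdmin on the admin database
  filter: '{ "roles": { "role": "dbAdmin", "db": "admin" } }'
```

A single filter like this replaces per-administrator log extraction: compliance teams query one audit stream keyed by role rather than reconstructing activity user by user.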
Next level encryption
With large amounts of valuable data, financial institutions also need to make sure that they are embracing encryption – in flight, at rest and even in use. Securing data with client-side field-level encryption allows you to move to managed services in the cloud with greater confidence. The database only ever works with encrypted fields, and organisations control their own encryption keys rather than having the database provider manage them. This additional layer of security enforces an even more fine-grained separation of duties between those who use the database and those who administer and manage it.
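The client-side pattern can be sketched as follows. This is emphatically not the driver’s actual field-level encryption API: a trivial XOR stream stands in for real AES-based encryption purely to show the data flow, namely that the key stays with the client and the server only ever stores ciphertext.

```python
import secrets

def xor_bytes(key, data):
    """Placeholder cipher (XOR with a repeating key) standing in for AES.
    Never use this for real data; it only illustrates the data flow."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The key is generated and held by the client; the "server" never sees it.
client_key = secrets.token_bytes(32)

def insert_document(server_store, doc):
    """Encrypt the sensitive field before it leaves the client."""
    stored = dict(doc)
    stored["ssn"] = xor_bytes(client_key, doc["ssn"].encode())
    server_store.append(stored)

def read_document(server_store, i):
    """Only a client holding the key can recover the plaintext."""
    doc = dict(server_store[i])
    doc["ssn"] = xor_bytes(client_key, doc["ssn"]).decode()
    return doc

server = []  # stands in for the database's storage
insert_document(server, {"name": "A. Customer", "ssn": "123-45-6789"})

assert server[0]["ssn"] != b"123-45-6789"            # server holds ciphertext only
assert read_document(server, 0)["ssn"] == "123-45-6789"
```

The separation of duties follows directly from the key’s location: a database administrator, or the cloud provider hosting the database, can operate on the stored documents without ever being able to read the protected field.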
Also, as more data is transmitted and stored in the cloud, some of it in highly sensitive workloads, additional technical options to control and limit access to confidential and regulated data are needed. However, this data still needs to be used, so ensuring that in-use data encryption is part of your zero trust solution is vital. It also enables organisations to confidently store sensitive data, meeting compliance requirements, while enabling different parts of the business to gain access and insights from it.
Securing data is only going to continue to become more important for all organisations, but for those in financial services the stakes can be even higher. Leaving the perimeter mentality to the history books and moving towards zero trust – especially as cloud and as-a-service infrastructure permeates the industry – is the only way to protect such valuable data.