
Business

A New Generation of Nuclear Reactors Could Hold the Key to a Green Future

Source: Time

On a conference-room whiteboard in the heart of Silicon Valley, Jacob DeWitte sketches his startup’s first product. In red marker, it looks like a beer can in a Koozie, stuck with a crazy straw. In real life, it will be about the size of a hot tub, and made from an array of exotic materials, like zirconium and uranium. Under carefully controlled conditions, they will interact to produce heat, which in turn will make electricity—1.5 megawatts’ worth, enough to power a neighborhood or a factory. DeWitte’s little power plant will run for a decade without refueling and, amazingly, will emit no carbon. “It’s a metallic thermal battery,” he says, coyly. But more often DeWitte calls it by another name: a nuclear reactor.

Fission isn’t for the faint of heart. Building a working reactor—even a very small one—requires precise and painstaking efforts of both engineering and paper pushing. Regulations are understandably exhaustive. Fuel is hard to come by—they don’t sell uranium at the Gas-N-Sip. But DeWitte plans to flip the switch on his first reactor around 2023, a mere decade after co-founding his company, Oklo. After that, they want to do for neighborhood nukes what Tesla has done for electric cars: use a niche and expensive first version as a stepping stone toward cheaper, bigger, higher-volume products. In Oklo’s case, that means starting with a “microreactor” designed for remote communities, like Alaskan villages, currently dependent on diesel fuel trucked, barged or even flown in, at an exorbitant expense. Then building more and incrementally larger reactors until their zero-carbon energy source might meaningfully contribute to the global effort to reduce fossil-fuel emissions.

At global climate summits, in the corridors of Congress and at statehouses around the U.S., nuclear power has become the contentious keystone of carbon reduction plans. Everyone knows they need it. But no one is really sure they want it, given its history of accidents. Or even if they can get it in time to reach urgent climate goals, given how long it takes to build. Oklo is one of a growing handful of companies working to solve those problems by putting reactors inside safer, easier-to-build and smaller packages. None of them are quite ready to scale to market-level production, but given the investments being made into the technology right now, along with an increasing realization that we won’t be able to shift away from fossil fuels without nuclear power, it’s a good bet that at least one of them becomes a game changer.

If existing plants are the energy equivalent of a 2-liter soda bottle, with giant, 1,000-megawatt-plus reactors, Oklo’s strategy is to make reactors by the can. The per-megawatt construction costs might be higher, at least at first. But producing units in a factory would give the company a chance to improve its processes and to lower costs. Oklo would pioneer a new model. Nuclear plants need no longer be bet-the-company big, even for giant utilities. Venture capitalists can get behind the potential to scale to a global market. And climate hawks should fawn over a zero-carbon energy option that complements burgeoning supplies of wind and solar power. Unlike today’s plants, which run most efficiently at full blast, making it challenging for them to adapt to a grid increasingly powered by variable sources (not every day is sunny, or windy), the next generation of nuclear technology wants to be more flexible, able to respond quickly to ups and downs in supply and demand.

Engineering these innovations is hard. Oklo’s 30 employees are busy untangling the knots of safety and complexity that sent the cost of building nuclear plants to the stratosphere and all but halted their construction in the U.S. “If this technology was brand-new—like if fission was a recent breakthrough out of a lab, 10 or 15 years ago—we’d be talking about building our 30th reactor,” DeWitte says.

But fission is an old, and fraught, technology, and utility companies are scrambling now to keep their existing gargantuan nuclear plants open. Economically, they struggle to compete with cheap natural gas, along with wind and solar, often subsidized by governments. Yet climate-focused nations like France and the U.K. that had planned to phase out nuclear are instead doubling down. (In October, French President Emmanuel Macron backed off plans to close 14 reactors, and in November, he announced the country would instead start building new ones.) At the U.N. climate summit in Glasgow, the U.S. announced its support for Poland, Kenya, Ukraine, Brazil, Romania and Indonesia to develop their own new nuclear plants—while European negotiators pressed to ensure that nuclear energy counts as “green.” All the while, Democrats and Republicans are (to everyone’s surprise) often aligned on nuclear’s benefits—and, in many cases, putting their power of the purse behind it, both to keep old plants open in the U.S. and to speed up new technologies domestically and overseas.

It makes for a decidedly odd moment in the life of a technology that already altered the course of one century, and now wants to make a difference in another. There are 93 operating nuclear reactors in the U.S.; combined, they supply 20% of U.S. electricity, and 50% of its carbon-free electricity. Nuclear should be a climate solution, satisfying both technical and economic needs. But while the existing plants finally operate with enviable efficiency (after 40 years of working out the kinks), the next generation of designs is still a decade away from being more than a niche player in our energy supply. Everyone wants a steady supply of electricity, without relying on coal. Nuclear is paradoxically right at hand, and out of reach.

For that to change, “new nuclear” has to emerge before the old nuclear plants recede. It has to keep pace with technological improvements in other realms, like long-term energy storage, where each incremental improvement increases the potential for renewables to supply more of our electricity. It has to be cheaper than carbon-capture technologies, which would allow flexible gas plants to operate without climate impacts (but are still too expensive to build at scale). And finally it has to arrive before we give up—before the spectre of climate catastrophe creates a collective “doomerism,” and we stop trying to change.

Not everyone thinks nuclear can reinvent itself in time. “When it comes to averting the imminent effects of climate change, even the cutting edge of nuclear technology will prove to be too little, too late,” predicts Allison Macfarlane, former chair of the U.S. Nuclear Regulatory Commission (NRC)—the government agency singularly responsible for permitting new plants. Can a stable, safe, known source of energy rise to the occasion, or will nuclear be cast aside as too expensive, too risky and too late?

Trying Again

Nuclear began in a rush. In 1942, in the lowest mire of World War II, the U.S. began the Manhattan Project, the vast effort to develop atomic weapons. It employed 130,000 people at secret sites across the country, the most famous of which was Los Alamos Laboratory, near Albuquerque, N.M., where Robert Oppenheimer led the design and construction of the first atomic bombs. DeWitte, 36, grew up nearby. Even as a child of the ’90s, he was steeped in the state’s nuclear history, and preoccupied with the terrifying success of its engineering and the power of its materials. “It’s so incredibly energy dense,” says DeWitte. “A golf ball of uranium would power your entire life!”

DeWitte has taken that bromide almost literally. He co-founded Oklo in 2013 with Caroline Cochran, while both were graduate students in nuclear engineering at the Massachusetts Institute of Technology. When they arrived in Cambridge, Mass., in 2007 and 2008, the nuclear industry was on a precipice. Then-presidential candidate Barack Obama espoused a new eagerness to address climate change by reducing carbon emissions—which at the time meant less coal, and more nuclear. (Wind and solar energy were still a blip.) It was an easy sell. In competitive power markets, nuclear plants were profitable. The 104 operating reactors in the U.S. at the time were running smoothly. There hadn’t been a major accident since Chernobyl, in 1986.

The industry excitedly prepared for a “nuclear renaissance.” At the peak of interest, the NRC had applications for 30 new reactors in the U.S. Only two would be built. The cheap natural gas of the fracking boom began to drive down electricity prices, erasing nuclear’s profits. Newly subsidized renewables, like wind and solar, added even more electricity generation, further saturating the markets. When on March 11, 2011, an earthquake and subsequent tsunami rolled over Japan’s Fukushima Daiichi nuclear power plant, leading to the meltdown of all three of its reactors and the evacuation of 154,000 people, the industry’s coffin was fully nailed. Not only would there be no renaissance in the U.S., but the existing plants had to justify their safety. Japan shut down 46 of its 50 operating reactors. Germany closed 11 of its 17. The U.S. fleet held on politically, but struggled to compete economically. Since Fukushima, 12 U.S. reactors have begun decommissioning, with three more planned.

At MIT, Cochran and DeWitte—who were teaching assistants together for a nuclear reactor class in 2009, and married in 2011—were frustrated by the setback. “It was like, There’re all these cool technologies out there. Let’s do something with it,” says Cochran. But the nuclear industry has never been an easy place for innovators. In the U.S., its operational ranks have long been dominated by “ring knockers”—the officer corps of the Navy’s nuclear fleet, properly trained in the way things are done, but less interested in doing them differently. Governments had always kept a tight grip on nuclear; for decades, the technology was kept under wraps. The personal computing revolution, and then the wild rise of the Internet, further drained engineering talent. From DeWitte and Cochran’s perspective, the nuclear-energy industry had already ossified by the time Fukushima and fracking totally brought things to a halt. “You eventually got to the point where it’s like, we have to try something different,” DeWitte says.

He and Cochran began to discreetly convene their MIT classmates for brainstorming sessions. Nuclear folks tend to be dogmatic about their favorite method of splitting atoms, but they stayed agnostic. “I didn’t start thinking we had to do everything differently,” says DeWitte. Rather, they had a hunch that marginal improvements might yield major results, if they could be spread across all of the industry’s usual snags—whether regulatory approaches, business models, the engineering of the systems themselves, or the challenge of actually constructing them.

In 2013, Cochran and DeWitte began to rent out the spare room in their Cambridge home on Airbnb. Their first guests were a pair of teachers from Alaska. The remote communities they taught in were dependent on diesel fuel for electricity, brought in at enormous cost. That energy scarcity created an opportunity: in such an environment, even a very expensive nuclear reactor might still be cheaper than the current system. The duo targeted a price of $100 per megawatt hour, more than double typical energy costs. They imagined using this high-cost early market as a pathway to scale their manufacturing. They realized that to make it work economically, they wouldn’t have to reinvent the reactor technology, only the production and sales processes. They decided to own their reactors and supply electricity, rather than supply the reactors themselves—operating more like today’s solar or wind developers. “It’s less about the technology being different,” says DeWitte, “than it is about approaching the entire process differently.”

That maverick streak raised eyebrows among nuclear veterans—and cash from Silicon Valley venture capitalists, including a boost from Y Combinator, where companies like Airbnb and Instacart got their start. In the eight years since, Oklo has distinguished itself from the competition by thinking smaller and moving faster. There are others competing in this space: NuScale, based in Oregon, is working to commercialize a reactor similar in design to existing nuclear plants, but constructed in 60-megawatt modules. TerraPower, founded by Bill Gates in 2006, has plans for a novel technology that uses its heat for energy storage, rather than to spin a turbine, which makes it an even more flexible option for electric grids that increasingly need that pliability. And X-energy, a Maryland-based firm that has received substantial funding from the U.S. Department of Energy, is developing 80-megawatt reactors that can also be grouped into “four-packs,” bringing them closer in size to today’s plants. Yet all are still years—and a billion dollars—away from their first installations. Oklo brags that its NRC application is 20 times shorter than NuScale’s, and its proposal cost 100 times less to develop. (Oklo’s proposed reactor would produce one-fortieth the power of NuScale’s.) NRC accepted Oklo’s application for review in March 2020, and regulations guarantee that process will be complete within three years. Oklo plans to power on around 2023, at a site at the Idaho National Laboratory, one of the U.S.’s oldest nuclear-research sites, and so already approved for such efforts. Then comes the hard part: doing it again and again, booking enough orders to justify building a factory to make many more reactors, driving costs down, and hoping politicians and activists worry more about the menace of greenhouse gases than the hazards of splitting atoms.

Nuclear-industry veterans remain wary. They have seen this all before. Westinghouse’s AP1000 reactor, first approved by the NRC in 2005, was touted as the flagship technology of Obama’s nuclear renaissance. It promised to be safer and simpler, using gravity rather than electricity-driven pumps to cool the reactor in case of an emergency—in theory, this would mitigate the danger of power outages, like the one that led to the Fukushima disaster. Its components could be constructed at a centralized location, and then shipped in giant pieces for assembly.

But all that was easier said than done. Westinghouse and its contractors struggled to manufacture the components according to nuclear’s mega-exacting requirements, and in the end, only one AP1000 project in the U.S. actually happened: the Vogtle Electric Generating Plant in Georgia. Approved in 2012, its two reactors were expected at the time to cost $14 billion and be completed in 2016 and 2017, but costs have ballooned to $25 billion. The first will open, finally, next year.

Oklo and its competitors insist things are different this time, but they have yet to prove it. “Because we haven’t built one of them yet, we can promise that they’re not going to be a problem to build,” quips Gregory Jaczko, a former NRC chair who has since become the technology’s most biting critic. “So there’s no evidence of our failure.”

The Challenge

The cooling tower of the Hope Creek nuclear plant rises 50 stories above Artificial Island, New Jersey, built up on the marshy edge of the Delaware River. The three reactors here—one belonging to Hope Creek, and two run by the Salem Generating Station, which shares the site—generate an astonishing 3,465 megawatts of electricity, or roughly 40% of New Jersey’s total supply. Construction began in 1968, and was completed in 1986. Their closest human neighbors are across the river in Delaware. Otherwise the plant is surrounded by protected marshlands, pocked with radiation sensors and the occasional guard booth. Of the 1,500 people working here, around 100 are licensed reactor operators—a special designation given by the NRC, and held by fewer than 4,000 people in the country.

Among the newest in their ranks is Judy Rodriguez, an Elizabeth, N.J., native and another MIT grad. “Do I have your permission to enter?” she asks the operator on duty in the control room for the Salem Two reactor, which came online in 1981 and is capable of generating 1,200 megawatts of power. The operator opens a retractable belt barrier, like at an airport, and we step across a thick red line in the carpet. A horseshoe-shaped gray cabinet holds hundreds of buttons, glowing indicators and blinking lights, but a red LED counter at the center of the wall shows the most important number in the room: 944 megawatts, the amount of power the Salem Two reactor was generating that afternoon in September. Beside it is a circular pattern of square indicator lights showing the uranium fuel assemblies inside the core, deep inside the concrete domed containment building a couple hundred yards away. Salem Two has 764 of these assemblies; each is about 6 in. square and 15 ft. tall. They contain the reactor’s fuel, which is among the most guarded and controlled materials on earth. To make sure no one working there forgets that fact, a phrase is painted on walls all around the plant: “Line of Sight to the Reactor.”

As the epitome of critical infrastructure, this station has been buffeted by the crises the U.S. has suffered in the past few decades. After 9/11, the three reactors here absorbed nearly $100 million in security upgrades. Everyone entering the plant passes through metal and explosives detectors, and radiation detectors on the way out. Walking between the buildings entails crossing a concrete expanse beneath high bullet-resistant enclosures (BREs). The plant’s guard corps has more members than any force in New Jersey besides the state police, and federal NRC rules mean that they don’t have to abide by state limitations on automatic weapons.

The scale and complexity of the operation is staggering—and expensive. “The place you’re sitting at right now costs us about $1.5 million to $2 million a day to run,” says Ralph Izzo, president and CEO of PSEG, New Jersey’s public utility company, which owns and operates the plants. “If those plants aren’t getting that in market, that’s a rough pill to swallow.” In 2019, the New Jersey Board of Public Utilities agreed to $300 million in annual subsidies to keep the three reactors running. The justification is simple: if the state wants to meet its carbon-reduction goals, keeping the plants online is essential, given that they supply 90% of the state’s zero-carbon energy. In September, the Illinois legislature came to the same conclusion as New Jersey, approving almost $700 million over five years to keep two existing nuclear plants open. The bipartisan infrastructure bill includes $6 billion in additional support (along with nearly $10 billion for development of future reactors). Even more is expected in the broader Build Back Better bill.

These subsidies—framed in both states as “carbon mitigation credits”—acknowledge the reality that nuclear plants cannot, on their own terms, compete economically with natural gas or coal. “There has always been a perception of this technology that never was matched by reality,” says Jaczko. The subsidies also show how climate change has altered the equation, but not decisively enough to guarantee nuclear’s future. Lawmakers and energy companies are coming to terms with nuclear’s new identity as clean power, deserving of the same economic incentives as solar and wind. Operators of existing plants want to be compensated for producing enormous amounts of carbon-free energy, according to Josh Freed of Third Way, a Washington, D.C., think tank that champions nuclear power as a climate solution. “There’s an inherent benefit to providing that, and it should be paid for.” For the moment, that has brought some assurance to U.S. nuclear operators of their future prospects. “A megawatt of zero-carbon electricity that’s leaving the grid is no different from a new megawatt of zero-carbon electricity coming onto the grid,” says Kathleen Barrón, senior vice president of government and regulatory affairs and public policy at Exelon, the nation’s largest operator of nuclear reactors.

Globally, nations are struggling with the same equation. Germany and Japan both shuttered many of their plants after the Fukushima disaster, and saw their progress at reducing carbon emissions suffer. Germany has not built new renewables fast enough to meet its electricity needs, and has made up the gap with dirty coal and natural gas imported from Russia. Japan, under international pressure to move more aggressively to meet its carbon targets, announced in October that it would work to restart its reactors. “Nuclear power is indispensable when we think about how we can ensure a stable and affordable electricity supply while addressing climate change,” said Koichi Hagiuda, Japan’s minister of economy, trade and industry, at an October news conference. China is building more new nuclear reactors than any other country, with plans for as many as 150 by the 2030s, at an estimated cost of nearly half a trillion dollars. Long before that, in this decade, China will overtake the U.S. as the operator of the world’s largest nuclear-energy system.

The future won’t be decided by choosing between nuclear or solar power. Rather, it’s a technically and economically complicated balance of adding as much renewable energy as possible while ensuring a steady supply of electricity. At the moment, that’s easy. “There is enough opportunity to build renewables before achieving penetration levels that we’re worried about the grid having stability,” says PSEG’s Izzo. New Jersey, for its part, is aiming to add 7,500 megawatts of offshore wind by 2035—or about the equivalent of six new Salem-sized reactors. The technology to do that is readily at hand—Kansas alone has about that much wind power installed already.

The challenge comes when renewables make up a greater proportion of the electricity supply—or when the wind stops blowing. The need for “firm” generation becomes more crucial. “You cannot run our grid solely on the basis of renewable supply,” says Izzo. “One needs an interseasonal storage solution, and no one has come up with an economic interseasonal storage solution.”

Existing nuclear’s best pitch—aside from the very fact it exists already—is its “capacity factor,” the industry term for how often a plant meets its full energy-making potential. For decades, nuclear plants struggled with outages and long maintenance periods. Today, improvements in management and technology make them more likely to run continuously—or “breaker to breaker”—between planned refuelings, which usually occur every 18 months and take about a month. At Salem and Hope Creek, PSEG hangs banners in the hallways to celebrate each new record run without a maintenance breakdown. That improvement stretches across the industry. “If you took our performance back in the mid-’70s, and then look at our performance today, it’s equivalent to having built 30 new reactors,” says Maria Korsnick, president and CEO of the Nuclear Energy Institute, the industry’s main lobbying organization. That improved reliability has become nuclear’s major calling card.
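Capacity factor is simple arithmetic: energy actually delivered, divided by what the plant would have produced running flat-out over the same period. A quick sketch (the figures are illustrative round numbers built from the 18-month, one-month-outage cycle described above, not actual plant data):

```python
def capacity_factor(energy_mwh: float, capacity_mw: float, hours: float) -> float:
    """Energy actually delivered as a fraction of flat-out output over the period."""
    return energy_mwh / (capacity_mw * hours)

# A 1,200 MW reactor running "breaker to breaker," losing only a
# 30-day refueling outage out of an 18-month cycle:
hours = 18 * 30 * 24                # ~12,960 hours in the cycle
outage = 30 * 24                    # 720-hour refueling
produced = 1200 * (hours - outage)  # MWh actually generated
print(round(capacity_factor(produced, 1200, hours), 3))  # 0.944
```

Even with a full month offline every year and a half, the plant still delivers about 94% of its theoretical maximum, which is why reliability has become the industry's calling card.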

Over the next 20 years, nuclear plants will need to develop new tricks. “One of the new words in our vocabulary is flexibility,” says Marilyn Kray, vice president of nuclear strategy and development at Exelon, which operates 21 reactors. “Flexibility not only in the existing plants, but in the designs of the emerging ones, to make them even more flexible and adaptable to complement renewables.” Smaller plants can adapt more easily to the grid, but they can also serve new customers, like providing energy directly to factories, steel mills or desalination plants.

Bringing those small plants into operation could be worth it, but it won’t be easy. “You can’t just excuse away the thing that’s at the center of all of it, which is it’s just a hard technology to build,” says Jaczko, the former NRC chair. “It’s difficult to make these plants, it’s difficult to design them, it’s difficult to engineer them, it’s difficult to construct them. At some point, that’s got to be the obvious conclusion to this technology.”

But the equally obvious conclusion is we can no longer live without it. “The reality is, you have to really squint to see how you get to net zero without nuclear,” says Third Way’s Freed. “There’s a lot of wishful thinking, a lot of fingers crossed.”


How VRPs can improve the open banking experiences of consumers

Source: Finance Derivative

Attributed to: Luke Ladyman, COO and Co-founder of Cheddar

Variable Recurring Payments (VRPs) are a new technology within Open Finance. Despite being relatively unknown outside of the financial marketplace, VRPs are at the centre of an exciting conversation around frictionless payments.

This technology enables consumers to set up specific payment instructions with regulated finance apps that are used to make a series of recurring payments. VRPs have been designed to make life easier for consumers and small to medium-sized enterprises (SMEs). However, there is still room for improvement.

Limits to VRP technology

VRPs have the potential to do far more than just simplify recurring payments, but for now their efficacy is severely limited. As they stand, VRPs are mandated only for ‘sweeping’: automatically moving money between an individual’s own current and savings accounts.

According to the Open Banking Implementation Entity (OBIE), five million people now use Open Banking in the UK. We can expect consumers to demand greater scope in VRP services, but the infrastructure to support VRP technology is simply lacking.

The next major step in expanding the scope of this technology will be when legacy financial institutions collectively adopt and build out VRP infrastructure. At present, only a small handful of legacy banks offer VRP services to their customers, which has severely limited the technology’s reach.

Unlocking the potential of VRPs

VRPs promise to unlock countless opportunities for consumers to better manage their finances, while supporting companies in building the mechanisms that allow for open banking adoption. This technology has the potential to completely transform how consumers interact with financial services and SMEs.

If mandated for more than just the sweeping use case, VRPs could give consumers complete control over their monthly subscription payments for things such as mobile top-ups or gym memberships. Consumers could even set up VRPs with taxi services that automatically charge them on arrival at a destination, up to a set amount.

By expanding the scope of VRPs, consumers can enjoy hyper-personalised Open Banking experiences. And as Gen Z enters the workforce, this tailored approach will be more important than ever before. Payment parameters that are as easy to enforce as clicking a button will keep Gen Z happy and give businesses access to an entirely new demographic.

Cost of living crisis

With living expenses reaching record highs for young people and working families, paying for basic utilities has become a lot more painful. At their full potential, VRPs could enable the most vulnerable in society to authorise utility providers to take payments automatically, but only up to a certain amount. This would help consumers budget better for emergencies and keep up with payments.

To further cushion the effects of the crisis, consumers could also specify payment parameters to avoid any shocks to their bank accounts. This will generally lead to fewer errors, as data is processed digitally and won’t require manual entry.

Lastly, those struggling financially will be able to automatically withdraw their consent from the payment initiation service providers (PISPs) who make payments on their behalf. This is in contrast to credit or debit cards, where consumers cannot automatically opt out of transactions or set specific parameters for payments.
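As a loose illustration of the consumer-set caps and revocable consent described above (this is not a real Open Banking API; every name and field here is hypothetical), a VRP consent might be modelled like this:

```python
from dataclasses import dataclass

@dataclass
class VRPConsent:
    """Hypothetical consumer-set VRP payment parameters for one payee."""
    payee: str
    max_per_payment: float       # cap on any single payment
    max_per_month: float         # cap on the month's running total
    spent_this_month: float = 0.0
    active: bool = True          # consumer can withdraw consent at any time

    def authorise(self, amount: float) -> bool:
        """Approve a payment only if it fits within the consumer's parameters."""
        if not self.active or amount > self.max_per_payment:
            return False
        if self.spent_this_month + amount > self.max_per_month:
            return False
        self.spent_this_month += amount
        return True

# A utility provider allowed to take up to £80 per payment, £120 per month:
consent = VRPConsent("EnergyCo", max_per_payment=80.0, max_per_month=120.0)
print(consent.authorise(75.0))  # True: within both caps
print(consent.authorise(75.0))  # False: would breach the monthly cap
consent.active = False          # the consumer withdraws consent
print(consent.authorise(10.0))  # False: consent is no longer active
```

The point of the sketch is that the rules live with the consumer's consent, not with the merchant, which is exactly the reversal of control that card payments lack.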

How does this affect banks?

On the surface, the value proposition of VRPs looks to threaten certain aspects of banks’ business models. Look deeper, however, and it becomes clear this isn’t the case – it’s actually quite the opposite.

If adopted for all use cases, VRPs can help banks significantly combat fraud, as no sensitive payment information is exchanged with businesses. VRPs can act as a frontline defence, with enhanced transparency and tight controls for the consumer. PISPs can create the same experience consumers get using a debit or credit card, but with additional benefits.

By effectively addressing security concerns, banks will greatly improve customer experiences and reduce customer drop-off rates.

The future of VRP technology

We must take into consideration the current limitations of this technology as we press closer towards a completely frictionless way of banking. As things stand, the future of VRPs will rely heavily on further licensing approval from the Financial Conduct Authority (FCA) and increased approval from the Competition and Markets Authority (CMA).

If utilised effectively, VRPs can be applied across a plethora of uses to give consumers greater control over their finances as initially promised. There’s no doubt that expanding the scope of VRPs will give consumers hyper-personalised Open Banking experiences that will decrease customer drop-off rates. So, definitely watch this space!


Chained to the system – why building another pillar of wealth is key to having the freedom of choice

Source: Finance Derivative

By Marcus de Maria, Founder of Investment Mastery.

For many of us, our lives are already mapped out from the get-go. Go to school, college, then potentially university, get a job and mortgage, whilst spending the rest of our lives paying it off to enable us to have a comfortable retirement.

This is the pathway we have been led to believe we should follow, and many do blindly follow, chained to their desks to pay their mortgage or bills, spending their disposable income to get a better car, bigger house, or enjoy a couple of weeks in the sun.

But I wasn’t happy with this pathway. I didn’t want to be forever paying off a mortgage and saving for a few precious weeks of holiday each year. I wanted more.

So, I took the plunge and started investing – but fell hard and lost a lot of money. Why? Because I didn’t educate myself first. I didn’t have a strategy, and I didn’t seek advice from others who had done it before me. Why do people go to university for 3-4 years, study to become a doctor or dentist for 5-7 years, or train as an accountant for 4 years? Because training and education are essential, and the same can be said for the stock markets.

Once I had learned this (the hard way), I worked hard to educate myself and started building my own new pillars of wealth.

Investing isn’t a get-rich-quick scheme. For most, it is not an alternative to working or a reason to quit a lucrative career. It is quite simply another pillar of wealth that gives you the freedom to live a life of more choice. Investments will ripen over the years, and we advise starting as early as possible to ensure you have maximum funds for later in life, when children, ageing parents and retirement can all affect finances.

The other thing to remember when investing is the level of risk. Never invest more than you can afford to lose, and keep perspective: markets go down as well as up. Investing alongside a secure job is the best option, as you can then funnel small chunks of money into your portfolio each month.

Here are some tips to build a new pillar of wealth:

Where to invest – are you interested in stocks, precious metals, commodities or cryptocurrencies? If stocks, which stock are you entering and why? If crypto, do you know enough about such an unregulated and volatile asset class? Are you entering on a technical basis, because you like the chart pattern, or on a fundamental basis, because you think the company has long-term growth potential?

What price to get in at – I prefer buying low, so I set an order in advance and allow the price of the stock or crypto to fall to my entry point. Sometimes I will wait weeks, even months, for this to happen. But I wait, because those are the rules.

When to exit – I know in advance when I will take a profit and when I will exit with a small loss. To make sure I do the right thing when the stock is falling, I place an automatic order below my entry point – called a ‘stop loss’, or in some cases a ‘limit sell order’ – to minimise my losses.

How much to invest – this is part and parcel of keeping risk low. I make sure that if the stock price falls to my predetermined stop loss, I will only lose 1% of my portfolio. So, if I have £10,000 to invest, I risk only 1%, or £100, on any one trade. It’s a mathematical equation EVERYONE should know before they start trading. Unfortunately, very few people know it, and even fewer use it.
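The 1% rule above boils down to a simple calculation. As a minimal sketch (the `position_size` helper, the £50 entry price and the £48 stop loss are illustrative assumptions, not figures from the text beyond the £10,000/1% example):

```python
def position_size(portfolio_value, risk_fraction, entry_price, stop_price):
    """Return the number of shares to buy so that being stopped out
    loses only risk_fraction of the portfolio."""
    risk_per_trade = portfolio_value * risk_fraction  # cash at risk, e.g. £100
    risk_per_share = entry_price - stop_price         # loss per share if stopped out
    return risk_per_trade / risk_per_share

# £10,000 portfolio, 1% risk, entry at £50, stop loss at £48:
# a £2-per-share loss on 50 shares is exactly £100.
shares = position_size(10_000, 0.01, 50, 48)
print(shares)  # 50.0
```

Note how the position size shrinks as the stop loss moves further from the entry price – the wider the stop, the fewer shares you can buy for the same £100 of risk.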

For anyone wanting to secure their financial future, increase their pension pot or simply live a life of more choice, building another pillar of wealth is key. Get educated, keep the risk low and be prepared to be in it for the long term – you may be surprised at the results!


Business

Resilient technology is the most important factor for successful online banking services

Source: Finance Derivative

By James McCarthy, Director of Solutions Engineering, NS1

More than 90 percent of people in the UK use online banking, according to Statista, and of these, over a quarter have opened an account with a digital-only bank. It makes sense. Digital services, along with security, are critical features that consumers now expect from their banks as a way to support their busy on-the-go lifestyles.

The frequency of cash transactions is dropping as contactless and card payments rise and the key to this is convenience. It is faster and easier for customers to use digitally-enabled services than traditional over-the-counter facilities, cheques, and cash. The Covid pandemic, which encouraged people to abandon cash, only accelerated a trend that was already picking up speed in the UK.

But as bank branches close—4,865 by April 2022, and a further 226 scheduled to close by the end of the year, Which? research found—banks are under pressure to ensure their online and mobile services are always available. Not only does this keep customers satisfied and loyal, but it is also vital for compliance and regulatory purposes.

Unfortunately, their ability to keep services online is often compromised. In June and July of this year alone, major banks including Barclays, Halifax, Lloyds, TSB, Nationwide, Santander, and Monzo, at various times, locked customers out of their accounts due to outages, leaving them unable to access their mobile banking apps, transfer funds, or view their balances. According to The Mirror, Downdetector, a website which tracks outages, showed over 1,500 service failures reported in one day as a result of problems at NatWest.

These incidents do not go unnoticed. Customers are quick to amplify their criticism on social media, drawing negative attention for the bank involved, and eroding not just consumer trust, but the trust of other stakeholders in the business. Trading banks leave themselves open to significant losses in transactions if their systems go down due to an outage, even for a few seconds.

There are a multitude of reasons for banking services to fail. The majority of internet-based banking outages occur because the bank’s own internal systems fail. This can be the result of transferring customer data from legacy platforms, which might involve switching off parts of the network. It can also be because banks rely on cloud providers to deliver their services and the provider experiences an outage. The Bank of England has said that a quarter of major banks and a third of payment activity are hosted on the public cloud.

There are, however, steps that banks and other financial institutions can take to prevent outages and ensure as close to 100% uptime as possible for banking services.

Building resiliency strategies

If we assume that outages are inevitable, which all banks should, the best solution to managing risk is to embrace infrastructure resiliency strategies. One method is to adopt a multi-cloud and multi-CDN (content delivery network) approach, which means utilising services from a variety of providers. This ensures that if one fails, another can be deployed, eliminating the single point of failure that renders systems and services out of action. If the financial institution uses a secondary provider—such as when international banking services are being provided across multiple locations—the agreement must include an assurance that the bank’s applications will operate if the primary provider goes down.

This process of building resiliency in layers is further strengthened if banks have observability of application delivery performance, and it is beneficial for them to invest in tools that allow them to transfer quickly from one cloud service provider or CDN to another if it fails to perform against expectations.

Automating against human error

Banks that are further down the digital transformation route should consider the impact of human error on outage incidents and opt for network automation. This will enable systems to communicate seamlessly, giving banks operational agility and stability across the entire IT environment. They can start with a single network source of truth, which allows automation tools to gather all the data they need to optimise resource usage and puts banks in full control of their networks. In addition, it will signal to regulators that the bank is taking its provisioning of infrastructure seriously.

Dynamic steering

Despite evidence to the contrary, downtime in banking should never be acceptable, and IT teams can make use of specialist tools that allow them to dynamically steer their online traffic more easily. It is not unusual for a DNS (domain name system) failure to be the root cause of an outage, given its importance in the tech stack, so putting in place a secondary DNS network, or multiple DNS systems with separate infrastructures, will allow traffic to be rerouted. Teams will then have the power to establish steering policies and change capacity thresholds, so that an influx of activity, or a resource failure, will not affect the smooth running of their online services. If they utilise monitoring and observability features, they will have the data they need to make decisions based on the real-time experiences of end users and to identify repeated issues that can be rectified.
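The steering logic described above amounts to priority-ordered failover: check the health of the preferred endpoint and fall back to a secondary if it is down. The sketch below is an illustration only, not any specific vendor’s API; the endpoint names and the `is_healthy` check are hypothetical.

```python
def steer(endpoints, is_healthy):
    """Return the first healthy endpoint from a priority-ordered list,
    modelling primary/secondary DNS or CDN failover."""
    for endpoint in endpoints:
        if is_healthy(endpoint):
            return endpoint
    raise RuntimeError("all endpoints are down")

# Example: the primary CDN is failing its health check,
# so traffic is steered to the secondary provider.
status = {"cdn-primary.example": False, "cdn-secondary.example": True}
chosen = steer(["cdn-primary.example", "cdn-secondary.example"],
               lambda e: status[e])
print(chosen)  # cdn-secondary.example
```

In a real deployment the health check would be driven by the monitoring data mentioned above (latency, error rates, availability probes) rather than a static table, and the steering decision would be pushed out via DNS answers or load-balancer configuration.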

Banks are some way into their transformation journeys, and are building reputations based on the digital services they offer. It is essential that they deploy resilient technology that allows them to scale and deliver, regardless of whether the cloud providers they use experience outages, an internal human error is made, or the online demands of customers suddenly and simultaneously peak. Modern technology will not only speed up the services they provide, but will also arm them with the resilience they need to compete favourably.


Copyright © 2021 Futures Parity.