
Business

A New Generation of Nuclear Reactors Could Hold the Key to a Green Future

Source: Time

On a conference-room whiteboard in the heart of Silicon Valley, Jacob DeWitte sketches his startup’s first product. In red marker, it looks like a beer can in a Koozie, stuck with a crazy straw. In real life, it will be about the size of a hot tub, and made from an array of exotic materials, like zirconium and uranium. Under carefully controlled conditions, they will interact to produce heat, which in turn will make electricity—1.5 megawatts’ worth, enough to power a neighborhood or a factory. DeWitte’s little power plant will run for a decade without refueling and, amazingly, will emit no carbon. “It’s a metallic thermal battery,” he says, coyly. But more often DeWitte calls it by another name: a nuclear reactor.

Fission isn’t for the faint of heart. Building a working reactor—even a very small one—requires precise and painstaking efforts of both engineering and paper pushing. Regulations are understandably exhaustive. Fuel is hard to come by—they don’t sell uranium at the Gas-N-Sip. But DeWitte plans to flip the switch on his first reactor around 2023, a mere decade after co-founding his company, Oklo. After that, they want to do for neighborhood nukes what Tesla has done for electric cars: use a niche and expensive first version as a stepping stone toward cheaper, bigger, higher-volume products. In Oklo’s case, that means starting with a “microreactor” designed for remote communities, like Alaskan villages, currently dependent on diesel fuel trucked, barged or even flown in at exorbitant expense. Then building more and incrementally larger reactors until their zero-carbon energy source might meaningfully contribute to the global effort to reduce fossil-fuel emissions.

At global climate summits, in the corridors of Congress and at statehouses around the U.S., nuclear power has become the contentious keystone of carbon reduction plans. Everyone knows they need it. But no one is really sure they want it, given its history of accidents. Or even if they can get it in time to reach urgent climate goals, given how long it takes to build. Oklo is one of a growing handful of companies working to solve those problems by putting reactors inside safer, easier-to-build and smaller packages. None of them are quite ready to scale to market-level production, but given the investments being made into the technology right now, along with an increasing realization that we won’t be able to shift away from fossil fuels without nuclear power, it’s a good bet that at least one of them becomes a game changer.

If existing plants are the energy equivalent of a 2-liter soda bottle, with giant, 1,000-megawatt-plus reactors, Oklo’s strategy is to make reactors by the can. The per-megawatt construction costs might be higher, at least at first. But producing units in a factory would give the company a chance to improve its processes and to lower costs. Oklo would pioneer a new model. Nuclear plants need no longer be bet-the-company big, even for giant utilities. Venture capitalists can get behind the potential to scale to a global market. And climate hawks should fawn over a zero-carbon energy option that complements burgeoning supplies of wind and solar power. Unlike today’s plants, which run most efficiently at full blast, making it challenging for them to adapt to a grid increasingly powered by variable sources (not every day is sunny, or windy), the next generation of nuclear technology wants to be more flexible, able to respond quickly to ups and downs in supply and demand.

Engineering these innovations is hard. Oklo’s 30 employees are busy untangling the knots of safety and complexity that sent the cost of building nuclear plants to the stratosphere and all but halted their construction in the U.S. “If this technology was brand-new—like if fission was a recent breakthrough out of a lab, 10 or 15 years ago—we’d be talking about building our 30th reactor,” DeWitte says.

But fission is an old, and fraught, technology, and utility companies are scrambling now to keep their existing gargantuan nuclear plants open. Economically, they struggle to compete with cheap natural gas, along with wind and solar, often subsidized by governments. Yet climate-focused nations like France and the U.K. that had planned to phase out nuclear are instead doubling down. (In October, French President Emmanuel Macron backed off plans to close 14 reactors, and in November, he announced the country would instead start building new ones.) At the U.N. climate summit in Glasgow, the U.S. announced its support for Poland, Kenya, Ukraine, Brazil, Romania and Indonesia to develop their own new nuclear plants—while European negotiators assured that nuclear energy counts as “green.” All the while, Democrats and Republicans are (to everyone’s surprise) often aligned on nuclear’s benefits—and, in many cases, putting their powers of the purse behind it, both to keep old plants open in the U.S. and speed up new technologies domestically and overseas.

It makes for a decidedly odd moment in the life of a technology that already altered the course of one century, and now wants to make a difference in another. There are 93 operating nuclear reactors in the U.S.; combined, they supply 20% of U.S. electricity, and 50% of its carbon-free electricity. Nuclear should be a climate solution, satisfying both technical and economic needs. But while the existing plants finally operate with enviable efficiency (after 40 years of working out the kinks), the next generation of designs is still a decade away from being more than a niche player in our energy supply. Everyone wants a steady supply of electricity, without relying on coal. Nuclear is paradoxically right at hand, and out of reach.

For that to change, “new nuclear” has to emerge before the old nuclear plants recede. It has to keep pace with technological improvements in other realms, like long-term energy storage, where each incremental improvement increases the potential for renewables to supply more of our electricity. It has to be cheaper than carbon-capture technologies, which would allow flexible gas plants to operate without climate impacts (but are still too expensive to build at scale). And finally it has to arrive before we give up—before the spectre of climate catastrophe creates a collective “doomerism,” and we stop trying to change.

Not everyone thinks nuclear can reinvent itself in time. “When it comes to averting the imminent effects of climate change, even the cutting edge of nuclear technology will prove to be too little, too late,” predicts Allison Macfarlane, former chair of the U.S. Nuclear Regulatory Commission (NRC)—the government agency singularly responsible for permitting new plants. Can a stable, safe, known source of energy rise to the occasion, or will nuclear be cast aside as too expensive, too risky and too late?

Trying Again

Nuclear began in a rush. In 1942, in the lowest mire of World War II, the U.S. began the Manhattan Project, the vast effort to develop atomic weapons. It employed 130,000 people at secret sites across the country, the most famous of which was Los Alamos Laboratory, near Albuquerque, N.M., where Robert Oppenheimer led the design and construction of the first atomic bombs. DeWitte, 36, grew up nearby. Even as a child of the ’90s, he was steeped in the state’s nuclear history, and preoccupied with the terrifying success of its engineering and the power of its materials. “It’s so incredibly energy dense,” says DeWitte. “A golf ball of uranium would power your entire life!”

DeWitte has taken that bromide almost literally. He co-founded Oklo in 2013 with Caroline Cochran, while both were graduate students in nuclear engineering at the Massachusetts Institute of Technology. When they arrived in Cambridge, Mass., in 2007 and 2008, the nuclear industry was on a precipice. Then-presidential candidate Barack Obama espoused a new eagerness to address climate change by reducing carbon emissions—which at the time meant less coal, and more nuclear. (Wind and solar energy were still a blip.) It was an easy sell. In competitive power markets, nuclear plants were profitable. The 104 operating reactors in the U.S. at the time were running smoothly. There hadn’t been a major accident since Chernobyl, in 1986.

The industry excitedly prepared for a “nuclear renaissance.” At the peak of interest, the NRC had applications for 30 new reactors in the U.S. Only two would be built. The cheap natural gas of the fracking boom began to drive down electricity prices, razing nuclear’s profits. Newly subsidized renewables, like wind and solar, added even more electricity generation, further saturating the markets. When on March 11, 2011, an earthquake and subsequent tsunami rolled over Japan’s Fukushima Daiichi nuclear power plant, leading to the meltdown of all three of its reactors and the evacuation of 154,000 people, the industry’s coffin was fully nailed. Not only would there be no renaissance in the U.S., but the existing plants had to justify their safety. Japan shut down 46 of its 50 operating reactors. Germany closed 11 of its 17. The U.S. fleet held on politically, but struggled to compete economically. Since Fukushima, 12 U.S. reactors have begun decommissioning, with three more planned.

At MIT, Cochran and DeWitte—who were teaching assistants together for a nuclear reactor class in 2009, and married in 2011—were frustrated by the setback. “It was like, There’re all these cool technologies out there. Let’s do something with it,” says Cochran. But the nuclear industry has never been an easy place for innovators. In the U.S., its operational ranks have long been dominated by “ring knockers”—the officer corps of the Navy’s nuclear fleet, properly trained in the way things are done, but less interested in doing them differently. Governments had always kept a tight grip on nuclear; for decades, the technology was under shrouds. The personal computing revolution, and then the wild rise of the Internet, further drained engineering talent. From DeWitte and Cochran’s perspective, the nuclear-energy industry had already ossified by the time Fukushima and fracking totally brought things to a halt. “You eventually got to the point where it’s like, we have to try something different,” DeWitte says.

He and Cochran began to discreetly convene their MIT classmates for brainstorming sessions. Nuclear folks tend to be dogmatic about their favorite method of splitting atoms, but they stayed agnostic. “I didn’t start thinking we had to do everything differently,” says DeWitte. Rather, they had a hunch that marginal improvements might yield major results, if they could be spread across all of the industry’s usual snags—whether regulatory approaches, business models, the engineering of the systems themselves, or the challenge of actually constructing them.

In 2013, Cochran and DeWitte began to rent out the spare room in their Cambridge home on Airbnb. Their first guests were a pair of teachers from Alaska. The remote communities they taught in were dependent on diesel fuel for electricity, brought in at enormous cost. That energy scarcity created an opportunity: in such an environment, even a very expensive nuclear reactor might still be cheaper than the current system. The duo targeted a price of $100 per megawatt hour, more than double typical energy costs. They imagined using this high-cost early market as a pathway to scale their manufacturing. They realized that to make it work economically, they wouldn’t have to reinvent the reactor technology, only the production and sales processes. They decided to own their reactors and supply electricity, rather than supply the reactors themselves—operating more like today’s solar or wind developers. “It’s less about the technology being different,” says DeWitte, “than it is about approaching the entire process differently.”

That maverick streak raised eyebrows among nuclear veterans—and cash from Silicon Valley venture capitalists, including a boost from Y Combinator, where companies like Airbnb and Instacart got their start. In the eight years since, Oklo has distinguished itself from the competition by thinking smaller and moving faster. There are others competing in this space: NuScale, based in Oregon, is working to commercialize a reactor similar in design to existing nuclear plants, but constructed in 60-megawatt modules. TerraPower, founded by Bill Gates in 2006, has plans for a novel technology that uses its heat for energy storage, rather than to spin a turbine, which makes it an even more flexible option for electric grids that increasingly need that pliability. And X-energy, a Maryland-based firm that has received substantial funding from the U.S. Department of Energy, is developing 80-megawatt reactors that can also be grouped into “four-packs,” bringing them closer in size to today’s plants. Yet all are still years—and a billion dollars—away from their first installations. Oklo brags that its NRC application is 20 times shorter than NuScale’s, and its proposal cost 100 times less to develop. (Oklo’s proposed reactor would produce one-fortieth the power of NuScale’s.) NRC accepted Oklo’s application for review in March 2020, and regulations guarantee that process will be complete within three years. Oklo plans to power on around 2023, at a site at the Idaho National Laboratory, one of the U.S.’s oldest nuclear-research sites, and so already approved for such efforts. Then comes the hard part: doing it again and again, booking enough orders to justify building a factory to make many more reactors, driving costs down, and hoping politicians and activists worry more about the menace of greenhouse gases than the hazards of splitting atoms.

Nuclear-industry veterans remain wary. They have seen this all before. Westinghouse’s AP1000 reactor, first approved by the NRC in 2005, was touted as the flagship technology of Obama’s nuclear renaissance. It promised to be safer and simpler, using gravity rather than electricity-driven pumps to cool the reactor in case of an emergency—in theory, this would mitigate the danger of power outages, like the one that led to the Fukushima disaster. Its components could be constructed at a centralized location, and then shipped in giant pieces for assembly.

But all that was easier said than done. Westinghouse and its contractors struggled to manufacture the components according to nuclear’s mega-exacting requirements, and in the end only one AP1000 project in the U.S. actually happened: the Vogtle Electric Generating Plant in Georgia. Approved in 2012, its two reactors were expected at the time to cost $14 billion and be completed in 2016 and 2017, but costs have ballooned to $25 billion. The first will open, finally, next year.

Oklo and its competitors insist things are different this time, but they have yet to prove it. “Because we haven’t built one of them yet, we can promise that they’re not going to be a problem to build,” quips Gregory Jaczko, a former NRC chair who has since become the technology’s most biting critic. “So there’s no evidence of our failure.”

The Challenge

The cooling tower of the Hope Creek nuclear plant rises 50 stories above Artificial Island, New Jersey, built up on the marshy edge of the Delaware River. The three reactors here—one belonging to Hope Creek, and two run by the Salem Generating Station, which shares the site—generate an astonishing 3,465 megawatts of electricity, or roughly 40% of New Jersey’s total supply. Construction began in 1968, and was completed in 1986. Their closest human neighbors are across the river in Delaware. Otherwise the plant is surrounded by protected marshlands, pocked with radiation sensors and the occasional guard booth. Of the 1,500 people working here, around 100 are licensed reactor operators—a special designation given by the NRC, and held by fewer than 4,000 people in the country.

Among the newest in their ranks is Judy Rodriguez, an Elizabeth, N.J., native and another MIT grad. “Do I have your permission to enter?” she asks the operator on duty in the control room for the Salem Two reactor, which came online in 1981 and is capable of generating 1,200 megawatts of power. The operator opens a retractable belt barrier, like at an airport, and we step across a thick red line in the carpet. A horseshoe-shaped gray cabinet holds hundreds of buttons, glowing indicators and blinking lights, but a red LED counter at the center of the wall shows the most important number in the room: 944 megawatts, the amount of power the Salem Two reactor was generating that afternoon in September. Beside it is a circular pattern of square indicator lights showing the uranium fuel assemblies inside the core, deep inside the concrete domed containment building a couple hundred yards away. Salem Two has 764 of these assemblies; each is about 6 in. square and 15 ft. tall. They contain the source of the reactor’s energy: materials that are among the most guarded and controlled on earth. To make sure no one working there forgets that fact, a phrase is painted on walls all around the plant: “Line of Sight to the Reactor.”

As the epitome of critical infrastructure, this station has been buffeted by the crises the U.S. has suffered in the past few decades. After 9/11, the three reactors here absorbed nearly $100 million in security upgrades. Everyone entering the plant passes through metal and explosives detectors, and radiation detectors on the way out. Walking between the buildings entails crossing a concrete expanse beneath high bullet-resistant enclosures (BREs). The plant has a guard corps with more members than any force in New Jersey besides the state police, and federal NRC rules mean that they don’t have to abide by state limitations on automatic weapons.

The scale and complexity of the operation is staggering—and expensive. “The place you’re sitting at right now costs us about $1.5 million to $2 million a day to run,” says Ralph Izzo, president and CEO of PSEG, New Jersey’s public utility company, which owns and operates the plants. “If those plants aren’t getting that in market, that’s a rough pill to swallow.” In 2019, the New Jersey Board of Public Utilities agreed to $300 million in annual subsidies to keep the three reactors running. The justification is simple: if the state wants to meet its carbon-reduction goals, keeping the plants online is essential, given that they supply 90% of the state’s zero-carbon energy. In September, the Illinois legislature came to the same conclusion as New Jersey, approving almost $700 million over five years to keep two existing nuclear plants open. The bipartisan infrastructure bill includes $6 billion in additional support (along with nearly $10 billion for development of future reactors). Even more is expected in the broader Build Back Better bill.

These subsidies—framed in both states as “carbon mitigation credits”—acknowledge the reality that nuclear plants cannot, on their own terms, compete economically with natural gas or coal. “There has always been a perception of this technology that never was matched by reality,” says Jaczko. The subsidies also show how climate change has altered the equation, but not decisively enough to guarantee nuclear’s future. Lawmakers and energy companies are coming to terms with nuclear’s new identity as clean power, deserving of the same economic incentives as solar and wind. Operators of existing plants want to be compensated for producing enormous amounts of carbon-free energy, according to Josh Freed, of Third Way, a Washington, D.C., think tank that champions nuclear power as a climate solution. “There’s an inherent benefit to providing that, and it should be paid for.” For the moment, that has brought some assurance to U.S. nuclear operators of their future prospects. “A megawatt of zero-carbon electricity that’s leaving the grid is no different from a new megawatt of zero-carbon electricity coming onto the grid,” says Kathleen Barrón, senior vice president of government and regulatory affairs and public policy at Exelon, the nation’s largest operator of nuclear reactors.

Globally, nations are struggling with the same equation. Germany and Japan both shuttered many of their plants after the Fukushima disaster, and saw their progress at reducing carbon emissions suffer. Germany has not built new renewables fast enough to meet its electricity needs, and has made up the gap with dirty coal and natural gas imported from Russia. Japan, under international pressure to move more aggressively to meet its carbon targets, announced in October that it would work to restart its reactors. “Nuclear power is indispensable when we think about how we can ensure a stable and affordable electricity supply while addressing climate change,” said Koichi Hagiuda, Japan’s minister of economy, trade and industry, at an October news conference. China is building more new nuclear reactors than any other country, with plans for as many as 150 by the 2030s, at an estimated cost of nearly half a trillion dollars. Long before that, in this decade, China will overtake the U.S. as the operator of the world’s largest nuclear-energy system.

The future won’t be decided by choosing between nuclear or solar power. Rather, it’s a technically and economically complicated balance of adding as much renewable energy as possible while ensuring a steady supply of electricity. At the moment, that’s easy. “There is enough opportunity to build renewables before achieving penetration levels that we’re worried about the grid having stability,” says PSEG’s Izzo. New Jersey, for its part, is aiming to add 7,500 megawatts of offshore wind by 2035—or about the equivalent of six new Salem-sized reactors. The technology to do that is readily at hand—Kansas alone has about that much wind power installed already.

The challenge comes when renewables make up a greater proportion of the electricity supply—or when the wind stops blowing. The need for “firm” generation becomes more crucial. “You cannot run our grid solely on the basis of renewable supply,” says Izzo. “One needs an interseasonal storage solution, and no one has come up with an economic interseasonal storage solution.”

Existing nuclear’s best pitch—aside from the very fact it exists already—is its “capacity factor,” the industry term for how often a plant meets its full energy-making potential. For decades, nuclear plants struggled with outages and long maintenance periods. Today, improvements in management and technology make them more likely to run continuously—or “breaker to breaker”—between planned refuelings, which usually occur every 18 months and take about a month. At Salem and Hope Creek, PSEG hangs banners in the hallways to celebrate each new record run without a maintenance breakdown. That improvement stretches across the industry. “If you took our performance back in the mid-’70s, and then look at our performance today, it’s equivalent to having built 30 new reactors,” says Maria Korsnick, president and CEO of the Nuclear Energy Institute, the industry’s main lobbying organization. That improved reliability has become its major calling card.
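The arithmetic behind the term is simple, and can be sketched in a few lines; the figures below are illustrative placeholders, not PSEG's actual output.

```python
# Capacity factor: actual energy produced over a period, divided by the
# energy the plant would produce running at full power the entire time.
capacity_mw = 1200            # nameplate capacity of a Salem Two-sized reactor
hours_in_year = 24 * 365
actual_mwh = 9_800_000        # hypothetical annual generation

capacity_factor = actual_mwh / (capacity_mw * hours_in_year)
print(f"{capacity_factor:.0%}")  # a near breaker-to-breaker year lands in the low 90s
```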

Over the next 20 years, nuclear plants will need to develop new tricks. “One of the new words in our vocabulary is flexibility,” says Marilyn Kray, vice president of nuclear strategy and development at Exelon, which operates 21 reactors. “Flexibility not only in the existing plants, but in the designs of the emerging ones, to make them even more flexible and adaptable to complement renewables.” Smaller plants can adapt more easily to the grid, but they can also serve new customers, like providing energy directly to factories, steel mills or desalination plants.

Bringing those small plants into operation could be worth it, but it won’t be easy. “You can’t just excuse away the thing that’s at the center of all of it, which is it’s just a hard technology to build,” says Jaczko, the former NRC chair. “It’s difficult to make these plants, it’s difficult to design them, it’s difficult to engineer them, it’s difficult to construct them. At some point, that’s got to be the obvious conclusion to this technology.”

But the equally obvious conclusion is we can no longer live without it. “The reality is, you have to really squint to see how you get to net zero without nuclear,” says Third Way’s Freed. “There’s a lot of wishful thinking, a lot of fingers crossed.”


Business

How can a payments strategy support business growth?

Source: Finance Derivative

Following the global economic upheaval brought on by the pandemic, businesses are once again prioritising growth on a global scale. While every business recognises the importance of expansion, their methods, obstacles, and risks differ greatly.

In the following article, Sonya Geelon, Chief Commercial Officer at Conferma, explores some of the most common challenges holding businesses back, and how by including innovative payments solutions in your payment strategy, you can successfully position your business to expand into global markets.

Barriers to global expansion

At Conferma, we wanted to know what businesses felt stood between them and their growth ambitions, so we spoke to 400 financial decision makers to find out.

The research, shared in our new Growth Ignition Index report, identified global expansion as a key priority for businesses looking to grow across all regions. Significant drivers included increasing customer demand (46 per cent), maintaining a consistent cashflow (36 per cent) and undertaking digital transformation (34 per cent). Businesses also highlighted a number of barriers, such as identifying valuable markets to expand into (27 per cent) and navigating complex cross-border payment systems (13 per cent). The following sheds light on some of the factors that businesses perceive to be hindering their growth.

Operational inefficiencies

It’s a well-known fact that operational efficiency is crucial for giving businesses a competitive edge. If your processes run smoothly and effectively, you’re likely in a good position to grow. However, a third (33 per cent) of businesses identified operational inefficiencies as a significant sticking point, particularly among small- and medium-sized organisations. This perhaps indicates that larger companies have already invested in boosting efficiency to a degree; even so, the issue was noted across businesses of all sizes.

Complex cross-border payments

Successful growth relies heavily on being able to make fast, seamless transactions. However, recent research from Rapyd found that 38 per cent of businesses experience delays of five days or more when sending or receiving international payments.[1] Costs and delays in cross-border transactions can have a significant impact on growth, cutting into revenues, restricting cash flow and complicating financial planning. Our own research highlighted this, with 14 per cent of businesses reporting slow and/or complex cross-border payments as a significant barrier to expansion.

So how can businesses overcome these challenges and unlock global growth?

Taking your payments strategy virtual

Amid the array of payment options available in the market, virtual cards have emerged as a versatile solution, valued by users globally. According to Juniper Research, the global value of virtual cards will more than triple in just five years, climbing from $1.9 trillion in 2021 to a staggering $6.8 trillion by 2026.[2]

So how do they work?

Virtual cards are essentially digital versions of traditional credit cards. The technology generates a 16-digit card number, allowing an employee to make payments without having to physically hand over a card. Instead, they provide the virtual card number, expiration date, and security code, just like they would with a regular credit or debit card.

Virtual cards come with built-in fraud and security features, enabling restrictions on usage. For instance, users can set a spending limit, restrict usage to a specific date range, or limit it to certain merchants. Any attempt to exceed the set amount, use the card at an unauthorised merchant, or spend outside the specified date range will result in a declined transaction.
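In code terms, those restriction checks behave roughly like the sketch below. This is an illustrative toy, not any provider's actual implementation; the field names, rules, and limits are assumptions.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class VirtualCard:
    number: str                  # the generated 16-digit card number
    valid_from: date             # start of the allowed date range
    valid_to: date               # end of the allowed date range
    spend_limit: float           # maximum total spend on this card
    allowed_merchants: set = field(default_factory=set)  # empty = any merchant
    spent: float = 0.0

def authorize(card: VirtualCard, merchant: str, amount: float, on: date) -> bool:
    """Approve the transaction only if it satisfies every restriction."""
    if not (card.valid_from <= on <= card.valid_to):
        return False  # outside the specified date range
    if card.allowed_merchants and merchant not in card.allowed_merchants:
        return False  # unauthorised merchant
    if card.spent + amount > card.spend_limit:
        return False  # would exceed the set amount
    card.spent += amount
    return True
```

A card issued for a single hotel stay, say, could carry a one-week window, one allowed merchant, and a limit just above the booking value; anything outside those bounds declines automatically.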

Using a virtual card provider allows access to extensive, pre-existing payments ecosystems. For example, Conferma connects 75+ card issuers and banks across the world. This enables businesses to use virtual cards in 62 different currencies, making international payments frictionless while mitigating costly cross-border fees. Virtual cards can also help boost cashflow and improve operational efficiency, automating reconciliation and cutting lengthy processing times. By removing convoluted payment processes, virtual cards give businesses the freedom to grow in the markets they deem most valuable, not just most accessible.

Of those surveyed, more than four out of five respondents (82 per cent) plan on expanding their virtual card usage in the next twelve months, with 64 per cent extending usage to additional payment needs. Businesses already using virtual cards also anticipate a substantial increase in the volume of payments they make virtually, with our data projecting a rise from 45 to 57 per cent of all payments being made using virtual cards in the next 12 months.

Virtual cards offer a compelling solution to the challenges limiting international growth by offering enhanced security, streamlined operational processes, and seamless cross-border transactions. By embracing virtual cards as a strategic tool, organisations can unlock opportunities for growth and innovation, empowering them to navigate the complexities of international commerce with ease.


[1] The 2023 State of Cross-Border Payments, Rapyd, 2023.

[2] Virtual Cards: B2B and B2C Applications, Competitive Analysis & Market Forecasts 2021-2026, Juniper Research.


Business

How can businesses make the cloud optional in their operations?

Max Alexander, Co-founder at Ditto

Modern business apps are built to be cloud-dependent. This is great for accessing limitless compute and data-storage capabilities, but when the connection to the cloud is poor or shuts down, business apps stop working, impacting revenue and service. If real-time data is needed for quick decision-making in fields like healthcare, a stalled app can put people in life-threatening situations.

Organisations in sectors as diverse as airlines, fast-food retail, and ecommerce have deskless staff who need digital tools accessible on smartphones, tablets and other devices to do their jobs. But because of widespread connectivity issues and outages, these organisations are beginning to consider how to ensure these tools can operate reliably when the cloud is not accessible.

The short answer is that building applications with a local-first architecture can help to ensure that they remain functional when disconnected from the internet. But then, why aren’t all apps built this way? The simple answer is that building and deploying cloud-only applications is much easier, as ready-made tools for developers help expedite a lot of the backend building process. The more complex answer is that a local-first architecture solves the issue of offline data accessibility but not the critical issue of offline data synchronisation: apps disconnected from the internet still have no way to share data across devices. That is where peer-to-peer data sync and mesh networking come into play.

Combining offline-first architecture with peer-to-peer data sync

In the real world, what does an application like this look like?

  • Apps must prioritise local data sync. Rather than sending data to a remote server, applications must be able to write data using its local database in the first instance, and then listen for changes from other devices, and recombine them as needed. Apps should utilise local transports such as Bluetooth Low Energy (BLE) and Peer-to-Peer WiFi (P2P Wi-Fi) to communicate data changes in the event that the internet, local server, or the cloud is not available.
  • Devices must be capable of creating real-time mesh networks. Nearby devices should be able to discover and communicate with one another, maintaining connections even in areas of limited or no connectivity.
  • Devices must transition seamlessly from online to offline (and vice versa). Combining local sync with mesh networking means that devices in the same mesh constantly update a local version of the database and opportunistically sync those changes with the cloud when it is available.
  • Mesh networks must be partitioned between large peers and small peers, so that smaller devices are not overwhelmed by trying to sync every piece of data. Small peers sync only the data they request, giving developers complete control over bandwidth usage and storage; this is vital when connectivity is erratic or critical data needs prioritising. Large peers, by contrast, sync as much data as they can, typically when there is full access to cloud-based systems.
  • Meshes must be ad hoc, enabling devices to join and leave whenever they need to. This also means there can be no central server that other devices rely on.
  • Devices must be compatible with all data at any time. All devices should account for incoming data with different schemas; if a device is offline and running an outdated app version, for example, it must still be able to read newer data and sync.
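The last two requirements can be sketched in a few lines. The function below, a hypothetical simplification rather than any vendor's actual algorithm, applies changes received from a peer using per-document last-writer-wins, and keeps unknown fields intact so a device running an older app version can still store and forward data written under a newer schema. Production systems typically use richer conflict-resolution strategies such as CRDTs.

```python
def merge_remote_changes(local_docs, remote_changes):
    """Apply changes received from a peer, last-writer-wins per document.

    Each change is a dict carrying at least "id" and an "updated" timestamp.
    Unknown fields from a newer schema are kept as-is, so an outdated device
    can still hold and relay them through the mesh.
    """
    for change in remote_changes:
        current = local_docs.get(change["id"])
        if current is None or change["updated"] > current["updated"]:
            local_docs[change["id"]] = change  # the newer write wins
    return local_docs

# Device A holds an order; device B sends a newer version with an extra
# field ("allergy") that device A's app version has never seen.
local = {"order-1": {"id": "order-1", "updated": 100.0, "item": "burger"}}
remote = [{"id": "order-1", "updated": 200.0, "item": "burger", "allergy": "nuts"}]
merged = merge_remote_changes(local, remote)
print(merged["order-1"]["allergy"])  # "nuts": preserved despite the unknown field
```

Because each device applies the same deterministic rule, peers that exchange their outboxes in any order still converge on the same state once all changes have propagated.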

Peer-to-peer sync and mesh networking in practice

Let us take a look at a point-of-sale application in the fast-paced environment of a quick-service restaurant. When an order is taken at a kiosk or counter, that data must travel hundreds of miles to a data centre in order to reach a device four metres away in the kitchen. This is an inefficient process that can slow down or even halt operations, especially if there is an internet outage or any issue with the cloud.
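The routing decision such a point-of-sale app might make can be sketched as a simple fallback chain: prefer the local mesh, fall back to the cloud, and queue locally when neither is reachable. The function and return values here are hypothetical, purely to illustrate the idea.

```python
def send_order(order, peers_reachable, cloud_reachable):
    """Route an order to the kitchen, preferring the shortest path
    (illustrative sketch; not a real transport API)."""
    if peers_reachable:
        return "sent-via-mesh"   # e.g. BLE or P2P Wi-Fi straight to the kitchen display
    if cloud_reachable:
        return "sent-via-cloud"  # the traditional round trip through the data centre
    return "queued-locally"      # synced opportunistically once any link returns

print(send_order({"item": "fries"}, peers_reachable=True, cloud_reachable=False))   # sent-via-mesh
print(send_order({"item": "fries"}, peers_reachable=False, cloud_reachable=False))  # queued-locally
```

Note that the order is never lost: the worst case is a local queue, which is exactly the resilience property the restaurant scenario below depends on.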

A major fast-food restaurant in the US has already modernised its point of sale system using this new architecture and created one that can move order data between store devices independently of an internet connection. As such, this system is much more resilient in the face of outages, ensuring employees can always deliver best-in-class service, regardless of internet connectivity.

The power of cloud-optional computing is showcased in healthcare settings in rural areas of developing countries. Using both peer-to-peer data sync and mesh networking, essential healthcare applications can share critical health information without the internet or a connection to the cloud. Healthcare workers in disconnected environments can quickly process information and share it with relevant colleagues, enabling faster reaction times that can save lives.

Although the shift from cloud-only to cloud-optional is subtle and will not be obvious to end users, it really is a fundamental paradigm shift. This move provides a number of business opportunities for increasing revenue and efficiencies and helps ensure sustained service for customers.


When something personal fills an important gap in the market 

by Cécile Mazuet-Eller, founder of NameSwitch

There aren’t many business ideas that go from a personal experience to filling an important gap in the market. But this is certainly the case for NameSwitch, the UK’s pioneering and only name-changing support service, launched in 2018. What inspired its inception, and what challenges did it face? Here, Cécile Mazuet-Eller, founder of the company, now in its seventh year, explains.

My entrepreneurial journey is a bit unusual in that it started from my own experience of going through a divorce, which became a turning point for me not only emotionally but practically too. I wanted to remove my married name, and I had a visceral reason to do so: I really didn’t want to keep it. Feeling extremely frustrated at still receiving letters and official documents featuring my previous name, I was desperate to change it, but like so many people I found it a stop-start, arduous task.

Once I started the process, I realised it was taking up far too much time I didn’t have; being a single mum to two young children and working full-time is no mean feat, so when I embarked on the name-changing process I realised it wasn’t going to be easy. Searching for a solution to help, all I could find was a service covering the US and Canada, but nothing that worked for the UK. In the end, I spent a whole year getting everything changed that had to be, which proved long and stressful to say the least.

Nurturing the idea

In the early days I was fortunate enough to be surrounded by positive people who had good contacts and who saw the viability of my idea. Living in a small community filled with intelligent and well-rounded people, I wasn’t short of encouragement from them and friends, who recognised as well as I did that there was a definite gap in the market. I worked with a recommended web development team in Serbia and enlisted additional help from a university student for some research.

I had always wanted to run my own business, and there were several reasons why I needed to embark on something new. As the only breadwinner in the house, I faced mounting bills while balancing the demands of motherhood and other financial responsibilities. Cash was limited, but what little I had I used carefully and put into the business.

In the early stages, which included the development of the unique technology that underpins the service, I carved out pockets of time at night and on weekends to create a strong foundation for the business. Creating something completely from scratch was like a form of healing, which is why it was, and remains, such a personal project.

After mulling over the idea for at least two years following the original lightbulb moment, I registered the business in 2015, allowing time to build the robust platform needed to create a viable product. Drawing on my previous experience, I investigated overseas equivalents, financials and marketing intelligence to ensure there was a genuine need for the service in the UK. Fortunately, I was able to share my plans with my employer at the time, who turned out to be one of my biggest supporters, becoming my first paying customer when he purchased a NameSwitch for his ex-wife, who was getting married to someone else!

With a career in telecommunications and a degree in marketing, I was already used to hard work, and having the support and encouragement of my telecoms team was extremely helpful.

Support and coaching

Coaching was an important element of the start-up process, obtained through a wider network, along with some financial support from family; no other funding or investment was available.

The challenges

Like all businesses, we were presented with certain obstacles; there was a lot to juggle and at times it felt like too much, but I managed to navigate the complexities involved. When Covid hit, that was a huge setback, given that our biggest target market was, and still is, newly-weds. With all weddings banned, it hit NameSwitch hard, but our saving grace was the people who used lockdown to change their names, doing something they previously hadn’t had time for. As I was 100% employed by the business by this stage, it turned into a year of survival and another big challenge.

In 2022-2023 we concentrated on growth for NameSwitch. Once my dedicated team and I were satisfied with the service, it was time to consider investment in PR, advertising and partnerships to increase brand awareness and reach the revenues that were needed.

For 2022-2024, it was forecast that 285,000-415,000 weddings would take place as a result of postponements during the pandemic, which has reflected well on the business in recent years. Amidst the trials and tribulations, it has proved both exhilarating and exhausting in equal measure.

With hindsight, there are certain things I’d have done differently, such as bringing in a partner early on to put us in a stronger position sooner, and adding more resources to improve growth, but I know that’s all part of the steep learning curve and something to take with me to future projects.

Advice for aspiring entrepreneurs

For anyone contemplating their own entrepreneurial endeavours, I’d say ‘one hundred per cent go for it’ – but do not bet the house on it, and whatever happens, embrace the journey.


Copyright © 2021 Futures Parity.