Business
A New Generation of Nuclear Reactors Could Hold the Key to a Green Future

Source: Time
On a conference-room whiteboard in the heart of Silicon Valley, Jacob DeWitte sketches his startup’s first product. In red marker, it looks like a beer can in a Koozie, stuck with a crazy straw. In real life, it will be about the size of a hot tub, and made from an array of exotic materials, like zirconium and uranium. Under carefully controlled conditions, they will interact to produce heat, which in turn will make electricity—1.5 megawatts’ worth, enough to power a neighborhood or a factory. DeWitte’s little power plant will run for a decade without refueling and, amazingly, will emit no carbon. “It’s a metallic thermal battery,” he says, coyly. But more often DeWitte calls it by another name: a nuclear reactor.
Fission isn’t for the faint of heart. Building a working reactor—even a very small one—requires precise and painstaking efforts of both engineering and paper pushing. Regulations are understandably exhaustive. Fuel is hard to come by—they don’t sell uranium at the Gas-N-Sip. But DeWitte plans to flip the switch on his first reactor around 2023, a mere decade after co-founding his company, Oklo. After that, they want to do for neighborhood nukes what Tesla has done for electric cars: use a niche and expensive first version as a stepping stone toward cheaper, bigger, higher-volume products. In Oklo’s case, that means starting with a “microreactor” designed for remote communities, like Alaskan villages, currently dependent on diesel fuel trucked, barged or even flown in, at an exorbitant expense. Then building more and incrementally larger reactors until their zero-carbon energy source might meaningfully contribute to the global effort to reduce fossil-fuel emissions.
At global climate summits, in the corridors of Congress and at statehouses around the U.S., nuclear power has become the contentious keystone of carbon reduction plans. Everyone knows they need it. But no one is really sure they want it, given its history of accidents. Or even if they can get it in time to reach urgent climate goals, given how long it takes to build. Oklo is one of a growing handful of companies working to solve those problems by putting reactors inside safer, easier-to-build and smaller packages. None of them are quite ready to scale to market-level production, but given the investments being made into the technology right now, along with an increasing realization that we won’t be able to shift away from fossil fuels without nuclear power, it’s a good bet that at least one of them becomes a game changer.
If existing plants are the energy equivalent of a 2-liter soda bottle, with giant, 1,000-megawatt-plus reactors, Oklo’s strategy is to make reactors by the can. The per-megawatt construction costs might be higher, at least at first. But producing units in a factory would give the company a chance to improve its processes and to lower costs. Oklo would pioneer a new model. Nuclear plants need no longer be bet-the-company big, even for giant utilities. Venture capitalists can get behind the potential to scale to a global market. And climate hawks should fawn over a zero-carbon energy option that complements burgeoning supplies of wind and solar power. Unlike today’s plants, which run most efficiently at full blast, making it challenging for them to adapt to a grid increasingly powered by variable sources (not every day is sunny, or windy), the next generation of nuclear technology wants to be more flexible, able to respond quickly to ups and downs in supply and demand.
Engineering these innovations is hard. Oklo’s 30 employees are busy untangling the knots of safety and complexity that sent the cost of building nuclear plants to the stratosphere and all but halted their construction in the U.S. “If this technology was brand-new—like if fission was a recent breakthrough out of a lab, 10 or 15 years ago—we’d be talking about building our 30th reactor,” DeWitte says.
But fission is an old, and fraught, technology, and utility companies are scrambling now to keep their existing gargantuan nuclear plants open. Economically, they struggle to compete with cheap natural gas, along with wind and solar, often subsidized by governments. Yet climate-focused nations like France and the U.K. that had planned to phase out nuclear are instead doubling down. (In October, French President Emmanuel Macron backed off plans to close 14 reactors, and in November, he announced the country would instead start building new ones.) At the U.N. climate summit in Glasgow, the U.S. announced its support for Poland, Kenya, Ukraine, Brazil, Romania and Indonesia to develop their own new nuclear plants—while European negotiators argued over whether nuclear energy counts as “green.” All the while, Democrats and Republicans are (to everyone’s surprise) often aligned on nuclear’s benefits—and, in many cases, putting their powers of the purse behind it, both to keep old plants open in the U.S. and speed up new technologies domestically and overseas.
It makes for a decidedly odd moment in the life of a technology that already altered the course of one century, and now wants to make a difference in another. There are 93 operating nuclear reactors in the U.S.; combined, they supply 20% of U.S. electricity, and 50% of its carbon-free electricity. Nuclear should be a climate solution, satisfying both technical and economic needs. But while the existing plants finally operate with enviable efficiency (after 40 years of working out the kinks), the next generation of designs is still a decade away from being more than a niche player in our energy supply. Everyone wants a steady supply of electricity, without relying on coal. Nuclear is paradoxically right at hand, and out of reach.
For that to change, “new nuclear” has to emerge before the old nuclear plants recede. It has to keep pace with technological improvements in other realms, like long-term energy storage, where each incremental improvement increases the potential for renewables to supply more of our electricity. It has to be cheaper than carbon-capture technologies, which would allow flexible gas plants to operate without climate impacts (but are still too expensive to build at scale). And finally it has to arrive before we give up—before the specter of climate catastrophe creates a collective “doomerism,” and we stop trying to change.
Not everyone thinks nuclear can reinvent itself in time. “When it comes to averting the imminent effects of climate change, even the cutting edge of nuclear technology will prove to be too little, too late,” predicts Allison Macfarlane, former chair of the U.S. Nuclear Regulatory Commission (NRC)—the government agency singularly responsible for permitting new plants. Can a stable, safe, known source of energy rise to the occasion, or will nuclear be cast aside as too expensive, too risky and too late?
Trying Again
Nuclear began in a rush. In 1942, in the lowest mire of World War II, the U.S. began the Manhattan Project, the vast effort to develop atomic weapons. It employed 130,000 people at secret sites across the country, the most famous of which was Los Alamos Laboratory, near Albuquerque, N.M., where Robert Oppenheimer led the design and construction of the first atomic bombs. DeWitte, 36, grew up nearby. Even as a child of the ’90s, he was steeped in the state’s nuclear history, and preoccupied with the terrifying success of its engineering and the power of its materials. “It’s so incredibly energy dense,” says DeWitte. “A golf ball of uranium would power your entire life!”
DeWitte has taken that bromide almost literally. He co-founded Oklo in 2013 with Caroline Cochran, while both were graduate students in nuclear engineering at the Massachusetts Institute of Technology. When they arrived in Cambridge, Mass., in 2007 and 2008, the nuclear industry was on a precipice. Then presidential candidate Barack Obama espoused a new eagerness to address climate change by reducing carbon emissions—which at the time meant less coal, and more nuclear. (Wind and solar energy were still a blip.) It was an easy sell. In competitive power markets, nuclear plants were profitable. The 104 operating reactors in the U.S. at the time were running smoothly. There hadn’t been a major accident since Chernobyl, in 1986.
The industry excitedly prepared for a “nuclear renaissance.” At the peak of interest, the NRC had applications for 30 new reactors in the U.S. Only two would be built. The cheap natural gas of the fracking boom began to drive down electricity prices, razing nuclear’s profits. Newly subsidized renewables, like wind and solar, added even more electricity generation, further saturating the markets. When on March 11, 2011, an earthquake and subsequent tsunami rolled over Japan’s Fukushima Daiichi nuclear power plant, leading to the meltdown of all three of its reactors and the evacuation of 154,000 people, the industry’s coffin was fully nailed. Not only would there be no renaissance in the U.S., but the existing plants had to justify their safety. Japan shut down 46 of its 50 operating reactors. Germany closed 11 of its 17. The U.S. fleet held on politically, but struggled to compete economically. Since Fukushima, 12 U.S. reactors have begun decommissioning, with three more planned.
At MIT, Cochran and DeWitte—who were teaching assistants together for a nuclear reactor class in 2009, and married in 2011—were frustrated by the setback. “It was like, There’re all these cool technologies out there. Let’s do something with it,” says Cochran. But the nuclear industry has never been an easy place for innovators. In the U.S., its operational ranks have long been dominated by “ring knockers”—the officer corps of the Navy’s nuclear fleet, properly trained in the way things are done, but less interested in doing them differently. Governments had always kept a tight grip on nuclear; for decades, the technology was under shrouds. The personal computing revolution, and then the wild rise of the Internet, further drained engineering talent. From DeWitte and Cochran’s perspective, the nuclear-energy industry had already ossified by the time Fukushima and fracking totally brought things to a halt. “You eventually got to the point where it’s like, we have to try something different,” DeWitte says.
He and Cochran began to discreetly convene their MIT classmates for brainstorming sessions. Nuclear folks tend to be dogmatic about their favorite method of splitting atoms, but they stayed agnostic. “I didn’t start thinking we had to do everything differently,” says DeWitte. Rather, they had a hunch that marginal improvements might yield major results, if they could be spread across all of the industry’s usual snags—whether regulatory approaches, business models, the engineering of the systems themselves, or the challenge of actually constructing them.
In 2013, Cochran and DeWitte began to rent out the spare room in their Cambridge home on Airbnb. Their first guests were a pair of teachers from Alaska. The remote communities they taught in were dependent on diesel fuel for electricity, brought in at enormous cost. That energy scarcity created an opportunity: in such an environment, even a very expensive nuclear reactor might still be cheaper than the current system. The duo targeted a price of $100 per megawatt hour, more than double typical energy costs. They imagined using this high-cost early market as a pathway to scale their manufacturing. They realized that to make it work economically, they wouldn’t have to reinvent the reactor technology, only the production and sales processes. They decided to own their reactors and supply electricity, rather than supply the reactors themselves—operating more like today’s solar or wind developers. “It’s less about the technology being different,” says DeWitte, “than it is about approaching the entire process differently.”
That maverick streak raised eyebrows among nuclear veterans—and cash from Silicon Valley venture capitalists, including a boost from Y Combinator, where companies like Airbnb and Instacart got their start. In the eight years since, Oklo has distinguished itself from the competition by thinking smaller and moving faster. There are others competing in this space: NuScale, based in Oregon, is working to commercialize a reactor similar in design to existing nuclear plants, but constructed in 60-megawatt modules. TerraPower, founded by Bill Gates in 2006, has plans for a novel technology that uses its heat for energy storage, rather than to spin a turbine, which makes it an even more flexible option for electric grids that increasingly need that pliability. And X-energy, a Maryland-based firm that has received substantial funding from the U.S. Department of Energy, is developing 80-megawatt reactors that can also be grouped into “four-packs,” bringing them closer in size to today’s plants. Yet all are still years—and a billion dollars—away from their first installations. Oklo brags that its NRC application is 20 times shorter than NuScale’s, and its proposal cost 100 times less to develop. (Oklo’s proposed reactor would produce one-fortieth the power of NuScale’s.) NRC accepted Oklo’s application for review in March 2020, and regulations guarantee that process will be complete within three years. Oklo plans to power on around 2023, at a site at the Idaho National Laboratory, one of the U.S.’s oldest nuclear-research sites, and so already approved for such efforts. Then comes the hard part: doing it again and again, booking enough orders to justify building a factory to make many more reactors, driving costs down, and hoping politicians and activists worry more about the menace of greenhouse gases than the hazards of splitting atoms.
Nuclear-industry veterans remain wary. They have seen this all before. Westinghouse’s AP1000 reactor, first approved by the NRC in 2005, was touted as the flagship technology of Obama’s nuclear renaissance. It promised to be safer and simpler, using gravity rather than electricity-driven pumps to cool the reactor in case of an emergency—in theory, this would mitigate the danger of power outages, like the one that led to the Fukushima disaster. Its components could be constructed at a centralized location, and then shipped in giant pieces for assembly.
But all that was easier said than done. Westinghouse and its contractors struggled to manufacture the components according to nuclear’s mega-exacting requirements, and in the end, only one AP1000 project in the U.S. actually happened: the Vogtle Electric Generating Plant in Georgia. Approved in 2012, its two reactors were expected at the time to cost $14 billion and be completed in 2016 and 2017, but costs have ballooned to $25 billion. The first will open, finally, next year.
Oklo and its competitors insist things are different this time, but they have yet to prove it. “Because we haven’t built one of them yet, we can promise that they’re not going to be a problem to build,” quips Gregory Jaczko, a former NRC chair who has since become the technology’s most biting critic. “So there’s no evidence of our failure.”
The Challenge
The cooling tower of the Hope Creek nuclear plant rises 50 stories above Artificial Island, New Jersey, built up on the marshy edge of the Delaware River. The three reactors here—one belonging to Hope Creek, and two run by the Salem Generating Station, which shares the site—generate an astonishing 3,465 megawatts of electricity, or roughly 40% of New Jersey’s total supply. Construction began in 1968, and was completed in 1986. Their closest human neighbors are across the river in Delaware. Otherwise the plant is surrounded by protected marshlands, pocked with radiation sensors and the occasional guard booth. Of the 1,500 people working here, around 100 are licensed reactor operators—a special designation given by the NRC, and held by fewer than 4,000 people in the country.
Among the newest in their ranks is Judy Rodriguez, an Elizabeth, N.J., native and another MIT grad. “Do I have your permission to enter?” she asks the operator on duty in the control room for the Salem Two reactor, which came online in 1981 and is capable of generating 1,200 megawatts of power. The operator opens a retractable belt barrier, like at an airport, and we step across a thick red line in the carpet. A horseshoe-shaped gray cabinet holds hundreds of buttons, glowing indicators and blinking lights, but a red LED counter at the center of the wall shows the most important number in the room: 944 megawatts, the amount of power the Salem Two reactor was generating that afternoon in September. Beside it is a circular pattern of square indicator lights showing the uranium fuel assemblies inside the core, deep inside the concrete domed containment building a couple hundred yards away. Salem Two’s core holds 764 of these fuel assemblies; each is about 6 in. square and 15 ft. tall. They contain the source of the reactor’s energy, uranium fuel that is among the most guarded and controlled materials on earth. To make sure no one working there forgets that fact, a phrase is painted on walls all around the plant: “Line of Sight to the Reactor.”
As the epitome of critical infrastructure, this station has been buffeted by the crises the U.S. has suffered in the past few decades. After 9/11, the three reactors here absorbed nearly $100 million in security upgrades. Everyone entering the plant passes through metal- and explosives detectors, and radiation detectors on the way out. Walking between the buildings entails crossing a concrete expanse beneath high bullet-resistant enclosures (BREs). The plant’s guard corps has more members than any force in New Jersey besides the state police, and federal NRC rules mean its guards don’t have to abide by state limitations on automatic weapons.
The scale and complexity of the operation is staggering—and expensive. “The place you’re sitting at right now costs us about $1.5 million to $2 million a day to run,” says Ralph Izzo, president and CEO of PSEG, New Jersey’s public utility company, which owns and operates the plants. “If those plants aren’t getting that in market, that’s a rough pill to swallow.” In 2019, the New Jersey Board of Public Utilities agreed to $300 million in annual subsidies to keep the three reactors running. The justification is simple: if the state wants to meet its carbon-reduction goals, keeping the plants online is essential, given that they supply 90% of the state’s zero-carbon energy. In September, the Illinois legislature came to the same conclusion as New Jersey, approving almost $700 million over five years to keep two existing nuclear plants open. The bipartisan infrastructure bill includes $6 billion in additional support (along with nearly $10 billion for development of future reactors). Even more is expected in the broader Build Back Better bill.
These subsidies—framed in both states as “carbon mitigation credits”—acknowledge the reality that nuclear plants cannot, on their own terms, compete economically with natural gas or coal. “There has always been a perception of this technology that never was matched by reality,” says Jaczko. The subsidies also show how climate change has altered the equation, but not decisively enough to guarantee nuclear’s future. Lawmakers and energy companies are coming to terms with nuclear’s new identity as clean power, deserving of the same economic incentives as solar and wind. Operators of existing plants want to be compensated for producing enormous amounts of carbon-free energy, according to Josh Freed, of Third Way, a Washington, D.C., think tank that champions nuclear power as a climate solution. “There’s an inherent benefit to providing that, and it should be paid for.” For the moment, that has brought some assurance to U.S. nuclear operators of their future prospects. “A megawatt of zero-carbon electricity that’s leaving the grid is no different from a new megawatt of zero-carbon electricity coming onto the grid,” says Kathleen Barrón, senior vice president of government and regulatory affairs and public policy at Exelon, the nation’s largest operator of nuclear reactors.
Globally, nations are struggling with the same equation. Germany and Japan both shuttered many of their plants after the Fukushima disaster, and saw their progress at reducing carbon emissions suffer. Germany has not built new renewables fast enough to meet its electricity needs, and has made up the gap with dirty coal and natural gas imported from Russia. Japan, under international pressure to move more aggressively to meet its carbon targets, announced in October that it would work to restart its reactors. “Nuclear power is indispensable when we think about how we can ensure a stable and affordable electricity supply while addressing climate change,” said Koichi Hagiuda, Japan’s minister of economy, trade and industry, at an October news conference. China is building more new nuclear reactors than any other country, with plans for as many as 150 by the 2030s, at an estimated cost of nearly half a trillion dollars. Long before that, in this decade, China will overtake the U.S. as the operator of the world’s largest nuclear-energy system.
The future won’t be decided by choosing between nuclear or solar power. Rather, it’s a technically and economically complicated balance of adding as much renewable energy as possible while ensuring a steady supply of electricity. At the moment, that’s easy. “There is enough opportunity to build renewables before achieving penetration levels that we’re worried about the grid having stability,” says PSEG’s Izzo. New Jersey, for its part, is aiming to add 7,500 megawatts of offshore wind by 2035—or about the equivalent of six new Salem-sized reactors. The technology to do that is readily at hand—Kansas alone has about that much wind power installed already.
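The “six new Salem-sized reactors” comparison is straightforward nameplate arithmetic, sketched below; note that it ignores capacity factor, since offshore wind produces well below its rated output on average while nuclear runs near full power.

```python
# Nameplate-capacity arithmetic behind the "six Salem-sized reactors" figure.
# Caveat: nameplate ratings overstate wind's average output, because offshore
# wind's capacity factor is far below nuclear's.
offshore_wind_target_mw = 7_500  # New Jersey's 2035 offshore wind goal
salem_sized_reactor_mw = 1_200   # approximate rating of one Salem reactor

equivalent_reactors = offshore_wind_target_mw / salem_sized_reactor_mw
print(round(equivalent_reactors, 2))  # 6.25, i.e. "about six"
```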
The challenge comes when renewables make up a greater proportion of the electricity supply—or when the wind stops blowing. The need for “firm” generation becomes more crucial. “You cannot run our grid solely on the basis of renewable supply,” says Izzo. “One needs an interseasonal storage solution, and no one has come up with an economic interseasonal storage solution.”
Existing nuclear’s best pitch—aside from the very fact it exists already—is its “capacity factor,” the industry term for how often a plant meets its full energy-making potential. For decades, nuclear plants struggled with outages and long maintenance periods. Today, improvements in management and technology make them more likely to run continuously—or “breaker to breaker”—between planned refuelings, which usually occur every 18 months, and take about a month. At Salem and Hope Creek, PSEG hangs banners in the hallways to celebrate each new record run without a maintenance breakdown. That improvement stretches across the industry. “If you took our performance back in the mid-’70s, and then look at our performance today, it’s equivalent to having built 30 new reactors,” says Maria Korsnick, president and CEO of the Nuclear Energy Institute, the industry’s main lobbying organization. That improved reliability has become its major calling card today.
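Capacity factor is simply delivered energy divided by what continuous full-power operation would yield. A minimal sketch, assuming the 18-month cycle and roughly month-long refueling outage described above, with full power whenever the plant runs:

```python
def capacity_factor(actual_mwh: float, nameplate_mw: float, hours: float) -> float:
    """Fraction of the theoretical maximum output actually delivered."""
    return actual_mwh / (nameplate_mw * hours)

# Illustrative "breaker to breaker" run: an 18-month cycle with a one-month
# planned refueling outage, approximating a month as 30 days.
cycle_hours = 18 * 30 * 24
outage_hours = 30 * 24
running_hours = cycle_hours - outage_hours

nameplate_mw = 1_200                          # roughly Salem Two's rating
delivered_mwh = nameplate_mw * running_hours  # assume full power when running

cf = capacity_factor(delivered_mwh, nameplate_mw, cycle_hours)
print(f"{cf:.1%}")  # 94.4% over the cycle
```

Real-world fleet averages come in a little lower, since unplanned trips and maintenance shave off additional hours.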
Over the next 20 years, nuclear plants will need to develop new tricks. “One of the new words in our vocabulary is flexibility,” says Marilyn Kray, vice president of nuclear strategy and development at Exelon, which operates 21 reactors. “Flexibility not only in the existing plants, but in the designs of the emerging ones, to make them even more flexible and adaptable to complement renewables.” Smaller plants can adapt more easily to the grid, but they can also serve new customers, like providing energy directly to factories, steel mills or desalination plants.
Bringing those small plants into operation could be worth it, but it won’t be easy. “You can’t just excuse away the thing that’s at the center of all of it, which is it’s just a hard technology to build,” says Jaczko, the former NRC chair. “It’s difficult to make these plants, it’s difficult to design them, it’s difficult to engineer them, it’s difficult to construct them. At some point, that’s got to be the obvious conclusion to this technology.”
But the equally obvious conclusion is we can no longer live without it. “The reality is, you have to really squint to see how you get to net zero without nuclear,” says Third Way’s Freed. “There’s a lot of wishful thinking, a lot of fingers crossed.”
‘Tis the Season to be Wary: How to Protect Your Business from Holiday Season Hacking

The holiday season will soon be in full swing, but cybercriminals aren’t known for their holiday spirit. While consumers have traditionally been the prime targets for cybercriminals during the holiday season – lost in a frenzy of last-minute online shopping and unrelenting ads – companies are increasingly falling victim to calculated cyber attacks.
Against this backdrop of relaxed vigilance and festive distractions, cybercriminals are set to deploy everything from ransomware to phishing scams, all designed to capitalise on the holiday haze. Businesses that fail to prioritise their cybersecurity could end up embracing not so much “tidings of comfort and joy” as unwanted data breaches and service outages well into 2024.
Threat Landscape
With the usual winter disruptions about to kick into overdrive, opportunistic hackers are aiming to exploit organisational turmoil this holiday season. Industry research consistently indicates a substantial spike in cyber attacks targeting businesses during holidays, particularly when coupled with the following factors:
- Employee Burnout: Employee burnout is rife around the holidays. Trying to complete major projects or hit targets before the end of the year can require long hours and intense workweeks. Overloaded schedules, combined with the seasonal stressors of Christmas shopping, family politics, travel expenses and hosting duties, can leave a workforce exhausted and less effective.
- Vacation Days: The holiday season is a popular time for employees to use up their vacation days and paid time off. This means offices are often emptier than usual during late December and early January. With fewer people working on-site, critical security tasks can be neglected and gaps in defences widen.
- Network Strain: The holidays also mark a period of network strain due to increased traffic and network requests. Staff shortages also reduce organisational response capacity if systems are compromised. The result is company networks that are understaffed and overwhelmed.
Seasonal Cyber Attacks
There are many ways bad actors look to exploit system vulnerabilities and human errors to breach defences this time of year. But rather than relying solely on sophisticated hacking techniques, most holiday-fuelled cyber attacks succeed through tried-and-true threat vectors:
- Holiday-Themed Phishing and Smishing Campaigns: Emails and texts impersonating parcel carriers send fake tracking notifications with fraudulent links that deploy malware or capture account credentials when unwitting recipients click them to track deliveries. A momentary slip-up is all it takes to unleash malware payloads granting complete network access.
- Fake Charity Schemes: Malicious links masquerading as holiday philanthropy efforts compromise business accounts when employees donate through them.
- Remote Access Exploits: External connectivity to internal networks comes with the territory of the season. However, poorly configured cloud apps and public Wi-Fi access points create openings for criminals to intercept company data from inadequately protected employee devices off-site.
- Ransomware Presents: Empty offices combined with delayed threat detection give extortion malware time to wrap itself around entire company systems and customer data before unveiling a not-so-jolly ransom note on Christmas morning.
Without proper precautions, the impact from misdirected clicks or downloads can quickly spiral across business servers over the holidays, leading to widespread data breaches and stolen customer credentials.
Essential Steps to Safeguard Systems
While eliminating all risks remains unlikely and tight budgets preclude launching entirely new security initiatives this holiday season, businesses can deter threats and address seasonal shortcomings through several key actions:
Prioritise Core Software Updates
Hardening network infrastructure is the first line of defence this holiday season. With many software products reaching end-of-life in December, it is critical to upgrade network architectures and prioritise core software updates to eliminate known vulnerabilities. Segmenting internal networks and proactively patching software can cut off preferred access routes for bad actors, confining potential breaches when hacking attacks surge.
Cultivate a Culture of Cybersecurity Awareness
Cybersecurity awareness training makes employees more resilient to the social engineering campaigns and phishing links that increase during the holidays. Refreshing employees on spotting suspicious emails can thwart emerging hacking techniques. With more distractions and time out of the office this season, vigilance is more important than ever. Train your staff never to click a link directly from an email or text; even if they are expecting a delivery, they should go directly to the known, trusted source.
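The “go directly to the known, trusted source” rule can also be reinforced in tooling. A minimal sketch that checks a link’s hostname against an allowlist of carriers a business actually uses (the domain list here is purely illustrative):

```python
# Sketch of a link check against an allowlist of known carrier domains.
# TRUSTED_CARRIERS is a hypothetical example list, not a recommendation.
from urllib.parse import urlparse

TRUSTED_CARRIERS = {"royalmail.com", "dhl.com", "ups.com"}

def is_trusted_link(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Accept only exact matches or true subdomains of a trusted carrier.
    return any(host == d or host.endswith("." + d) for d in TRUSTED_CARRIERS)

print(is_trusted_link("https://track.dhl.com/parcel/123"))       # True
print(is_trusted_link("https://dhl.com.parcel-update.xyz/123"))  # False
```

Note the second example: a lookalike domain that merely contains “dhl.com” fails the check, which is exactly the trick holiday phishing campaigns rely on.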
Manage Remote Access Proactively
Criminals aggressively pursue any vulnerabilities exposed during the holiday period to intercept financial and customer data while defences lie dormant. Therefore, businesses should properly configure cloud apps and remote networks before the holiday season hits. This will minimise pathways for data compromise when employees eventually disconnect devices from company systems over the holidays.
Mandate Multifactor Authentication (MFA)
Most successful attacks stem from compromised user credentials. By universally mandating MFA across all access points this season, businesses add critical layers of identity verification to secure systems. With MFA fatigue setting in over the holidays, have backup verification methods ready to deter credential stuffing.
Prepare to Respond, Not Just Prevent
Despite precautions, holiday disasters can and do occur. Businesses need response plans for periods of disruption and reduced capacity. Have emergency communications prepared for customers and partners in case an attack disrupts operations. The time to prepare is before vacation schedules complicate incident response. It’s important to know how and when to bring in the right expertise if a crisis emerges.
By following best practices to prevent cybersecurity standards slipping before peak winter months, companies can enjoy the holidays without becoming victims of calculated cyber attacks. With swift and decisive action there is still time for businesses to prepare defences against holiday season hacks.
Transforming unified comms to future-proof your business

By Jonathan Wright, Director of Products and Operations at GCX
Telephony is not usually the first thing SMBs think about when it comes to their digital transformation. However, push and pull factors are bringing it up the priority list and leading them to rethink their approach.
Indeed, it is just one year until the PSTN (the copper-based telephone network) is switched off by BT Openreach. With a recent survey showing that as many as 88% of UK businesses rely on the PSTN, many organisations are being forced to review their communications ahead of the deadline.
But even if this change is being forced upon some, the benefits of building a more future-proofed unified communications strategy far outweigh the associated challenges. Nearly three-quarters of employees in UK SMEs now work partly or fully remotely, the highest percentage of any G7 country. Voice over Internet Protocol (VoIP) telephone systems are much better suited to distributed workforces, as the phone line is assigned on a user basis rather than to a fixed location.
And with more companies now integrating AI capabilities to augment their products and services – like Microsoft Teams Pro which leverages OpenAI for improved transcription, automated notes generation and recommended actions – the productivity-boosting benefits for users are only improving.
Making the right choice
For those companies that are seizing the opportunity to change their unified comms in 2024, what should they consider when making their decision?
- Choose platforms that will boost user adoption – User adoption will make or break the rollout of a new IT project. So due consideration should be given to what products or services will have the path of least resistance with employees. Choosing a service or graphical user interface (GUI) users are already used to, like Zoom or MS Teams, is likely to result in a higher adoption rate than a net new service.
- Embrace innovation with AI capabilities – While some of the services leveraging AI and Large Language Models (LLMs) to enhance their capabilities are more expensive than traditional VoIP, the productivity gains could offer an attractive return on investment for many small businesses. Claiming back the time spent typing up meeting notes, or improving the response time to customer calls with automatically generated actions, will both have tangible benefits for the business. That said, companies should consider what level of service makes sense for their business; they may not need the version with all the bells and whistles to make significant efficiency gains.
- Bring multiple services under a single platform – The proliferation of IT tools is becoming an increasing challenge in many businesses; it creates silos that hamper collaboration, leaves employees feeling overwhelmed by the sheer number of communications channels to manage, and leads to mounting costs on the business. Expanding the use of existing platforms, or retiring multiple solutions by bringing their features together in one new platform, benefits the business and user experience alike.
- Automate onboarding to reduce the burden on IT – Any changes to unified comms should aim to benefit all of the different stakeholders – and that includes the IT team tasked with implementing and managing it. Choosing platforms which support automated onboarding and activation, for example, will reduce the burden on IT when provisioning new tenants, as well as with the ongoing policy management. What’s more, it reduces the risk of human error when configuring the setup, improving overall security; in the case of Microsoft Teams, it can even negate the need for Microsoft PowerShell.
- Consider where you work – Employees are no longer just splitting their time between home and the office. Since the pandemic, more people are embracing the digital nomad lifestyle, while others are taking the opportunity to work more closely with clients on-site or at their offices. This should be considered in unified comms planning, as companies with employees working outside the UK will need to choose a geo-agnostic service.
- Stay secure – Don’t let security and data protection be an afterthought. Opt for platforms leveraging authentication protocols, strong encryption, and security measures to safeguard sensitive information and support compliance.
Making the right switch
As many small businesses start planning for changes to their telephony in 2024 ahead of the PSTN switch-off, it is important that they take the time to explore the particular requirements of their organisations, and how changes to their communications could better support their new working practices and boost productivity.
Business
Will your network let down your AI strategy?

Rob Quickenden, CTO at Cisilion
As companies start to evaluate how they can use AI effectively, there is a clear need to ensure your network is up to the challenges of AI first. AI applications are going to require your data to be easily accessible, and your network will need to be able to handle the huge compute needs of these new applications. It will also need to be secure at all points of access, from the different applications through to end users’ devices. If your network isn’t reliable, readily available and secure, it is likely going to fail.
In Cisco’s 2023 Networking Report, 41% of networking professionals across 2,500 global companies said that providing secure access to applications distributed across multiple cloud platforms is their key challenge, followed by gaining end-to-end visibility into network performance and security (37%).
So, what can you do to make your network AI ready?
First, you need to see AI as part of your digital transformation; then you need to look at where you need it and where you don’t. Jumping on the bandwagon and implementing AI for the sake of it isn’t the way forward. You need to have a clear strategy in place about where and how you are going to use AI. Setting up an AI taskforce to look at all aspects of your AI strategy is a good first step. It needs to identify how AI can help transform your business processes and free up time to focus on your core business. At the same time, it needs to make sure your infrastructure can handle your AI needs.
Enterprise networks and IT landscapes are growing more intricate every day. The demand for seamless connectivity has skyrocketed as businesses expand their digital footprint and hybrid working continues. The rise of cloud services, the Internet of Things (IoT), and data-intensive applications have placed immense pressure on traditional network infrastructures and AI will only increase this burden. AI requires much higher levels of compute power too. The challenge lies in ensuring consistent performance, security, and reliability across a dispersed network environment.
Use hybrid and multi-cloud to de-silo operations
According to Gartner’s predictions, by 2025, 51% of IT spending will shift to the cloud, underscoring the importance of having a robust and adaptable network infrastructure that can seamlessly integrate with cloud services. This is even more important with AI, as it needs to access data from different locations and sources across your business to be successful. For example, AI often requires data from different sources to train models and make predictions. A company that wants to develop an AI system to predict customer churn may need to access data from multiple sources such as customer demographics, purchase history and social media activity.
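The churn example above boils down to joining per-customer records from several silos into one feature table before any model is trained. A minimal sketch, with invented table and column names standing in for the real silos:

```python
# Hypothetical sketch: joining customer records from three data silos
# into one per-customer feature table for a churn model.
from collections import defaultdict

demographics = {1: {"age": 34}, 2: {"age": 51}, 3: {"age": 27}}
purchases = [(1, 20.0), (1, 35.0), (2, 12.5)]        # (customer_id, amount)
social = {1: {"mentions": 4}, 3: {"mentions": 1}}    # social media activity

# Aggregate purchase history per customer.
spend = defaultdict(float)
for cust, amount in purchases:
    spend[cust] += amount

# Left-join the silos on customer_id; missing data defaults to zero.
features = {}
for cust, demo in demographics.items():
    features[cust] = {
        "age": demo["age"],
        "total_spend": spend.get(cust, 0.0),
        "mentions": social.get(cust, {}).get("mentions", 0),
    }
```

In practice each silo sits behind a different system (CRM, billing, analytics), which is exactly why the de-siloed access the article describes matters: without a common join key and a path to each source, the feature table cannot be built.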
IT teams need to make sure that they are using hybrid cloud and multi-cloud to de-silo operations, bringing together network and security controls and visibility and allowing easy access to data. Where businesses use multiple cloud providers or hold some data on-premise, they need to review how that data will be used and how it can be accessed across departments.
Install the best security and network monitoring
It’s clear that as we develop AI for good, there is also a darker side weaponizing AI to create more sophisticated cyber-attacks. Businesses need end-to-end visibility into their network performance and security and to be able to provide secure access to applications distributed across multiple cloud platforms. This means having effective monitoring tools in place and the right layers of security – not only at the end user level but also across your network at all access points.
Being able to review and test the performance of your SaaS-based applications will also be key to the success of your AI solutions. AI requires apps to work harder and faster, so testing their speed, scalability and stability, and ensuring they are up to the job and can perform well under varying workloads, is important.
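Testing behaviour under varying workloads can start very simply: fire a batch of concurrent requests at an endpoint and look at the latency distribution. A hedged sketch using only the Python standard library, where `call_endpoint` is a stand-in stub rather than a real HTTP call:

```python
# Illustrative workload test: run N concurrent calls and report latency
# percentiles. call_endpoint is a stub; in practice it would issue a
# real request to the SaaS application under test.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def call_endpoint():
    """Stand-in for a real SaaS API call; returns elapsed seconds."""
    start = time.perf_counter()
    time.sleep(0.01)                 # simulate network/service latency
    return time.perf_counter() - start

def run_load_test(requests=50, concurrency=10):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(lambda _: call_endpoint(), range(requests)))
    return {
        "p50": statistics.median(latencies),
        "p95": latencies[int(len(latencies) * 0.95) - 1],
        "max": latencies[-1],
    }

print(run_load_test())
```

Repeating the run with increasing `concurrency` shows how the p95 figure degrades as load grows, which is the signal that matters for AI-driven traffic.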
Secure Access Service Edge
The best way to ensure your network security is as good as it can be is to simplify your tools and create consistency by using Secure Access Service Edge (SASE). This is an architecture that delivers converged network and security as service capabilities including SD-WAN and cloud native security functions such as secure web gateways, cloud access security brokers, firewall as-a-service, and zero-trust network access. SASE delivers wide area network and security controls as a cloud computing service directly to the source of connection rather than at the data centre which will protect your network and users more effectively.
SD-WAN connectivity
If you haven’t already, extending your SD-WAN connectivity consistently across multiple clouds to automate cloud-agnostic connectivity and optimise the application experience is a must. It will enable your organisation to securely connect users, applications and data across multiple locations while providing improved performance, reliability and scalability. SD-WAN also simplifies the management of WANs by providing centralised control and visibility over the entire network.
As we head towards the new era of AI, cloud is the new data centre, the Internet is the new network, and cloud offerings will dominate applications. By making sure the network is AI-ready, adopting a cloud-centric operating model, and maintaining a view of global Internet health and the performance of top SaaS applications, IT teams will be able to implement their company’s AI strategy successfully.