
Technology

From reactive to proactive: The future of cyber defence

By Nick Walker, Regional Director EMEA, NetSPI

There have been 2.29 million cyber-attacks on UK businesses in the last 12 months, with ransomware attacks increasing by 70%. These shocking figures not only highlight the growing scale of cyber threats but also show that businesses are becoming increasingly vulnerable to them.

It’s clear that cybersecurity is no longer confined to the realm of IT departments and tech experts; it has become a concern for every individual in the organisation. In fact, PwC’s Global Digital Trust Insights survey found that cyber budgets in 2024 are increasing at a much higher rate than last year, as “mega breaches” rise in number, scale and cost.

When security can’t keep up with the pace of innovation, the ability to deliver bottom-line results is at stake. To innovate with confidence, organisations need proactive security at the core of their cybersecurity programme. But what exactly is proactive security, why is there a growing need for this approach, and how can organisations ensure their security programmes aren’t just reactive?

Defining proactive security

Traditionally, many businesses have operated cybersecurity defensively or reactively: patching vulnerabilities or implementing a new security tool only after experiencing a breach, for instance. This is especially true of organisations in industries under significant regulatory or government compliance pressure, such as financial services or healthcare. That approach is no longer fit for purpose – which is where proactive security comes into play.

Eric Parizo, Principal Analyst at Omdia, defines proactive security as: “…technologies (including those provided as services) that enable organizations to seek out and mitigate likely threats and threat conditions before they pose a danger to the extended IT environment.”

Essentially, a proactive security approach hinges on anticipating cyber threats before they materialise into actual breaches. It involves staying one step ahead of the hackers by identifying vulnerabilities before bad actors can exploit them. In contrast to a reactive approach, dedicated security teams focus on the entire scope of an organisation’s security posture – specifically, how to identify, protect against, detect, and respond to risk.

Let’s also clarify what proactive security is not – it is not a collection of disjointed, temporary solutions that are one-trick ponies. That approach only creates more confusion and tool fatigue, and it may give a false sense of security if those solutions aren’t properly configured or validated. Nor is proactive security a series of knee-jerk reactions to cyber threats – gone are the days of one-off escalated events, too many alerts, and flashing screens. Security teams do not need to respond to everything in their systems; we must be more strategic than that.

What is driving the need for proactive security?

In today’s threat landscape, hackers are finding new ways to breach the security of corporations, and one of the tools they are using is artificial intelligence (AI). AI, and generative AI such as ChatGPT in particular, has become a powerful part of their arsenal, used to automate attacks, create convincing phishing messages, develop more evasive malware and crack passwords. While AI can help cyber defenders, it also expands the attack surface across the organisation, which can leave assets exposed and vulnerable to adversaries.

Alongside this, businesses have had to address the growing demand for cloud computing infrastructure and adopt new digital identity technologies, not only to satisfy customer needs but also to continue innovating at record speed.

A holistic, proactive approach to cybersecurity

Despite the large investments many companies have made in detective controls, they often struggle to detect the tactics, techniques, and procedures (TTPs) used by real-world threat actors during sustained and sophisticated attack campaigns. On top of this, the expanding attack surface and ever-changing perimeter put security controls to the test, so gaining visibility into external-facing assets, vulnerabilities and exposures is a time-consuming and difficult challenge.

The goal is to help businesses address these issues more easily with a combination of the right technology and the right people to provide expert, tailored guidance. While there isn’t a one-size-fits-all approach, here are the steps to ensuring a holistic, proactive security programme, sketched as a simple loop after the list:

  • Identify: Start with penetration testing to provide a snapshot of the organisation’s current environment.
  • Protect: Carry out a continuous assessment of the external attack surface.
  • Detect: When a vulnerability is identified, run a play in a breach and attack simulation solution to check whether the threat would be flagged by your security stack.
  • Respond: Complete a Red Team engagement to confirm that the team can successfully defend against and respond to that threat.
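
To make the cycle concrete, here is a minimal sketch (in Python) of those four steps treated as a recurring loop rather than a one-off audit. Every function is a placeholder for work a security team or its tooling would actually perform; none of it reflects a specific product or API.

    # Hypothetical sketch of the identify -> protect -> detect -> respond cycle.
    # Each function stands in for real security work or tooling.

    def identify():
        """Penetration testing: snapshot the environment and list findings."""
        return [{"asset": "vpn-gateway", "issue": "outdated firmware"}]

    def protect(findings):
        """Continuous external attack surface assessment: keep exposures tracked."""
        return [dict(f, tracked=True) for f in findings]

    def simulate_attack(finding):
        """Placeholder for a breach and attack simulation play."""
        return False  # pretend the current stack did not detect the technique

    def detect(findings):
        """Run a simulation play per finding to test detection coverage."""
        return [dict(f, detected=simulate_attack(f)) for f in findings]

    def respond(findings):
        """Red Team engagement focus: findings the stack failed to detect."""
        return [f for f in findings if not f["detected"]]

    gaps = respond(detect(protect(identify())))
    print(f"{len(gaps)} finding(s) need improved detection and response")

In practice each stage feeds a continuous programme rather than a single pass, but the chaining above mirrors the four steps listed.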

By following these steps, organisations can enhance their cybersecurity posture, proactively address vulnerabilities, and bolster their resilience against cyber threats. This approach not only helps in mitigating risks but also fosters a culture of proactive security within the organisation.

With cybercrime on the rise and only 15% of UK businesses having a formal cybersecurity incident management plan, there has never been a better time to get proactive about security. As AI continues to infiltrate organisations and cloud computing continues to evolve, business leaders must gain a better understanding of their IT stack and overall security posture to minimise potential gaps and exposures. Innovation – and business success – depends on it.


Business

How can businesses make the cloud optional in their operations?

Max Alexander, Co-founder at Ditto

Modern business apps are built to be cloud-dependent. This is great for accessing limitless compute and data storage capabilities, but when the connection to the cloud is poor or drops entirely, business apps stop working, impacting revenue and service. If real-time data is needed for quick decision-making in fields like healthcare, a stalled app can put people in life-threatening situations.

Organisations in sectors as diverse as airlines, fast-food retail, and ecommerce have deskless staff who need digital tools on smartphones, tablets and other devices to do their jobs. Because of widespread connectivity issues and outages, these organisations are beginning to consider how to ensure those tools can operate reliably when the cloud is not accessible.

The short answer is that building applications with a local-first architecture can help ensure they remain functional when disconnected from the internet. But then why aren’t all apps built this way? The simple answer is that building and deploying cloud-only applications is much easier, as ready-made developer tools expedite much of the backend build. The more complex answer is that a local-first architecture solves the problem of offline data accessibility but not the critical problem of offline data synchronisation: apps disconnected from the internet still have no way to share data across devices. That is where peer-to-peer data sync and mesh networking come into play.
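
As a rough illustration of the local-first idea, here is a short Python sketch in which every write lands in a local SQLite database immediately and a background step opportunistically pushes unsynced changes to the cloud when it can. The push_to_cloud function is a hypothetical stand-in for whatever backend API a real app would use; this is not any vendor’s SDK.

    import sqlite3
    import uuid

    # Local-first: writes always succeed against the local database;
    # syncing with the cloud is a separate, best-effort step.
    db = sqlite3.connect("local_store.db")
    db.execute("""CREATE TABLE IF NOT EXISTS changes (
        id TEXT PRIMARY KEY, doc TEXT, payload TEXT, synced INTEGER DEFAULT 0)""")

    def write_local(doc_id: str, payload: str) -> None:
        """Write to the local database immediately; never block on the network."""
        db.execute("INSERT INTO changes VALUES (?, ?, ?, 0)",
                   (str(uuid.uuid4()), doc_id, payload))
        db.commit()

    def push_to_cloud(rows) -> bool:
        """Hypothetical placeholder: send rows to a remote server, True on success."""
        return False  # pretend we are offline in this sketch

    def sync_when_possible() -> None:
        """Opportunistically sync pending changes; failures simply retry later."""
        pending = db.execute(
            "SELECT id, doc, payload FROM changes WHERE synced = 0").fetchall()
        if pending and push_to_cloud(pending):
            db.executemany("UPDATE changes SET synced = 1 WHERE id = ?",
                           [(row[0],) for row in pending])
            db.commit()

    write_local("order-42", '{"item": "burger", "qty": 2}')
    sync_when_possible()  # safe to call on a timer or on a connectivity event

Because the local write and the cloud sync are decoupled, the app keeps working during an outage; what it cannot do on its own is share those writes with nearby devices, which is the gap the list below addresses.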

Combining offline-first architecture with peer-to-peer data sync

In the real world, what does an application like this look like?

  • Apps must prioritise local data sync. Rather than sending data to a remote server first, applications must be able to write data to their local database, then listen for changes from other devices and merge them as needed. Apps should use local transports such as Bluetooth Low Energy (BLE) and Peer-to-Peer Wi-Fi (P2P Wi-Fi) to communicate data changes whenever the internet, a local server, or the cloud is not available.
  • Devices must be capable of creating real-time mesh networks. Nearby devices should be able to discover one another, communicate, and maintain constant connections even in areas of limited or no connectivity.
  • Devices must transition seamlessly from online to offline (and vice versa). Combining local sync with mesh networking means that devices in the same mesh constantly update a local version of the database and opportunistically sync those changes with the cloud when it is available.
  • Sync must be partitioned between large-peer and small-peer mesh networks so that smaller peers are not overwhelmed by trying to sync every piece of data. Small peers sync only the data they request, giving developers complete control over bandwidth usage and storage, which is vital when connectivity is erratic or critical data needs prioritising. Large peers, by contrast, have full access to cloud-based systems and sync as much data as they can (see the sketch after this list).
  • The mesh must be ad hoc, so devices can join and leave whenever they need to. This also means there can be no central server that other devices rely on.
  • Devices must be compatible with all data at any time. Every device should account for incoming data with different schemas, so that a device that is offline and running an outdated app version, for example, can still read new data and sync.
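
The sketch below illustrates the large-peer/small-peer idea under simplified assumptions: each peer keeps a local replica, subscribes only to the collections it cares about, and gossips changes to its neighbours, so a small peer such as a kitchen display receives orders without any cloud round-trip. The Peer class and subscription format are invented for illustration and are not taken from any real SDK.

    # Illustrative mesh sync with partial ("small peer") subscriptions.
    class Peer:
        def __init__(self, name, subscriptions):
            self.name = name
            self.subscriptions = subscriptions   # e.g. {"orders"} for a kitchen display
            self.store = {}                      # local replica: {(collection, key): value}
            self.neighbours = []

        def matches(self, collection):
            return "*" in self.subscriptions or collection in self.subscriptions

        def write(self, collection, key, value):
            """Write locally first, then gossip the change through the mesh."""
            self.store[(collection, key)] = value
            self.broadcast(collection, key, value)

        def broadcast(self, collection, key, value):
            for peer in self.neighbours:
                # Forward only what the neighbour subscribes to and has not yet seen.
                if peer.matches(collection) and peer.store.get((collection, key)) != value:
                    peer.store[(collection, key)] = value
                    peer.broadcast(collection, key, value)

    # A "large peer" (back office) takes everything; "small peers" take only orders.
    back_office = Peer("back_office", {"*"})
    kiosk = Peer("kiosk", {"orders"})
    kitchen = Peer("kitchen", {"orders"})
    kiosk.neighbours = [kitchen, back_office]
    kitchen.neighbours = [kiosk, back_office]
    back_office.neighbours = [kiosk, kitchen]

    kiosk.write("orders", "42", {"item": "burger", "qty": 2})
    print(kitchen.store)   # the kitchen sees the order with no cloud round-trip

In a real deployment the gossip step would run over local transports such as BLE or Peer-to-Peer Wi-Fi and would need proper conflict resolution, but the partitioning logic is the same.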

Peer-to-peer sync and mesh networking in practice

Let us take a look at a point-of-sale application in the fast-paced environment of a quick-service restaurant. When an order is taken at a kiosk or counter, that data must travel hundreds of miles to a data centre to arrive at a device four metres away in the kitchen. This is an inefficient process and can slow down or even halt operations, especially if there is an internet outage or any issues with the cloud.

A major fast-food restaurant in the US has already modernised its point of sale system using this new architecture and created one that can move order data between store devices independently of an internet connection. As such, this system is much more resilient in the face of outages, ensuring employees can always deliver best-in-class service, regardless of internet connectivity.

The power of cloud-optional computing is also showcased in healthcare settings in rural areas of developing countries. By using both peer-to-peer data sync and mesh networking, essential healthcare applications can share critical health information without the internet or a connection to the cloud. Healthcare workers in disconnected environments can quickly process information and share it with relevant colleagues, enabling faster reaction times that can save lives.

Although the shift from cloud-only to cloud-optional is subtle and will not be obvious to end users, it really is a fundamental paradigm shift. This move provides a number of business opportunities for increasing revenue and efficiencies and helps ensure sustained service for customers.


Business

How 5G is enhancing communication in critical sectors

Luke Wilkinson, MD, Mobile Tornado

In critical sectors where high-stakes situations are common, effective communication is non-negotiable. Whether it’s first responders dealing with a crisis or a construction team coordinating a complex project, the ability to share information quickly and reliably can mean the difference between success and failure.

Long-distance mobile communication became feasible in the 1950s, when wireless network connectivity was first used in mobile radio-telephone systems, often with push-to-talk (PTT) technology. As private companies invested in cellular infrastructure, the networks expanded and data speeds steadily improved. Each major leap forward in mobile network capabilities was classed as a new generation, and thus 1G, 2G, 3G, 4G, and now 5G were born.

5G is the fifth generation of wireless technology and has been gradually rolled out since 2019 when the first commercial 5G network was launched. Since then, the deployment of 5G infrastructure has been steadily increasing, with more and more countries and regions around the world adopting this cutting-edge technology.

Its rollout has been particularly significant for critical sectors that rely heavily on push-to-talk over cellular (PTToC) solutions. With 5G, PTToC communications can be carried out with higher bandwidth and speed, resulting in clearer and more seamless conversations, helping to mitigate risks in difficult scenarios within critical sectors.

How is 5G benefiting businesses?

According to Statista, by 2030, half of all connections worldwide are predicted to use 5G technology, increasing from one-tenth in 2022. This showcases the rapid pace at which 5G is becoming the standard in global communication infrastructure.

But what does this mean for businesses? Two of the key advances under 5G are greater bandwidth and faster download speeds, facilitating faster and more reliable communication within teams. PTToC solutions can harness the capabilities of 5G and bring the benefits to the critical sectors that need them most, whether in public safety, security, or logistics: the use cases are extensive. For example, 5G’s increased bandwidth can be leveraged to enable larger group calls and screen sharing for more effective communication.

Communication between workers in critical industries can be difficult, as these workforces are often made up of lone workers or small groups of individuals in remote locations. PTToC is indispensable in these scenarios, providing quick and secure communication along with additional features such as real-time location information and the ability to send SOS alerts. PTToC with 5G works effectively in critical sectors because PTToC solutions are designed to operate across varying network conditions, falling back to older generations such as 3G and 2G where necessary. This ensures that communication remains reliable and efficient even in countries or areas where 5G infrastructure is not fully deployed, keeping remote, lone workers safe and secure.
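
As a simple, hedged illustration of that kind of network awareness, the sketch below picks media settings for a PTToC client based on the network generation currently available, so a call degrades gracefully when a device drops from 5G to an older network. The profile values are assumptions chosen for illustration, not figures from any real PTToC product.

    # Illustrative media profiles per network generation (values are assumptions).
    PROFILES = {
        "5G": {"video": True,  "audio_kbps": 64, "location_interval_s": 5},
        "4G": {"video": True,  "audio_kbps": 32, "location_interval_s": 10},
        "3G": {"video": False, "audio_kbps": 16, "location_interval_s": 30},
        "2G": {"video": False, "audio_kbps": 8,  "location_interval_s": 60},
    }

    def select_profile(network_generation: str) -> dict:
        """Pick the richest profile the current network can support."""
        return PROFILES.get(network_generation, PROFILES["2G"])

    print(select_profile("5G"))   # full video group call, frequent location updates
    print(select_profile("3G"))   # voice-only push-to-talk, slower location updates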

The impact of 5G on critical communications

The International Telecommunication Union has reported that 95 percent of the world’s population can access a mobile broadband network. This opens up a world of new possibilities for PTToC, particularly as the new capabilities of 5G are rolled out.

One of the most significant improvements brought by 5G is within video communications, which most PTToC solutions now offer. Faster speeds, higher bandwidth, and lower latency enhance the stability and quality of video calls, which are crucial in critical sectors. After all, in industries like public safety, construction, and logistics, the importance of visual information for effective decision-making and situational awareness cannot be overstated. 5G enables the real-time transmission of high-quality video, allowing for effective coordination and response strategies, ultimately improving operational outcomes and safety measures.

Challenges in adopting 5G in critical sectors

While the benefits of 5G are undeniable, the industry faces some challenges in its widespread adoption. Network coverage and interoperability are two key concerns that need to be addressed to ensure communication can keep improving in critical sectors.

According to the International Telecommunication Union, older-generation networks are being phased out in many countries to allow for collaborative 5G standards development across industries. Yet, particularly in lower-income countries in Sub-Saharan Africa, Latin America, and Asia-Pacific, there is a need for infrastructure upgrades and investment to support 5G connectivity. The potential barriers to adoption, including device accessibility, the expense of deploying the new networks, and regulatory issues, must be carefully navigated to help countries make the most out of 5G capabilities within critical sectors and beyond.

The rollout of 5G also raises data security concerns for mission-critical communications and operations, as mobile networks present an expanded attack surface. Nonetheless, IT professionals, including PTToC developers, have the means to safeguard remote and lone workers and shield corporate and employee data. Encryption, authentication, remote access, and offline functionality are vital attributes that tackle emerging data threats both on devices and during transmission. Deploying this multi-tiered strategy alongside regular updates substantially diminishes the vulnerabilities associated with exploiting 5G mobile networks and devices in critical sectors.

While the challenges faced by the industry must be addressed, the potential benefits of 5G in enhancing communication and collaboration are undeniable. As the rollout continues to gain momentum, those benefits are becoming increasingly evident: the faster, more reliable, and more efficient communication enabled by 5G is crucial for industries that rely on real-time information exchange and decision-making.

Looking ahead, the potential for further advancements and increased adoption of 5G in critical sectors is truly exciting. As the industry continues to address challenges such as network coverage, interoperability, and data security, we can expect to see even greater integration of this technology across a wide range of mission-critical applications.


Auto

Could electric vehicles be the answer to energy flexibility?

Rolf Bienert, Managing and Technical Director, OpenADR Alliance

Last year, the then Department for Business, Energy & Industrial Strategy and Ofgem published their Electric Vehicle Smart Charging Action Plan to unlock the power of electric vehicle (EV) charging. Owners would have the opportunity to charge their vehicles while powering their homes with excess electricity stored in their cars.

Known as vehicle-to-grid (V2G) or, more broadly, vehicle-to-everything (V2X), this is the interaction between a vehicle and another entity: for example, the transfer of electricity stored in an EV to the home, the grid, or other destinations. V2X requires bi-directional energy flow from the charger to the vehicle, and bi- or unidirectional flow from the charger to the destination, depending on how it is being used.

While there are already V2X pilots out there, it is considered an emerging technology. The Government is backing it with its V2X Innovation Programme, which aims to address barriers to enabling energy flexibility from EV charging. Phase 1 will support the development of V2X bi-directional charging prototype hardware, software or business models, while phase 2 will support small-scale V2X demonstrations.

The programme is part of the Flexibility Innovation Programme which looks to enable large-scale widespread electricity system flexibility through smart, flexible, secure, and accessible technologies – and will fund innovation across a range of key smart energy applications.

As part of the initiative, the Government will also fund Demand Side Response (DSR) projects activated through both the Innovation Programme and its Interoperable Demand Side Response Programme (IDSR) designed to support innovation and design of IDSR systems. DSR and energy flexibility is becoming increasingly important as demand for energy grows.

The EV potential

EVs offer a potential energy resource, especially at peak times when the electricity grid is under pressure. Designed to power cars weighing two tonnes or more, EV batteries are large, especially when compared to other potential energy resources.

While a typical home solar battery stores around 10kWh, electric car batteries range from around 30kWh upwards. A Jaguar I-Pace has an 85kWh battery, while the Tesla Model S has a 100kWh battery, offering a much larger resource. This means that a fully charged EV could support an average home for several days.
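
To put rough numbers on that claim: assuming an average household uses somewhere around 8kWh of electricity per day, an 85kWh battery holds roughly ten days’ worth of that demand (85 ÷ 8 ≈ 10.6), and a 100kWh pack slightly more, before allowing for conversion losses and the charge the owner keeps back for driving.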

But to make this a reality, the technology needs to be in place to ensure a stable, reliable and secure supply of power. Most EV charging systems are already connected via apps and control platforms with pre-set configurations, so they are easy to access and use. However, owners will need to factor in possible additional hardware costs, including inverters for charging and discharging the power.

The vehicle owner must also have control over what they want to do: for example, how much of the charge in the car battery they make available to the grid and how much they leave in the vehicle.
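
A minimal sketch of that kind of owner control might look like the following, where the vehicle exports power only while its state of charge stays above a reserve the driver has chosen. The threshold, export ceiling and function name are illustrative assumptions, not any charger’s real interface.

    # Owner-controlled discharge: export only above a driver-chosen reserve.
    RESERVE_PERCENT = 60.0   # assumed: owner keeps at least 60% for driving
    MAX_EXPORT_KW = 7.0      # assumed ceiling for a typical home charger

    def export_power_kw(state_of_charge_percent: float, grid_requesting: bool) -> float:
        """Return how much power to discharge now; 0 once the reserve is reached."""
        if not grid_requesting or state_of_charge_percent <= RESERVE_PERCENT:
            return 0.0
        return MAX_EXPORT_KW

    print(export_power_kw(82.0, grid_requesting=True))   # 7.0 -> export to home/grid
    print(export_power_kw(58.0, grid_requesting=True))   # 0.0 -> reserve protected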

The concept of bi-directional charging means that vehicles need to be designed with bi-directional power flow in mind, and Electric Vehicle Supply Equipment will have to be upgraded to Electric Vehicle Power Exchange Equipment (EVPE).

Critical success factors

Open standards will also be critical to the success of this opportunity, and to ensuring the charging infrastructure for V2X and V2G use cases is fit for purpose.

There are also lifecycle implications for the battery that need to be addressed, as bi-directional charging can lead to degradation and a shortened battery life. EVs are typically sold with an eight-year battery warranty, though this depends on the model, so drivers might be reluctant to add extra wear and tear, or to pay for new batteries before time.

There is also the question of power quality. With more and more high-powered inverters pushing power into the grid, supply quality could fall below standard, which may require periodic grid code adjustments.

But before this becomes reality, it has to be something that EV owners want. The industry is looking to educate users about the benefits and opportunities of V2X, but is it enough? We need a unified message, from automotive companies and OEMs, to government, and a concerted effort to promote new smart energy initiatives.

While plans for a ban on the sale of new petrol and diesel vehicles are not yet agreed, figures from the IEA show that by 2035, one in four vehicles on the road will be electric. So, it’s time to raise awareness of the opportunities these programmes offer.

With trials already happening in the UK, US, and other markets, I’m optimistic that this technology could become a market disruptor.

