

The secret history of the internet: how web domains rule the world

Stuart Fuller, Domain Services Director at Com Laude

The internet has evolved in remarkable ways since its inception, transforming from a directory of static web pages in the early 90s to the interactive and immersive digital landscape everybody navigates today. Amidst these monumental shifts, the Domain Name System (DNS) – a critically important backbone of the web – has undergone transformative changes of its own.

In its nascent stages, the internet was envisioned as a far more linear place than its current iteration. Until 2000, most of the websites you could visit ended in .com, .edu, .gov, .mil, .org, .net, or .int, with each of these top-level domains inextricably tied to its owner’s function, alongside country code domains such as .uk and .fr. If you visited a .com, you’d find a commercial entity; network infrastructure sat on .net domains, and .org served those that didn’t quite fit. This is not true of the internet today: with over 1,500 top-level domains in use, and .com, .net, and .org now entirely unrestricted in who can own them, knowing the value of your domain has become far more challenging for businesses.

These developments often go unrecognised, but with further change on the horizon announced by the DNS’s administrator, ICANN, it is time to take stock of just how far things have come, and to consider what the service over 5 billion people use will look like in the years ahead.

How did we get here?

Prior to the 1990s, what would become the internet was predominantly restricted to academic researchers. Known as ARPANET and conceived by the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA), it was designed to facilitate research collaboration among universities and government entities. However, as the project yielded substantial developments in standardised protocols for communication across a network of computers, such as TCP/IP, it became the catalyst for a digital revolution that has shaped nearly every aspect of modern society.

During this period, an administrative organisation fulfilling technical functions for this ever-growing network was established by two scientists, Jon Postel and Joyce K. Reynolds, at the University of Southern California’s Information Sciences Institute. Yet as the internet was predominantly used by academic researchers, it was merely one part of a collaborative effort across universities to maintain the network.

However, as access grew throughout the 90s, demand to commercialise the network and to regulate it from government grew in step. In 1993, the National Science Foundation, a U.S. government agency, privatised the domain name registry, followed by the authorisation of the sale of generic domain names in 1995. This caused widespread dissatisfaction among internet users: it signalled a concentration of power over what was previously envisioned as a decentralised system, while individual countries remained free to develop their own rules and regulations for the sale and usage of their specific country codes.

In response, Postel drafted a paper proposing the creation of new top-level domains, in a bid to institutionalise his organisation. After it was ignored, Postel emailed eight regional root server operators instructing them to reconfigure their systems to treat his organisation’s server as the authoritative root. They complied, dividing control of internet naming between Postel and the government.

After a furious reaction from government officials, Postel reversed the change. Authority over the root servers was subsequently reassigned, and Postel died unexpectedly a few months later.

Following this, his organisation was subsumed into the newly created ICANN, designed to perform the functions Postel’s organisation had carried out. As the internet became global, renewed interest in fostering commercial competition saw the number of top-level domains expand dramatically.

As new demands came from how the internet was used, domain names were created to match. For example, with the introduction of internet access via mobile devices, .mobi was created, and as the Asia-Pacific region’s internet usage grew substantially, .asia followed. Large companies took notice of the value of these registered strings of characters, and in 2012 ICANN enabled businesses to apply for their own top-level domains. At present, 496 companies possess these, with examples ranging from .bmw for the automobile company all the way through to .sky for the television and broadband provider.

Recently, ICANN announced a second application round for new top-level domains, currently pencilled in for 2026, presenting fresh opportunities for businesses to register their own piece of internet space. And, in a sense, Postel’s vision for a decentralised internet was realised: in 2016, ICANN ended its contract with the U.S. government and stewardship of the organisation transitioned to the global internet community.

Where is this all going?

Although it may be impossible to predict how the internet will be used in the future, and what structures may change to adapt, there are interesting technological developments that could be transformative.

With the rise of blockchain technologies, driven by the rocketing use of cryptocurrencies, we could see further decentralisation of system ownership. Instead of registering internet space with an authority made up of global stakeholders, blockchain systems can distribute ownership across every user, with potentially interesting, democratic implications for registering parts of that space.

Alternatively, with developments in metaverse technologies, we could see a new meaning applied to domain registration. As digital technologies and reality blur, this could mean staking claims over digital space on top of physical, or registering ownership over a rendered place in a virtual reality world.

An exciting future

Regardless of what the future brings, if history is any guide, the internet will continue to expand and redefine the boundaries of digital interaction. The evolution of the technology from an academic research tool to a fundamental part of people’s lives is nothing short of extraordinary. As these developments occur, they will undoubtedly bring new benefits in democratising information, entertainment, and connectivity, shaping the lives of everyone.



Harnessing AI to Navigate Regulatory Complexity in Banking and Finance

Source: Finance Derivative

By Harry Borovick, General Counsel, Luminance

The global banking and finance sector is navigating an increasingly complex regulatory landscape, compounded by uncertain macroeconomic conditions, marketplace competition, and heightened customer expectations. These pressures have increased the volume and intricacy of compliance requirements and raised the risk of substantial fines for businesses operating in this sector. Amidst these challenges, AI can offer practical solutions to ensure compliance and mitigate risks.

The Challenge

Whether it’s successfully navigating the transition away from the London Interbank Offered Rate (LIBOR) or remaining compliant with newly implemented regulation like the Digital Operational Resilience Act (DORA), financial institutions are no strangers to new rules. From antitrust and competition laws to sustainability-focused regulations like the Sustainable Finance Disclosure Regulation (2019/2088), growing regulatory complexity presents significant hurdles for legal departments within financial institutions. Additionally, the sheer volume and fragmented nature of the data at hand adds significant friction to legal workflows.

Legal teams in financial institutions are mandated to stay aware of incoming changes and must be equipped to handle them. After all, non-compliance carries severe economic, operational, and reputational consequences. In 2021, the UK’s Financial Conduct Authority (FCA) issued over £500 million in fines for non-compliance. The stakes are higher than ever, and the repercussions of failing to meet regulatory standards can be catastrophic. For instance, a prominent financial institution faced massive fines for failing to comply with anti-money laundering regulations, even being subjected to the first ever criminal charge issued by the FCA. This event highlights the significant financial and reputational risks involved when institutions fail to adhere to regulatory measures.

However, the issue extends beyond fines and potential financial loss. The stress exerted on industry professionals tasked with ensuring compliance is leading to increased mental health issues and high turnover rates. Reportedly, 60% of compliance staff feel burned out by the responsibilities they face. The pressure to maintain compliance amidst an ever-evolving regulatory environment should not be overlooked, as it may lead to a talent drain within the sector.

The Solution

AI provides a tangible solution to the compliance challenges faced by financial institutions. But what does that look like in practice?

  1. Effective Third-Party Risk Management: Financial institutions must maintain effective third-party risk management to identify and reduce risk across their service providers. This is often a manual, labour-intensive task, but remains deeply important to compliance. Financial institutions can conduct thorough due diligence by centralising service provider contracts to ensure comprehensive oversight and risk management. AI provides a far more comprehensive ability to search through these documents, automatically surfacing key figures and grouping documents which are conceptually similar.
  2. Accelerated Compliance Processes: AI can automate document routing across the team, ensuring an effective review process. AI can also automatically flag renewal dates in contracts, reducing time spent searching for these vital data points.
  3. Empowering Non-Legal Teams: Non-legal departments can use AI to generate standard agreements based on compliant, gold-standard language through self-service contract generation tools, streamlining approvals and reducing delays.
  4. Navigating Global Complexity: Global companies are often juggling multiple regulatory regimes, making compliance an even more complex, labour-intensive task. AI tools can quickly and comprehensively analyse data sets in multiple languages, removing barriers in global operations and expediting the document review process.
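As a simplified illustration of the renewal-date flagging mentioned above: production contract-review tools rely on trained language models rather than fixed patterns, but the underlying idea of automatically surfacing dates that appear in a renewal context can be sketched with a pattern match. The function name and the assumed date format ("14 March 2026") below are illustrative, not any vendor’s API.

```python
import re
from datetime import date

# Map month names to their numbers for parsing matched dates.
MONTHS = {m: i for i, m in enumerate(
    ["January", "February", "March", "April", "May", "June", "July",
     "August", "September", "October", "November", "December"], start=1)}

# Look for a day/month/year date appearing shortly after a word
# beginning "renew" (renew, renews, renewal, ...).
DATE_PATTERN = re.compile(
    r"renew\w*\D{0,40}?(\d{1,2})\s+(" + "|".join(MONTHS) + r")\s+(\d{4})",
    re.IGNORECASE)

def flag_renewal_dates(contract_text: str) -> list[date]:
    """Return dates that appear near the word 'renew' in the text."""
    hits = []
    for day, month, year in DATE_PATTERN.findall(contract_text):
        hits.append(date(int(year), MONTHS[month.capitalize()], int(day)))
    return sorted(hits)
```

A reviewer could run such a pass over every contract in a centralised repository and collect the flagged dates into a renewal calendar, rather than reading for them by hand.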

Consider a real-world example. A leading US-headquartered private equity firm used Luminance to review nearly 1,000 documents, including NDAs, credit agreements, and fund documents. A project estimated to take two weeks manually was completed significantly faster, with over 350 LIBOR definition clauses identified upon upload. This kind of time saving is instrumental to company success, particularly in such a competitive environment.

In an era where regulatory requirements are becoming more stringent and the consequences of non-compliance more severe, financial institutions must leverage AI to navigate the evolving compliance landscape and maintain a competitive edge in a challenging sector. With the trend towards both financial transparency and environmental intervention only set to grow, taking action now will be key to business continuity tomorrow. Adoption of AI-driven solutions enables compliance teams to keep up with the pace of regulation, even as it rapidly changes and evolves.




Three ‘Must Haves’ to Convert Data Disaster into a Triumph

By Richard Connolly, Regional Director for UKI at Infinidat

When we think about disaster recovery planning, our thoughts tend to focus on natural disasters. While floods, fires, earthquakes and other natural disasters can be IT disasters too, they are not as frequent as many think.

But another type of disaster is looming large, and it’s entirely preventable: the cyberattack. Cyber threats are much more likely to occur than a natural disaster. Cyberattacks are now widely regarded as one of the single biggest risks any organisation faces, and are almost always cited by CEOs as their #1 or #2 existential threat.

The risks of a cyberattack are evident in the UK Government’s Cyber Security Breaches Survey 2024. This study reported that half of UK businesses (50%) have experienced some form of cyber security breach or attack in the last 12 months. Among the larger businesses in the study, the frequency of cyber incidents is even higher: 70% of medium-sized businesses and 74% of large businesses reported an attack. And these threats are not limited to the UK, as both the European Union and the United States have issued cyber security guidelines for businesses to follow to try to reduce the impact of cybercrime.

40% of big business cyberattacks are malware related

Cybersecurity attacks come in many forms and cover a broad range of activities. Of all the possibilities, a malware attack is known to be the most disruptive to business operations. Malware incidents account for 40% of all cyberattacks on large UK businesses and are a significant threat because of the risks they pose to data integrity. Storage experts regard such incidents as ‘data disasters’: even a small malware incident can result in a business being shut down for days or weeks. Could your business survive an incident like that?

Minimise the threat of a cyberattack

If your business becomes the subject of a cyber attack, what steps can you take to minimise disruption and ensure the fastest possible recovery? In the past, one way a business could protect its data from disaster was by having data backups stored at multiple locations. If one site was hit, there would always be another copy available. Unfortunately, things are no longer that straightforward. Data disasters, like massive ransomware attacks, have completely changed the rules of disaster recovery and business continuity. Added to this, the significance of business data as a strategic asset is much greater today than it was previously. It’s why KPMG advises that ‘data is the most significant asset many organisations possess’ and protecting it isn’t just a case of having it stored at multiple locations.

3 must haves for a data disaster triumph

There are three absolute ‘must haves’ when it comes to being prepared for a data disaster with an iron-clad recovery strategy. These are as follows:

Must have #1: The ability to take ‘immutable snapshots’ of data that cannot be altered in any way, and then isolate them in a forensic environment when an attack hits. This means the copies can safely be analysed to identify a good replica of the data to recover.

Must have #2: The ability to perform cyber detection on both primary storage (the data, programs and instructions being used in real time by the business) and secondary storage (data that is accessed less frequently or retained for compliance and historical reasons). Both are critically important.

Must have #3: The ability to instantaneously recover data.

Why are the data recovery ‘must haves’ so critical?

Looking at each of these capabilities in detail, immutable snapshots are the foundation of a robust data disaster recovery strategy. Without a good copy of your data, you cannot recover quickly after a ransomware attack, which is likely to have corrupted or encrypted your data. By segregating the data copies with logical air-gapping and then having a fenced forensic environment, you create a safe space to review the data prior to recovery. Even if datasets have been taken “hostage,” it’s possible to recover back to the most recent known good copy of data. This can neutralise the impact of a malware attack, because if the data is fully recoverable, there’s potentially no need to pay the cybercriminals.
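The verification step, confirming that a replica really is the known good copy, rests on content integrity checks. A minimal sketch of that idea, assuming a simple per-file hash manifest taken at snapshot time (real platforms enforce immutability and air-gapping at the storage layer; the function names here are illustrative only):

```python
import hashlib
from pathlib import Path

def manifest(root: Path) -> dict[str, str]:
    """Map each file's relative path to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify_replica(baseline: dict[str, str], replica_root: Path) -> list[str]:
    """Return paths altered, removed, or added since the snapshot baseline."""
    current = manifest(replica_root)
    # A file is suspect if its digest differs or it has vanished...
    changed = [p for p, h in baseline.items() if current.get(p) != h]
    # ...or if it appeared after the snapshot (e.g. dropped malware).
    added = [p for p in current if p not in baseline]
    return sorted(changed + added)
```

In the fenced forensic environment, an empty result from such a check is what identifies a snapshot as a clean candidate for recovery.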

The second “must-have” ability is cyber detection on primary and secondary storage. This is important because it can be an early warning sign of a cyberattack. It also ensures that there is no ransomware or malware hidden in the last known copy of data that you could revert back to. But before going through to the recovery stage, how do you know that a data copy is really “clean?”

This is where advanced cyber detection capabilities built into a software-defined primary storage platform can make the difference. They make it possible to perform highly intelligent, deep data scanning and to identify any corruption while the data is still segregated in a fenced forensic environment. They also make identifying the highest-integrity copy more straightforward, and provide indexing to pinpoint potential issues.

The third “must-have” ability is rapid data recovery. This is obvious, but it’s easier said than done. When a business experiences a data disaster, time is of the essence. They can’t wait for days or weeks to recover a known good data copy. Even six hours of downtime is too much. Recovery should ideally take minutes to avoid a negative impact on the business. For this reason, experts measure how quickly you can recover your data and the quality of the data. Can you bounce back from a cyberattack quickly? Would your employees and customers notice if you were hit by a malware incident?

1 in 2 UK businesses experienced a cyberattack in 2023

The Government’s research says it all. Cyberattacks are taking place all the time, and the latest study shows that 1 in 2 businesses are being affected, with malware involved in 40% of attacks on large businesses. As data becomes ever more important as a business asset, we can expect these types of data disasters to become even more commonplace.

Although your business might not be able to completely avoid a malware or ransomware attack, you can avert a full-blown disaster and avoid the disruption it causes. By protecting your business with the three disaster recovery must haves – immutable snapshots isolated in fenced forensic environments, cyber detection across primary and secondary storage, and rapid recovery – you will have done everything possible to mitigate this risk.



Preparing data for DORA compliance

Source: Finance Derivative

By Andrew Carr, Managing Director, Camwood

The financial sector is increasingly looking towards technology as the way to introduce new products and services and achieve competitive differentiation. But this reliance opens up avenues for cyber hackers to exploit weaknesses, and it’s a risk that the World Economic Forum has taken note of. Funding issues, reputational damage and a detrimental impact on other critical services could ensue from a successful attack, and the EU is making moves to counteract the threat.

The Digital Operational Resilience Act (DORA) applies from 17th January 2025. It’s a framework that makes prevention the priority, with the IT security of financial entities including banks, insurance companies and investment firms coming under its scope. While it primarily applies to EU-based firms, UK organisations that work in EU markets also need to be compliant. With the implementation date nearing, businesses should review their preparations and ensure everything is ready, with a particular focus on their data management processes.

The details behind the regulation

The DORA regulation encompasses several key areas, including ICT-related incident reporting, digital operational resilience testing, ICT risk management and even monitoring of primary third-party providers. It also emphasises information sharing for the exchange of data and intelligence around the latest cyber threats. Failure to comply can bring significant consequences: fines of up to 2% of total annual turnover, or up to 1% of average daily turnover worldwide.

Firms need a strong understanding of their data to meet these criteria, such as timely reporting of cyber incidents and sharing relevant intelligence. For example, there needs to be awareness of where each piece of data is located, who has recently accessed it, the access permissions attached to it and the type of storage being used. Numerous businesses simply aren’t privy to this information: their data is likely to sit in a complex mix of cloud, on-premise and multi-cloud deployments.

Data in numerous locations

A significant amount of data is hiding in places that financial organisations aren’t aware of. This is not because of any malicious activity, but simply due to natural data sprawl across different hosting solutions over many years. Multi-cloud has achieved widespread adoption, with nine in ten organisations following this strategy according to the Flexera 2024 State of the Cloud Report.

This widespread distribution of data complicates locating specific information for sharing and presents security risks that jeopardise compliance with the DORA regulation. For example, it’s possible to have multiple copies of the same sensitive document stored in different locations. This not only wastes available storage space, but also increases the chances of unauthorised access to the data.

Supplier relationships are another key aspect of the regulation. Strategic partners will likely need access to a specific part of a financial firm’s system, and this data must be readily available, all while ensuring they can’t access other sensitive information. If a supplier fails, is the financial firm able to call on a readily available list of alternative service providers to ensure continuity? Data needs to be organised and in the right place for this to be made a reality.

Organising data

Achieving DORA compliance requires organising data into a manageable structure through several key steps. This starts with a data audit or assessment to identify data locations, storage types, retention periods and last access dates. This process provides a snapshot of the current data situation and highlights any necessary changes before the January deadline.

Next, fragmented data can be relocated from obscure locations to more logical ones and be clearly tagged. This allows users to easily identify data for sharing or reporting purposes. Duplicate documents can be identified and deleted in a move to free up space, reduce storage costs and lower cyber risks.

Finally, access controls and governance can be implemented to ensure that only authorised personnel, whether internal or external, can access specific data. Some 73% of leaders and employees have admitted that a lack of trust in data, and data overload, has hindered decision-making. With data properly organised, leaders and staff can make informed decisions based on accurate and trusted insights.
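The audit and de-duplication steps above can be sketched in miniature. This is an illustrative assumption of what such a pass might look like rather than a substitute for a full data management platform; the record fields and function names are invented for the example.

```python
import hashlib
from collections import defaultdict
from datetime import datetime, timezone
from pathlib import Path

def audit(root: Path) -> list[dict]:
    """One record per file: location, size, last-modified time, digest."""
    records = []
    for p in sorted(root.rglob("*")):
        if p.is_file():
            records.append({
                "path": str(p.relative_to(root)),
                "bytes": p.stat().st_size,
                "modified": datetime.fromtimestamp(p.stat().st_mtime, timezone.utc),
                # A content hash finds copies regardless of filename or folder.
                "digest": hashlib.sha256(p.read_bytes()).hexdigest(),
            })
    return records

def duplicate_groups(records: list[dict]) -> list[list[str]]:
    """Paths sharing identical content; candidates for de-duplication."""
    by_digest: dict[str, list[str]] = defaultdict(list)
    for r in records:
        by_digest[r["digest"]].append(r["path"])
    return [paths for paths in by_digest.values() if len(paths) > 1]
```

Run over each storage location in turn, the resulting records give the snapshot the audit step calls for, and the duplicate groups identify the copies that can be removed to cut storage costs and shrink the attack surface.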

Planning ahead

As the financial sector increasingly relies on technology to move ahead with innovation, it must also address the associated risks. With the application date of DORA looming – bringing strict requirements including incident reporting, ICT risk management, operational resilience testing and third-party oversight – firms need to tackle their data challenges head-on by assessing their current situation and implementing sufficient data management practices.

Data sprawl is a significant challenge, but detailed audits and structured data management can reduce risks and enhance operational resilience. By identifying where data is sitting, eliminating any duplicates and integrating strict access controls, financial organisations can ensure compliance while simultaneously strengthening their defences against cyber threats.


Copyright © 2021 Futures Parity.