
Technology

The secret history of the internet: how web domains rule the world

Stuart Fuller, Domain Services Director at Com Laude

The internet has evolved in remarkable ways since its inception, transforming from a directory of static web pages in the early 90s to the interactive and immersive digital landscape everybody navigates today. Amidst these monumental shifts, the Domain Name System (DNS) – a critically important backbone of the web – has undergone transformative changes of its own.

In its nascent stages, the internet was envisioned as a far more linear place than its current iteration. Until 2000, many of the websites you could visit ended in .com, .edu, .gov, .mil, .org, .net or .int, with each of these top-level domains inextricably tied to its owner’s function, alongside country code domains such as .uk and .fr. If you visited a .com you would find a commercial entity, network infrastructure sat behind .net domains, and .org served those that didn’t quite fit elsewhere. That is no longer true of the internet today: with over 1,500 top-level domains in use, and .com, .net and .org now entirely unrestricted in who can own them, knowing the value of your domain has become more challenging for businesses in the online world.

These developments often go unrecognised, but with further change on the horizon announced by ICANN, the DNS’s administrator, it is time to take stock of just how far things have come, and to consider what a service used by over 5 billion people will look like in the years ahead.

How did we get here?

Prior to the 1990s, what would become the internet was predominantly restricted to academic researchers. Known as ARPANET and conceived by the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA), it was designed to facilitate research collaboration among universities and government entities. As the project yielded standardised protocols for communication across a network of computers, such as TCP/IP, it became the catalyst for a digital revolution that has shaped nearly every aspect of modern society.

During this period, an administrative organisation fulfilling technical functions for this ever-growing network was established by two scientists at the University of Southern California’s Information Sciences Institute – Jon Postel and Joyce K. Reynolds. Yet as the internet was predominantly used by academic researchers, it was merely one part of a collaborative effort across universities to maintain the network.

However, as access grew throughout the 90s, the demand to commercialise the network, and for government to regulate it, grew in turn. In 1993, the National Science Foundation, a U.S. government agency, privatised the domain name registry, and in 1995 it authorised the sale of generic domain names. This caused widespread dissatisfaction among internet users: it signalled a concentration of power over what had been envisioned as a decentralised system, while individual countries remained free to develop their own rules and regulations governing the sale and usage of their specific country codes.

In response, Postel drafted a paper proposing the creation of new top-level domains, in a bid to institutionalise his organisation. After it was ignored, Postel emailed eight of the regional root server operators, instructing them to recognise his organisation’s server, rather than the government-sanctioned one, as the authoritative root. They complied, dividing control of internet naming between Postel and the government.

After a furious reaction from government officials, Postel reversed the change. New arrangements over authority for the root servers were subsequently issued, and Postel died unexpectedly a few months later.

Following this, his organisation was subsumed into the newly created ICANN, which was designed to carry on its functions. As the internet became global, interest in fostering commercial competition was renewed and the number of domain names expanded dramatically.

As new demands came from how the internet was used, domain names were created to match. For example, with the introduction of internet access via mobile devices, .mobi was created in 2005, and as the Asia-Pacific region’s internet usage grew substantially, .asia followed. Large companies took notice of the value of these registered strings of characters, and in 2012 ICANN enabled businesses to apply for their own top-level domains. At present, 496 companies possess these, with examples ranging from .bmw for the automobile company all the way through to .sky for the television and broadband provider.

Recently, ICANN announced that there will be a second application round for these new top-level domains, currently pencilled in for 2026, presenting new opportunities for businesses to register their own piece of internet space. And, in a sense, Postel’s vision for a decentralised internet was realised: in 2016 ICANN ended its contract with the U.S. government and the organisation transitioned to oversight by the global internet community.

Where is this all going?

Although it may be impossible to predict how the internet will be used in the future, and what structures may change to adapt, there are interesting technological developments that could be transformative.

With the rise of blockchain technologies, driven by the rocketing use of cryptocurrencies, we could see further decentralisation of system ownership. Instead of registering internet space with an authority made up of a number of global stakeholders, blockchain systems can share ownership equally across every user, with potentially interesting, democratic implications for registering parts of that space.

Alternatively, with developments in metaverse technologies, we could see a new meaning applied to domain registration. As digital technologies and reality blur, this could mean staking claims over digital space on top of physical, or registering ownership over a rendered place in a virtual reality world.

An exciting future

Regardless of what the future brings, if history holds true, the boundaries of digital interaction will continue to be expanded and redefined. The evolution of the technology from an academic research tool to a fundamental part of people’s lives is nothing short of extraordinary. As these developments occur, they will undoubtedly bring new benefits in democratising information, entertainment, and connectivity, in a way that will shape the lives of everyone.


Business

How can businesses make the cloud optional in their operations?

Max Alexander, Co-founder at Ditto

Modern business apps are built to be cloud-dependent. This is great for accessing limitless compute and data storage capabilities, but when the connection to the cloud is poor or drops entirely, business apps stop working, impacting revenue and service. If real-time data is needed for quick decision-making in fields like healthcare, a stalled app can potentially put people in life-threatening situations.

Organisations in sectors as diverse as airlines, fast food retail, and ecommerce have deskless staff who need digital tools, accessible on smartphones, tablets and other devices, to do their jobs. Because of widespread connectivity issues and outages, these organisations are beginning to consider how to ensure those tools can operate reliably when the cloud is not accessible.

The short answer is that building applications with a local-first architecture can help to ensure that they remain functional when disconnected from the internet. So why aren’t all apps built this way? The simple answer is that building and deploying cloud-only applications is much easier, as ready-made tools for developers help expedite much of the backend building process. The more complex answer is that a local-first architecture solves the issue of offline data accessibility but not the critical issue of offline data synchronisation: apps disconnected from the internet still have no way to share data across devices. That is where peer-to-peer data sync and mesh networking come into play.

Combining offline-first architecture with peer-to-peer data sync

In the real world, what does an application like this look like?

  • Apps must prioritise local data sync. Rather than sending data to a remote server, applications must be able to write data to their local database in the first instance, then listen for changes from other devices and merge them as needed (see the sketch after this list). Apps should use local transports such as Bluetooth Low Energy (BLE) and Peer-to-Peer WiFi (P2P Wi-Fi) to communicate data changes in the event that the internet, a local server, or the cloud is not available.
  • Devices are capable of creating real-time mesh networks. Nearby devices should be able to discover one another, communicate, and maintain connections even in areas of limited or no connectivity.
  • Seamlessly transition from online to offline (and vice versa). Combining local sync with mesh networking means that devices in the same mesh are constantly updating a local version of the database and opportunistically syncing those changes with the cloud when it is available.
  • Partitioned between large-peer and small-peer mesh networks, so that smaller networks are not overwhelmed by trying to sync every piece of data. Smaller networks only sync the data they request, giving developers complete control over bandwidth usage and storage, which is vital when connectivity is erratic or critical data needs prioritising. Larger networks, by contrast, sync as much data as they can, typically when there is full access to cloud-based systems.
  • Ad hoc, so that devices can join and leave the mesh as they need to. This also means there can be no central server that other devices rely on.
  • Compatible with all data at any time. All devices should account for incoming data with different schemas, so that a device that is offline and running an outdated app version, for example, can still read new data and sync.
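
The sketch below, in Python, illustrates the core of this pattern: writes land in a local database first, listeners react to every change, changes arriving from peers are merged, and pending writes are flushed to the cloud only when it is reachable. The class name, the last-writer-wins merge rule, and the data shapes are assumptions made for illustration rather than a specific vendor’s API.

import time
from typing import Any, Callable, Dict, List

class LocalFirstStore:
    """Illustrative local-first document store. Writes go to the local
    database first, listeners are notified of every change, and changes
    from peers are merged with a simple last-writer-wins rule."""

    def __init__(self) -> None:
        self.docs: Dict[str, Dict[str, Any]] = {}   # the local database
        self.listeners: List[Callable[[str, Dict[str, Any]], None]] = []
        self.pending: List[str] = []                # writes awaiting cloud sync

    def write(self, doc_id: str, fields: Dict[str, Any]) -> None:
        # Write locally first; never block on the network.
        self.docs[doc_id] = {"fields": fields, "updated_at": time.time()}
        self.pending.append(doc_id)
        self._notify(doc_id)

    def on_change(self, listener: Callable[[str, Dict[str, Any]], None]) -> None:
        # Register a listener, e.g. a kitchen display reacting to new orders.
        self.listeners.append(listener)

    def merge_remote(self, doc_id: str, doc: Dict[str, Any]) -> None:
        # Apply a change received over BLE, P2P Wi-Fi, or the cloud.
        # Last-writer-wins keeps replicas convergent in this sketch.
        local = self.docs.get(doc_id)
        if local is None or doc["updated_at"] > local["updated_at"]:
            self.docs[doc_id] = doc
            self._notify(doc_id)

    def sync_to_cloud(self, cloud_is_reachable: bool) -> None:
        # Opportunistically flush pending writes when connectivity returns.
        if not cloud_is_reachable:
            return                                  # keep working on the local mesh
        while self.pending:
            doc_id = self.pending.pop()
            print(f"uploading {doc_id} to the backend")  # placeholder for a real upload

    def _notify(self, doc_id: str) -> None:
        for listener in self.listeners:
            listener(doc_id, self.docs[doc_id])

A production system would back this with an embedded database and real transports, and might use conflict-free replicated data types (CRDTs) rather than last-writer-wins where concurrent edits matter.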

Peer-to-peer sync and mesh networking in practice

Let’s take a look at a point-of-sale application in the fast-paced environment of a quick-service restaurant. When an order is taken at a kiosk or counter, that data must travel hundreds of miles to a data centre only to arrive at a device four metres away in the kitchen. This is an inefficient process and can slow down or even halt operations, especially if there is an internet outage or any issue with the cloud.
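
Using the illustrative store sketched above, the kiosk-to-kitchen flow might look like the following; the device roles and order fields are hypothetical.

# Hypothetical kiosk-to-kitchen flow built on the LocalFirstStore sketch above.
kitchen_display = LocalFirstStore()
kiosk = LocalFirstStore()

# The kitchen reacts to any order that reaches its local replica.
kitchen_display.on_change(lambda doc_id, doc: print("fire order:", doc["fields"]))

# The kiosk records the order locally; no round trip to a distant data centre.
kiosk.write("order-1042", {"item": "burger meal", "station": "grill"})

# In practice the mesh (BLE or P2P Wi-Fi) relays the change a few metres away.
kitchen_display.merge_remote("order-1042", kiosk.docs["order-1042"])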

A major fast-food restaurant in the US has already modernised its point of sale system using this new architecture and created one that can move order data between store devices independently of an internet connection. As such, this system is much more resilient in the face of outages, ensuring employees can always deliver best-in-class service, regardless of internet connectivity.

The vast power of cloud-optional computing is showcased in healthcare situations in rural areas in developing countries. By using both peer-to-peer data sync and mesh networking, essential healthcare applications can share critical health information without the Internet or a connection to the cloud. This means that healthcare workers in disconnected environments can now quickly process information and share it with relevant colleagues, empowering faster reaction times that can save lives.

Although the shift from cloud-only to cloud-optional is subtle and will not be obvious to end users, it really is a fundamental paradigm shift. This move provides a number of business opportunities for increasing revenue and efficiencies and helps ensure sustained service for customers.


Business

How 5G is enhancing communication in critical sectors

Luke Wilkinson, MD, Mobile Tornado

In critical sectors where high-stakes situations are common, effective communication is non-negotiable. Whether it’s first responders dealing with a crisis or a construction team coordinating a complex project, the ability to share information quickly and reliably can mean the difference between success and failure.

Long-distance mobile communication became feasible in the 1950s, when wireless network connectivity was first used in mobile radio-telephone systems, often with push-to-talk (PTT) technology. As private companies invested in cellular infrastructure, the networks expanded and data speeds steadily improved. Each major leap forward in mobile network capabilities was classed as a different generation, and thus 1G, 2G, 3G, 4G, and now 5G were born.

5G is the fifth generation of wireless technology and has been gradually rolled out since 2019 when the first commercial 5G network was launched. Since then, the deployment of 5G infrastructure has been steadily increasing, with more and more countries and regions around the world adopting this cutting-edge technology.

Its rollout has been particularly significant for critical sectors that rely heavily on push-to-talk over cellular (PTToC) solutions. With 5G, PTToC communications can be carried out with higher bandwidth and speed, resulting in clearer and more seamless conversations, helping to mitigate risks in difficult scenarios within critical sectors.

How is 5G benefiting businesses?

According to Statista, by 2030, half of all connections worldwide are predicted to use 5G technology, increasing from one-tenth in 2022. This showcases the rapid pace at which 5G is becoming the standard in global communication infrastructure.

But what does this mean for businesses? Two of the key advances under 5G are greater bandwidth and faster download speeds, facilitating faster and more reliable communication within teams. PTToC solutions can harness the capabilities of 5G and bring the benefits to the critical sectors that need them most, whether that’s public safety, security, or logistics: the use cases are extensive. For example, 5G’s increased bandwidth can enable larger group calls and screen sharing for more effective communication.

Communication between workers in critical industries can be difficult, as the workforces are often made up of lone workers or small groups of individuals in remote locations. PTToC is indispensable in these scenarios, providing quick and secure communication along with additional features such as real-time location information and the ability to send SOS alerts. PTToC with 5G works effectively in critical sectors because PTToC solutions are designed to operate across varied network conditions, falling back to 2G and 3G where necessary. This ensures that communication remains reliable and efficient even in countries or areas where 5G infrastructure is not fully deployed, keeping remote, lone workers safe and secure.
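
As a loose illustration of that fallback behaviour, the Python sketch below picks call settings for a push-to-talk client based on whichever network generation is currently available. The profile names and bitrates are assumptions made for illustration, not values from any particular PTToC product.

# Illustrative only: choose PTToC call settings from whichever radio
# generation the device can currently reach. All figures are assumed.
PROFILES = {
    "5G": {"video": True,  "video_kbps": 2500, "audio_kbps": 64},
    "4G": {"video": True,  "video_kbps": 800,  "audio_kbps": 48},
    "3G": {"video": False, "video_kbps": 0,    "audio_kbps": 24},
    "2G": {"video": False, "video_kbps": 0,    "audio_kbps": 12},
}

def select_profile(available_networks):
    # Prefer the newest generation in range, falling back to older networks
    # so lone workers stay reachable even where 5G coverage is missing.
    for generation in ("5G", "4G", "3G", "2G"):
        if generation in available_networks:
            return {"network": generation, **PROFILES[generation]}
    raise RuntimeError("no cellular network available")

print(select_profile(["3G", "2G"]))  # a remote site without 4G or 5G coverage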

The impact of 5G on critical communications

The International Telecommunication Union has reported that 95 percent of the world’s population can access a mobile broadband network. This opens up a world of new possibilities for PTToC, particularly when harnessing the new capabilities of 5G as it is rolled out.

One of the most significant improvements brought by 5G is within video communications, which most PTToC solutions now offer. Faster speeds, higher bandwidth, and lower latency enhance the stability and quality of video calls, which are crucial in critical sectors. After all, in industries like public safety, construction, and logistics, the importance of visual information for effective decision-making and situational awareness cannot be overstated. 5G enables the real-time transmission of high-quality video, allowing for effective coordination and response strategies, ultimately improving operational outcomes and safety measures.

Challenges in adopting 5G in critical sectors

While the benefits of 5G are undeniable, the industry faces some challenges in its widespread adoption. Network coverage and interoperability are two key concerns that need to be addressed to ensure communication can keep improving in critical sectors.

According to the International Telecommunication Union, older-generation networks are being phased out in many countries to allow for collaborative 5G standards development across industries. Yet, particularly in lower-income countries in Sub-Saharan Africa, Latin America, and Asia-Pacific, there is a need for infrastructure upgrades and investment to support 5G connectivity. The potential barriers to adoption, including device accessibility, the expense of deploying the new networks, and regulatory issues, must be carefully navigated to help countries make the most out of 5G capabilities within critical sectors and beyond.

The rollout of 5G does, however, raise data security concerns for mission-critical communications and operations, as mobile networks present an expanded attack surface. Nonetheless, IT professionals, including PTToC developers, have the means to safeguard remote and lone workers and to shield corporate and employee data. Encryption, authentication, remote access, and offline functionality are vital attributes that tackle emerging data threats both on devices and during transmission. Deploying this multi-tiered strategy alongside regular updates substantially diminishes the vulnerabilities associated with 5G mobile networks and devices within critical sectors.

While the challenges faced by the industry must be addressed, the potential of 5G to enhance communication and collaboration is undeniable. As its rollout continues to gain momentum, the benefits in critical sectors are becoming increasingly evident: the faster, more reliable, and more efficient communication that 5G enables is crucial for industries that rely on real-time information exchange and decision-making.

Looking ahead, the potential for further advancements and increased adoption of 5G in critical sectors is truly exciting. As the industry continues to address the challenges faced, such as network coverage, interoperability, and data security concerns, we can expect to see even greater integration of this technology across a wide range of mission-critical applications for critical sectors.


Auto

Could electric vehicles be the answer to energy flexibility?

Rolf Bienert, Managing and Technical Director, OpenADR Alliance

Last year, the then Department for Business, Energy & Industrial Strategy and Ofgem published their Electric Vehicle Smart Charging Action Plan to unlock the power of electric vehicle (EV) charging. Owners would have the opportunity to charge their vehicles and to power their homes with excess electricity stored in their car.

Known as vehicle-to-grid (V2G) or vehicle-to-everything (V2X), this is the communication between a vehicle and another entity: the transfer of electricity stored in an EV to the home, the grid, or other destinations. V2X requires bi-directional energy flow between the charger and the vehicle, and bi- or unidirectional flow from the charger to the destination, depending on how it is being used.

While there are V2X pilots already out there, it’s considered an emerging technology. The Government is backing it with its V2X Innovation Programme, which aims to address barriers to enabling energy flexibility from EV charging. Phase 1 will support the development of V2X bi-directional charging prototype hardware, software or business models, while phase 2 will support small-scale V2X demonstrations.

The programme is part of the Flexibility Innovation Programme which looks to enable large-scale widespread electricity system flexibility through smart, flexible, secure, and accessible technologies – and will fund innovation across a range of key smart energy applications.

As part of the initiative, the Government will also fund Demand Side Response (DSR) projects activated through both the Innovation Programme and its Interoperable Demand Side Response Programme (IDSR) designed to support innovation and design of IDSR systems. DSR and energy flexibility is becoming increasingly important as demand for energy grows.

The EV potential

EVs offer a potential energy resource, especially at peak times when the electricity grid is under pressure. Designed to power cars weighing two tonnes or more, EV batteries are large, especially when compared to other potential energy resources.

While a typical home solar battery system stores around 10kWh, electric car batteries start at around 30kWh. A Jaguar i-Pace has an 85kWh battery and the Tesla Model S a 100kWh battery, which offers a much larger resource. This means that a fully charged EV could support an average home for several days.
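
To put those figures in rough perspective, the back-of-the-envelope calculation below assumes an average household uses around 8kWh of electricity per day; that figure is a round number chosen for illustration.

# Back-of-the-envelope: how many days could a full EV battery run an average home?
# The daily household consumption figure is an assumed round number.
HOUSEHOLD_KWH_PER_DAY = 8
BATTERY_KWH = {"smaller EV": 30, "Jaguar i-Pace": 85, "Tesla Model S": 100}

for car, capacity in BATTERY_KWH.items():
    days = capacity / HOUSEHOLD_KWH_PER_DAY
    print(f"{car}: roughly {days:.1f} days of household supply")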

But to make this a reality, the technology needs to be in place to ensure a stable, reliable and secure supply of power. Most EV charging systems are already connected via apps and control platforms with pre-set systems, so they are easy to access and easy to use. But owners will need to factor in possible additional hardware costs, including inverters for charging and discharging the power.

The vehicle owner must also have control over what they want to do: for example, how much of the charge in the car battery they want to make available to the grid and how much they want to leave in the vehicle.
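
In practice, that control might amount to something as simple as an owner-defined reserve below which the car will not discharge. The sketch below shows the idea; the capacity, state of charge, and reserve figures are illustrative assumptions.

# Toy export calculation: the owner keeps a driving reserve, and only the
# remainder is offered to the grid. All figures are illustrative.
def exportable_kwh(capacity_kwh, state_of_charge, reserve):
    # Energy available for export, never dipping below the owner's reserve.
    stored = capacity_kwh * state_of_charge
    floor = capacity_kwh * reserve
    return max(0.0, stored - floor)

print(exportable_kwh(85, state_of_charge=0.9, reserve=0.4))  # 42.5 kWh offered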

The concept of bi-directional charging means that vehicles need to be designed with bi-directional power flow in mind, and Electric Vehicle Supply Equipment will have to be upgraded to Electric Vehicle Power Exchange Equipment (EVPE).

Critical success factors

Open standards will also be critical to the success of this opportunity, and to ensuring the charging infrastructure for V2X and V2G use cases is fit for purpose.

There are also lifecycle implications for the battery that need to be addressed, as bi-directional charging can lead to degradation and a shortened battery life. EVs are typically sold with an eight-year battery warranty, but this depends on the model, so drivers might be reluctant to add extra wear and tear, or to pay for new batteries before their time.

There is also the question of power quality. With more and more high-powered inverters pushing power into the grid, power quality could fall below standard, and that may require periodic grid code adjustments.

But before this becomes reality, it has to be something that EV owners want. The industry is looking to educate users about the benefits and opportunities of V2X, but is it enough? We need a unified message, from automotive companies and OEMs, to government, and a concerted effort to promote new smart energy initiatives.

While plans are not yet agreed with regard to a ban on the sale of new petrol and diesel vehicles, figures from the IEA suggest that by 2035, one in four vehicles on the road will be electric. So it’s time to raise awareness of the opportunities of these programmes.

With trials already happening in the UK, US, and other markets, I’m optimistic that this technology could become a genuine disruptor.

