Transforming the Internet with Edge Computing

How we use the Internet is changing. Access to connectivity is no longer just a gateway to consume information, but also a means to produce it, in colossal volumes. With a continued explosion in the number and intricacy of connected devices, the Internet is more pervasive than ever before.

In the past decade, we have witnessed the centralisation and consolidation of the Internet, fundamentally upending the way in which services are provisioned. This shift has taken hold under the banner of cloud computing, a revolutionary development which has simplified access to inexpensive compute and storage resources.

In spite of its earth-shattering success, an Internet based solely on the concept of cloud computing is not an Internet suited to the applications of tomorrow. As we sit on the cusp of an Internet of Things (IoT) revolution, the physical distance that information must traverse is becoming a critical differentiator.

A Cloud Burst is Inbound with the Edge

Edge computing is founded on the concept of pushing processing resources from centralised facilities to the extreme periphery of a network. It paves the way for faster, more efficient analysis of data by reducing the geographical distance between the devices that generate data and the compute resources that interpret it.

This drive to distribute processing power across multiple distinct networks stands in stark contrast to the cloud computing model, in which every bit of data produced by sensors must be transported from the network edge to a centralised data centre.

As you might guess, hauling data across vast distances exerts unnecessary pressure on the backbone infrastructure that supports the Internet. By eliminating much of this long-haul traffic, edge computing mitigates capacity bottlenecks and improves latency.
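To put rough numbers on that latency benefit, here is a minimal back-of-the-envelope sketch in Python. It considers propagation delay alone, assuming light travels through fibre at roughly two-thirds of its speed in a vacuum, and ignoring queueing, processing and protocol overheads; the 2,000 km and 50 km distances are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope propagation delay over fibre: distance alone,
# ignoring queueing, processing and protocol overheads (illustrative).

SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum, km/s
FIBRE_FACTOR = 2 / 3            # light in fibre travels at roughly 2/3 c

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over fibre."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

# Hypothetical distances: a faraway cloud region vs. a nearby edge site.
print(f"Central cloud (2,000 km): {round_trip_ms(2000):.1f} ms")  # ~20.0 ms
print(f"Edge site        (50 km): {round_trip_ms(50):.2f} ms")    # ~0.50 ms
```

Even before congestion is taken into account, distance alone accounts for an order-of-magnitude difference in round-trip time.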

Time is money, and edge computing will reduce downtime. Decentralisation creates multiple concurrent layers of physical redundancy, so outages remain isolated and limited in magnitude. It also enhances security: with information effectively quarantined in small, secure pockets across the network edge, it becomes significantly more difficult for malicious actors to access data in bulk.

The Edge is On-Demand and Omnipresent

The thought of initiating a dramatic and fundamental change in how the Internet works has, understandably, raised many eyebrows. From the perspective of implementation, a cloud of doubt hangs over the definition of the edge - where is it?

For fixed and mobile operators, perhaps the most prudent way to adopt an edge architecture is to place compute resources across their portfolios of cell sites, aggregation nodes and central office sites. In this model, the network edge lies where an operator's infrastructure is closest to its customers.
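As a toy illustration of this "nearest infrastructure" principle, the sketch below selects the closest site from a hypothetical operator portfolio using the haversine great-circle formula. The site names and coordinates are invented for illustration; a real operator would also weigh capacity and load, not just distance.

```python
import math

# Hypothetical operator portfolio: (name, latitude, longitude).
EDGE_SITES = [
    ("cell-site-a",        54.60, -5.93),
    ("aggregation-node-b", 54.35, -6.65),
    ("central-office-c",   53.35, -6.26),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))  # Earth radius ~6,371 km

def nearest_site(user_lat, user_lon):
    """Return the portfolio site geographically closest to the user."""
    return min(EDGE_SITES, key=lambda s: haversine_km(user_lat, user_lon, s[1], s[2]))

name, lat, lon = nearest_site(54.58, -5.90)  # a user near cell-site-a
print(f"Route workload to {name}")
```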

If implemented, this operator-controlled architecture could support the provision of in-house services such as streaming, in the form of an integrated content delivery network (CDN).

Data centre owners hold similar ambitions. To fulfil them, they will need to decentralise their facilities and expand into smaller cities and towns, while maintaining an adequate level of interconnectivity via fibre to multiple telecom operators.

This architecture would allow data centre owners to differentiate their services from those of fixed and mobile operators through a high level of interconnectivity - key for applications such as facial recognition, which must access databases in the central cloud.

Finally, there exists another, more abstract approach to enabling edge computing: using software to intelligently aggregate and exploit decentralised compute resources at the edge. This architecture effectively transforms edge infrastructure into something that is on-demand and omnipresent.

Of course, the core purpose of aggregating multiple edge facilities is to create a unified platform on which software developers can test and deploy their products.

Imagine a developer intends to deploy their augmented reality (AR) game across markets in Europe and North America. In this case, edge computing is required to minimise device-to-server latency.

To monetise the game, analytics and advertising services must run atop it, served from different servers on another cloud platform. Here, it is server-to-server latency that matters most. With a neutral, software-defined architecture, the game can be deployed with relative ease across a large geographical area.
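As a minimal sketch of how such a software-defined layer might choose where to deploy, the Python below filters an aggregated pool of edge sites against both latency budgets: device-to-server for the game itself, and server-to-server for the analytics and advertising back end. All site names, regions and figures are hypothetical assumptions; a real platform would measure these continuously rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class EdgeSite:
    name: str
    region: str
    device_rtt_ms: float   # assumed median RTT from players in the region
    cloud_rtt_ms: float    # assumed RTT to the analytics/ads cloud platform

# Hypothetical measurements across an aggregated pool of edge sites.
POOL = [
    EdgeSite("eu-dub-1", "europe",        12.0, 18.0),
    EdgeSite("eu-fra-1", "europe",        25.0,  9.0),
    EdgeSite("na-nyc-1", "north-america", 14.0, 22.0),
    EdgeSite("na-dal-1", "north-america", 40.0, 11.0),
]

def plan_deployment(pool, device_budget_ms=20.0, cloud_budget_ms=25.0):
    """Per region, keep sites within both budgets and, of those,
    pick the one with the lowest server-to-server RTT."""
    plan = {}
    for site in pool:
        if site.device_rtt_ms > device_budget_ms or site.cloud_rtt_ms > cloud_budget_ms:
            continue
        best = plan.get(site.region)
        if best is None or site.cloud_rtt_ms < best.cloud_rtt_ms:
            plan[site.region] = site
    return plan

for region, site in plan_deployment(POOL).items():
    print(f"{region}: deploy game to {site.name} "
          f"(device {site.device_rtt_ms} ms, cloud {site.cloud_rtt_ms} ms)")
```

The developer expresses two latency budgets once, and the software-defined layer resolves them against whatever edge capacity happens to be available in each market.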

Conclusion: Supporting a Data Tidal Wave with Edge Computing

To continue supporting an open and diverse ecosystem of services, the Internet and its underlying infrastructure need to evolve. Make no mistake, we are in the midst of a connectivity explosion, with over 75 billion Internet of Things devices predicted to exist by 2025[1]. This is why a paradigm shift is occurring - the mass production of data at the edge.

While the cloud has successfully facilitated the connectivity explosion thus far, transporting every bit of information from the network edge to a centralised data centre is neither a sustainable nor a secure architecture. Challenges lie ahead, particularly around power and size restrictions. But, for the Internet and our digital society, edge computing is another stepping stone of progress.


[1] Statista, “Internet of Things (IoT) connected devices installed base worldwide from 2015 to 2025”.

Posted on 3 Jul 2019.
