Over the next few years, we will see an exponential increase in the number of mobile and machine-to-machine (M2M) devices, which will handle significant amounts of IP traffic.
Tomorrow's consumers will demand faster Wi-Fi services and will use more mobile apps. Many IoT devices, such as autonomous vehicles, will require real-time, low-latency communication with local processing resources to ensure safety.
There is no doubt that current IP networks cannot manage the high-speed data transmission that tomorrow's connected devices will require.
In a traditional IP architecture, data often travels hundreds of miles across a network between end users or devices and cloud resources. The result is latency: slow delivery of time-sensitive data.
The solution to reducing latency lies in edge computing. By distributing cloud-based IT services across edge data centers in localized areas, IT resources can effectively be brought closer to end users and devices.
This enables efficient, high-speed exchange of applications and data. For this reason, edge data centers are usually located at the edge of a network, with connections back to a centralized core cloud.
Edge computing at the service of the data center
Many new technologies will use and benefit from edge data centers, including fifth-generation mobile networks (5G), the Internet of Things (IoT) and Industrial Internet of Things (IIoT), autonomous vehicles, virtual and augmented reality, and artificial intelligence.
Organisations investing in edge data center deployments will gain an immediate competitive advantage, able to deliver faster and more reliable services and applications.
To support IoT, IIoT and other next-generation technologies, it is essential to expand data and network capacity, reduce latency and ensure faster processing of data and applications.
Instead of bringing users and devices to the data center, it is better to bring the power of the data center to users and devices. Edge computing is based on a distributed data center architecture in which cloud IT servers, hosted in edge data centers, are distributed along the outer edges of a network.
By bringing IT resources closer to end users and devices, we can achieve high-speed, low-latency processing of applications and data.
The problem of latency
Latency is caused by a number of factors, including the physical distance data must travel between centralized cloud servers and end-user devices, the number of network hops between switches along the path, and the amount of traffic on the network. For consumers, latency is usually just a nuisance: it might mean a slow movie download or lag in an online video game.
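To make the distance-plus-hops intuition concrete, here is a minimal back-of-the-envelope sketch. The constants are illustrative assumptions (signal speed in fiber of roughly 200 km/ms, a fixed per-hop delay), not measurements from any real network:

```python
# Illustrative one-way latency estimate. All figures are assumptions
# chosen for the example, not properties of a specific network.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # light in fiber travels ~2/3 the speed of light
PER_HOP_DELAY_MS = 0.5            # assumed switching/queuing delay per network hop

def one_way_latency_ms(distance_km: float, hops: int,
                       per_hop_ms: float = PER_HOP_DELAY_MS) -> float:
    """Propagation delay over fiber plus a fixed processing delay per hop."""
    propagation = distance_km / SPEED_IN_FIBER_KM_PER_MS
    return propagation + hops * per_hop_ms

# A centralized cloud ~800 km and 12 hops away vs. an edge site ~5 km and 2 hops away
cloud = one_way_latency_ms(800, 12)  # 4.0 ms propagation + 6.0 ms of hops = 10.0 ms
edge = one_way_latency_ms(5, 2)      # 0.025 ms propagation + 1.0 ms of hops ≈ 1.0 ms
print(f"cloud: {cloud:.3f} ms, edge: {edge:.3f} ms")
```

Even with generous assumptions, the edge path wins on both terms: shorter distance shrinks propagation delay, and fewer hops shrink switching delay.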
For IoT and M2M devices, network latency can be an important obstacle, especially for devices that rely on guaranteed response times and real-time processing of data and applications.
A good example is the autonomous vehicle, which relies on interactions between on-board and external devices to ensure the safety of its passengers, other drivers and pedestrians. In this case, an edge data center could be installed at a blind intersection to operate a smart traffic light. It could be a rapidly deployed enclosure on a nearby street corner, or even a small box containing a microprocessor at the base of the traffic light itself. If sensors detect that another car is about to speed through a red light at the intersection, the servers in the edge data center would immediately run a warning application, directing the smart traffic light to signal the autonomous vehicle to apply its brakes.
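The traffic-light scenario can be sketched as a simple decision routine running on the edge servers. Everything here is hypothetical: the sensor fields, the deceleration threshold and the function names are illustrative, not part of any real traffic system:

```python
# Hypothetical sketch of the edge server's decision logic for the
# smart-traffic-light scenario. Names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class SensorReading:
    vehicle_id: str
    speed_kmh: float
    distance_to_stop_line_m: float
    light_is_red: bool

def will_run_red_light(r: SensorReading,
                       max_decel_ms2: float = 3.0) -> bool:
    """Flag a vehicle that cannot stop before the line at a red light."""
    if not r.light_is_red:
        return False
    speed_ms = r.speed_kmh / 3.6
    # Kinematics: stopping distance = v^2 / (2 * deceleration)
    stopping_distance_m = speed_ms ** 2 / (2 * max_decel_ms2)
    return stopping_distance_m > r.distance_to_stop_line_m

def handle_reading(r: SensorReading) -> str:
    # Processed locally at the edge, so the warning reaches the traffic
    # light in milliseconds instead of waiting on a cloud round trip.
    if will_run_red_light(r):
        return f"WARN: broadcast brake signal for {r.vehicle_id}"
    return "OK"
```

The point of the sketch is the placement, not the physics: because the check runs in the edge data center next to the intersection, the warning can be issued within the reaction window that a round trip to a distant cloud would miss.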
What is an edge data center
Edge data centers are small compared to traditional data centers. They are strategically located close to those (users or IoT devices) who use and generate the data, and they usually connect to one or more larger central data centers.
By processing data and services as close as possible to the end user, edge computing allows organizations to reduce latency and improve the customer experience.
It should be stressed that an edge data center qualifies as a true data center, not just a simple network node. An edge data center includes the same power, cooling, connectivity and security features found in a centralized data center, but on a smaller scale. In addition, IT deployments in an edge data center handle application processing, data analysis and data storage in the vicinity of the end users and devices that use those applications and data.
An edge data center is located on the outer edges of an IP network and connects to a centralized core cloud in a data center usually located some distance away. In addition, a group of edge data centers will often be interconnected to form an aggregated edge cloud that creates a shared pool of localized processing, storage and network resources.
Edge data centers take many forms: modular, containerized, micro, warehouse and office-based. All types require the same infrastructure elements as larger centralized data centers. Ultra-high-bandwidth, low-latency connectivity, high-speed fibre or copper structured cabling, cable management solutions and automated infrastructure management (AIM) tools are all key elements.
New technologies such as 5G networks, smart cities and autonomous vehicles will create a constant need for low-latency local computing power, maximizing the value of edge data centers.
Learn more in the dedicated monograph
Download the free monograph