
What is edge computing and why does it matter?

What is Edge Computing?

Gartner defines edge computing as “a part of a distributed computing topology in which information processing is located close to the edge – where things and people produce or consume that information.”

At its most basic level, edge computing brings computation and data storage closer to the devices where data is being gathered, rather than relying on a central location that can be thousands of miles away. This is done so that data, especially real-time data, does not suffer latency issues that can affect an application’s performance. Companies can also save money by having the processing done locally, reducing the amount of data that needs to be processed in a centralized or cloud-based location. Edge computing was developed because of the exponential growth of IoT devices, which connect to the internet to either receive information from the cloud or deliver data back to the cloud. Many IoT devices generate enormous amounts of data during the course of their operations.


Think about devices that monitor manufacturing equipment on a factory floor, or an internet-connected video camera that sends live footage from a remote office. A single device can produce and transmit its data across a network quite easily, but problems arise as the number of devices transmitting at the same time grows. Instead of one video camera sending live footage, multiply that by hundreds or thousands of devices: quality suffers due to latency, and bandwidth costs become tremendous.

Edge-computing hardware and services help solve this problem by providing a local source of processing and storage for many of these systems. For example, an edge gateway can process data from an edge device and then send only the relevant data back through the cloud, reducing bandwidth needs. Or it can send data back to the edge device in the case of real-time application needs.
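
As a rough, hypothetical illustration of that pattern, the sketch below shows an edge gateway that aggregates raw sensor readings locally and forwards only the unusual ones upstream. The read_sensor function, the 10% deviation threshold, and the forward_to_cloud stub are assumptions made for this example, not part of any specific edge product or cloud API.

    # Minimal sketch of the edge-gateway pattern described above.
    # read_sensor, forward_to_cloud, and the 10% threshold are
    # illustrative stand-ins, not a real device or vendor API.
    import json
    import random
    import statistics

    def read_sensor() -> float:
        # Stand-in for polling a local edge device (e.g. a temperature probe).
        return 20.0 + random.gauss(0, 1)

    def forward_to_cloud(payload: dict) -> None:
        # A real gateway would POST this to a cloud ingest endpoint
        # (HTTPS, MQTT, etc.); here we simply print the reduced payload.
        print("to cloud:", json.dumps(payload))

    def gateway_cycle(window: int = 60) -> None:
        # Collect a window of readings locally instead of streaming them all.
        readings = [read_sensor() for _ in range(window)]
        baseline = statistics.mean(readings)
        # Only readings that deviate more than 10% from the local baseline
        # are treated as "relevant" and sent upstream, cutting bandwidth.
        anomalies = [r for r in readings if abs(r - baseline) / baseline > 0.10]
        forward_to_cloud({"baseline": round(baseline, 2),
                          "anomaly_count": len(anomalies),
                          "anomalies": [round(r, 2) for r in anomalies]})

    if __name__ == "__main__":
        gateway_cycle()

The specific numbers are placeholders; the point is the shape of the flow: raw data stays at the edge, and only a small, relevant summary crosses the network to the cloud.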

These edge devices can include an IoT sensor, an employee’s notebook computer, a smartphone, a security camera, or even the internet-connected microwave oven in the office break room. Edge gateways are themselves considered edge devices within an edge-computing infrastructure.

Why does edge computing matter?


For many companies, the biggest benefit of edge computing is the ability to process and store data faster, enabling more efficient real-time applications that are critical to the business. Before edge computing, a smartphone scanning a person’s face for facial recognition would need to run the facial-recognition algorithm through a cloud-based service, which would take a lot of time to process. With an edge computing model, the algorithm could run locally on an edge server or gateway, or even on the smartphone itself, given the increasing power of smartphones.
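
As a purely illustrative sketch of that trade-off, the snippet below prefers an on-device model when one is available and falls back to a cloud service otherwise. The run_local_model and call_cloud_service functions and the latency figures are stand-ins invented for this example, not a real facial-recognition API.

    # Illustrative sketch of the local-vs-cloud decision described above.
    # run_local_model and call_cloud_service are hypothetical stand-ins;
    # the latency numbers are placeholders, not measurements.
    import time

    def run_local_model(image: bytes) -> str:
        # Pretend on-device inference: fast, no network hop.
        time.sleep(0.03)            # ~30 ms on the device itself
        return "match: user_42"

    def call_cloud_service(image: bytes) -> str:
        # Pretend cloud inference: upload + remote processing + download.
        time.sleep(0.40)            # ~400 ms round trip over the WAN
        return "match: user_42"

    def recognize(image: bytes, has_local_model: bool) -> str:
        # Prefer the edge path when a local model is present; fall back
        # to the centralized service otherwise.
        if has_local_model:
            return run_local_model(image)
        return call_cloud_service(image)

    if __name__ == "__main__":
        frame = b"\x00" * 1024      # stand-in for a captured camera frame
        for local in (False, True):
            start = time.perf_counter()
            result = recognize(frame, has_local_model=local)
            elapsed = (time.perf_counter() - start) * 1000
            print(f"local={local}: {result} in {elapsed:.0f} ms")

In the simulated run, the edge path skips the wide-area round trip, which is exactly the latency advantage described above.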

“Edge computing has evolved significantly from the days of isolated IT at ROBO [Remote Office Branch Office] locations,” says Kuba Stolarski, a research director at IDC, in the “Worldwide Edge Infrastructure (Compute and Storage) Forecast, 2019-2023” report. “With enhanced interconnectivity enabling improved edge access to more core applications, and with new IoT and industry-specific business use cases, edge infrastructure is poised to be one of the main growth engines in the server and storage market for the next decade and beyond.”


Written by Edward Faulkner
