As the rise of connected devices drives a surge in demand for real-time data processing, edge computing is expected to see growing adoption. For those unfamiliar with the technology, the question remains: what is edge computing, and how does it work?
Edge Computing
According to the Open Glossary of Edge Computing, edge computing is "the delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services."
By shortening the distance between devices and the cloud resources that serve them, and by reducing network hops, edge computing mitigates the latency and bandwidth constraints of today's Internet, ushering in new classes of applications.
In practical terms, this means distributing new resources and software stacks along the path between today's centralized data centers and the rapidly growing number of devices in the field, concentrated in particular, though not exclusively, near the last-mile network, on both the infrastructure side and the device side.
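To make the latency benefit concrete, here is a minimal sketch of one common pattern: a client probes a handful of candidate endpoints and routes its traffic to whichever answers fastest, which in practice tends to be a nearby edge node rather than a distant central data center. The URLs and function names are hypothetical placeholders for illustration, not part of any specific edge platform's API.

```python
import time
import urllib.request

# Hypothetical endpoints: one centralized data center and two edge locations.
# These URLs are placeholders for illustration only.
ENDPOINTS = [
    "https://central.example.com/health",
    "https://edge-east.example.com/health",
    "https://edge-west.example.com/health",
]

def measure_rtt(url: str, attempts: int = 3) -> float:
    """Return the median round-trip time (in seconds) for a small request."""
    samples = []
    for _ in range(attempts):
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=2) as resp:
            resp.read()
        samples.append(time.monotonic() - start)
    samples.sort()
    return samples[len(samples) // 2]

def pick_nearest(endpoints: list[str]) -> str:
    """Route traffic to whichever endpoint responds fastest."""
    return min(endpoints, key=measure_rtt)

if __name__ == "__main__":
    best = pick_nearest(ENDPOINTS)
    print(f"Lowest-latency endpoint: {best}")
```

Real deployments usually delegate this selection to DNS, anycast routing, or a CDN control plane rather than the client itself, but the underlying idea is the same: serve the request from the computing resource closest to the device.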