We’re hearing a lot about so-called “edge computing,” largely because the devices on the fringes of centralized systems (smartphones, laptops, tablets, IoT devices) can now store far more data, and do far more with it, than they could in years past.
Definition: Edge computing is a method of optimizing cloud computing systems by performing data processing at the edge of the network, near the source of the data. Because analytics and knowledge generation happen at or near that source, far less communications bandwidth is needed between sensors and the central data center.
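To make the bandwidth savings concrete, here is a minimal sketch of the idea in Python. It assumes a hypothetical temperature sensor that produces one reading per second; rather than streaming every raw value to the data center, the edge device computes a compact summary locally and forwards only that. The function name and thresholds are illustrative, not part of any real API.

```python
import json
import statistics

def summarize_readings(readings, threshold=30.0):
    """Edge-side aggregation (hypothetical): reduce many raw sensor
    readings to one small summary payload, flagging values above
    an anomaly threshold so they are never silently averaged away."""
    summary = {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "anomalies": [r for r in readings if r > threshold],
    }
    # Only this short JSON string leaves the edge device,
    # instead of every individual reading.
    return json.dumps(summary)

# Simulated raw readings collected at the edge: six values in,
# one compact message out.
raw = [21.5, 21.7, 22.0, 21.9, 35.2, 21.8]
payload = summarize_readings(raw)
```

The same pattern scales up: the more raw samples the device collects between transmissions, the greater the reduction in traffic to the central data center, at the cost of doing some computation locally.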