According to Gartner, 25 billion connected devices will comprise the Internet of Things (IoT) by 2025. But even now, with IoT in its infancy, the data volumes being produced by networked devices worldwide are congesting data pipelines in the cloud.
As IoT continues to grow, two problems are emerging: there isn't enough bandwidth to carry all of the data being created, and a remote, centralized data center can't respond quickly enough to deliver the productivity and downtime-reduction benefits IoT adopters will expect. The current cloud-based model simply can't manage the huge volumes of data these connected devices generate, which is why fog computing has become one of the hottest topics in the data center industry.
Fog computing – a term coined by Cisco – serves as a bridge between connected IoT devices and remote data centers. The solution to these huge data volumes, as we've learned in the data center, is intelligent controllers and gateways that collect data from devices operating in close proximity to one another, or from the devices that make up a single system. This local layer of collection and control, which is the foundation of the fog computing model, addresses several of the most significant challenges posed by IoT.
Similar in philosophy to the micro data centers now being deployed at the network edge to provide faster access to content and applications, fog computing connects multiple small networks of industrial systems into one large network across an enterprise. In the fog, application services are distributed across smart devices and micro data centers, and data is aggregated and filtered locally so that only actionable data is transmitted upstream. The result is a more efficient and effective way of handling the immense volume of data generated by IoT sensors, while preserving bandwidth for the traffic that actually matters.
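To make that aggregate-and-filter idea concrete, here is a minimal sketch of how a fog gateway might behave. Everything in it is a hypothetical choice for illustration, not a reference to any particular product: the `FogGateway` class, the alert threshold, and the ten-reading summary window are all assumptions. The pattern is the point: normal readings are rolled up into periodic summaries, and only out-of-range readings are forwarded upstream immediately.

```python
import statistics
import time
from dataclasses import dataclass


@dataclass
class Reading:
    sensor_id: str
    value: float      # e.g., temperature in degrees Celsius
    timestamp: float


class FogGateway:
    """Aggregates readings from nearby sensors and forwards only
    actionable data upstream, instead of streaming everything to the cloud."""

    def __init__(self, alert_threshold: float, window_size: int = 10):
        self.alert_threshold = alert_threshold
        self.window_size = window_size
        self.buffer: list[Reading] = []

    def ingest(self, reading: Reading) -> None:
        # An out-of-range reading is actionable: send it upstream immediately.
        if reading.value > self.alert_threshold:
            self.send_upstream({"type": "alert", "sensor": reading.sensor_id,
                                "value": reading.value, "ts": reading.timestamp})
            return
        # Otherwise buffer locally and forward only a periodic summary.
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush_summary()

    def flush_summary(self) -> None:
        # Collapse a window of routine readings into one small payload.
        values = [r.value for r in self.buffer]
        self.send_upstream({
            "type": "summary",
            "count": len(values),
            "mean": statistics.mean(values),
            "max": max(values),
        })
        self.buffer.clear()

    def send_upstream(self, payload: dict) -> None:
        # Stand-in for a real uplink (e.g., MQTT or HTTPS to the data center).
        print("-> cloud:", payload)


if __name__ == "__main__":
    gw = FogGateway(alert_threshold=80.0)
    for i in range(25):
        # Simulated sensor traffic: mostly routine values, one spike.
        value = 95.0 if i == 12 else 70.0 + (i % 5)
        gw.ingest(Reading(sensor_id="temp-01", value=value, timestamp=time.time()))
```

Run as written, the gateway turns 25 raw readings into two compact summaries plus a single alert, which is exactly the bandwidth-preserving behavior described above.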
While fog computing has the potential to free up bandwidth for more valuable data traffic, we must ensure that mission-critical systems, information, data storage, communications, and the framework of the system as a whole are sufficiently protected, resilient, and able to recover from a local or regional fault. Unlike a data center, where we protect the IT kit within the confines of a single building, fog computing and the burgeoning IoT will require us to identify key nodes and 'weak links' in a distributed system to ensure system and business continuity.
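One lightweight way to start finding those key nodes and weak links is to track how many downstream devices depend on each fog node and watch its heartbeats. The sketch below is purely illustrative; the node names, the dependency map, and the 60-second timeout are assumptions, not drawn from any specific deployment.

```python
import time

# Hypothetical inventory of fog nodes and the devices that depend on them.
# A node whose failure would orphan many dependents is a 'weak link'.
NODES = {
    "gateway-north": {"dependents": ["tl-001", "tl-002", "tl-003"],
                      "last_seen": time.time()},
    "gateway-south": {"dependents": ["tl-101"],
                      "last_seen": time.time() - 120},
}

HEARTBEAT_TIMEOUT = 60  # seconds without a heartbeat before a node is suspect


def weak_links(nodes: dict, min_dependents: int = 2) -> list[str]:
    """Flag nodes whose loss would take many downstream devices with them."""
    return [name for name, n in nodes.items()
            if len(n["dependents"]) >= min_dependents]


def stale_nodes(nodes: dict, now: float) -> list[str]:
    """Flag nodes that have missed their heartbeat window."""
    return [name for name, n in nodes.items()
            if now - n["last_seen"] > HEARTBEAT_TIMEOUT]


if __name__ == "__main__":
    print("Key nodes to harden:", weak_links(NODES))
    print("Nodes currently unreachable:", stale_nodes(NODES, time.time()))
```

Even a simple dependency count like this makes the continuity planning concrete: the nodes with the most dependents are the ones that need redundancy first.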
Think of it this way – when one traffic light is out, it's an inconvenience. When 100 traffic lights are out in a common area, it's a major disruption to the morning commute. But when 10,000 traffic lights are out, you have a regional emergency.
Do you think fog computing is the industry's answer to the deluge of data created by IoT?