In an era where industry is focused on gains from big data and analytics, organizations are making a more concerted effort to tap into edge-of-network insights derived from the variety of data generated by devices and sensors. The growing Field Service Management (FSM) software segment, for example, is driven by factors such as the use of cloud, analytics, and mobile technologies. Organizations are gathering every byte of information they can find from these sources and pushing it to the cloud, where the data is stored, processed, and analyzed to surface deeper trends and competitive advantages.
The Internet of Things (IoT) is expanding as more smart devices start communicating with each other, but frankly, machine-to-machine (M2M) data transfer is not new and has existed for quite a while. What is new is the pressing need to harness this data and put it to use in a larger context.
Despite its power and potential, the cloud computing model remains centralized, which makes it difficult to apply in environments where internet connectivity is poor or unstable and operations are time-critical. How can this be fixed? By adding fog computing, which extends the cloud model closer to the objects that produce and act on IoT data.
Latency-Free in the Fog
Fog computing solves the latency problem prevalent in cloud computing by keeping data as close to the ground as possible rather than sending it back and forth to a central cloud location. It can address the inadequacies of cloud-only models, which face serious challenges with latency, network bandwidth, geographic focus, reliability, and security. Fog computing is, in effect, the continuum of cloud toward the edge of the network. It also supports densely distributed data collection points, adding a fourth axis to the often-mentioned big data dimensions (volume, variety, and velocity).
But rather than replacing cloud, fog computing is likely to complement it through its ability to reduce the amount of data that must be sent to the cloud for processing. Fog will improve the cloud model as a complementary asset by taking on part of the burden that cloud servers currently handle alone. With fog pre-processing data at the edge, the cloud is freed to take care of heavier tasks, such as analyzing larger datasets of processed sensor inputs for longitudinal trends. The cloud remains useful as the environment where data from several fog computing sources, remote from each other, is analyzed together.
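To make the pre-processing idea concrete, here is a minimal sketch (not tied to any particular fog platform; the function name summarize_window is hypothetical) of how a fog node might reduce a window of raw sensor readings to a compact summary before forwarding it to the cloud:

```python
from statistics import mean

def summarize_window(readings):
    """Reduce a window of raw sensor readings to a compact summary.

    Forwarding only this small summary, instead of every raw reading,
    is how a fog node cuts the volume of data shipped to the cloud,
    leaving the cloud to analyze longer-term trends across summaries.
    """
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
    }

# 60 raw temperature samples collected at the edge over one minute
raw = [20.0 + (i % 5) * 0.1 for i in range(60)]

summary = summarize_window(raw)
print(summary)  # one small record goes upstream, not 60 readings
```

In a real deployment the summary record would be published to a cloud endpoint; the point of the sketch is simply the data reduction step that happens at the edge.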
Blue Hill believes that a combined model of cloud computing and fog computing is what enterprises truly need for the rapid adoption of IoT.
Use Cases for Fog Computing
One interesting use case comes from Airbus, which collects information on maintenance, spare parts, and aircraft performance at a regional and local level to optimize time on the ground. Other use cases include smart traffic light systems, which can change their signals based on surveillance of incoming traffic to prevent accidents or reduce congestion, with data also sent to the cloud for longer-term analytics.
Other cases discussed in industry include rail safety, power restoration from a smart grid network, and cybersecurity. There are also use cases with connected cars (for vehicle-to-vehicle and vehicle-to-cloud communication) and in smart city applications, such as intelligent lighting and smart parking meters. Fog computing is also used by New York-based renewable energy company Envision, which has obtained a 15 percent productivity improvement from the vast network of wind turbines it operates. The company processes as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut data analysis time from 10 minutes to mere seconds, providing the company with actionable insights and significant business benefits.
Fog computing can be leveraged to provide ample opportunities for creating new applications and services that cannot be easily supported by current host-based and cloud-based application platforms. In that light, a new industry group, the OpenFog Consortium, was formed last November to define and promote fog computing. The consortium, founded by ARM, Cisco, Dell, Intel, Microsoft and Princeton University, “seeks to create an architecture and approach to fog, edge, and distributed computing”. One year on, the Consortium’s 53 members in 15 countries are, through a series of working groups, developing an OpenFog architecture, addressing security issues, and planning industry testbeds.
How will this help the use of fog computing for IoT? The Consortium’s work “is centered around creating a framework for efficient & reliable networks and intelligent endpoints combined with identifiable, secure, and privacy-friendly information flows between clouds, endpoints, and services based on open standard technologies.” An open framework gives reference points for others in both implementation and development. Best practices and use cases are strong incentives for others to follow suit, much like what has happened with Bitcoin.
How Secure is Your Fog?
Security solutions exist for cloud computing, but due to the underlying differences between cloud computing and fog computing, such solutions may not suit fog computing devices at the edges of networks. This matters because IoT data is increasingly used for decisions affecting citizen safety and critical infrastructure. In such environments, fog computing devices face threats that do not arise in a well-managed cloud environment, and IoT data needs to be protected both in transit and at rest. This requires monitoring and automated response across the entire attack continuum: before, during, and after.
Fog computing can increase the benefits of IoT by accelerating response to events, eliminating a round trip to the cloud for analysis. It has the potential to avoid costly bandwidth additions by offloading traffic from the core network. Fog can also protect sensitive data (healthcare data, for example) that is gathered internally by analyzing it inside the local infrastructure rather than sending it offsite.
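A minimal sketch of that edge-side decision logic, assuming a hypothetical threshold policy and the made-up action name "shut_down_equipment", might look like this:

```python
def handle_reading(reading, threshold=75.0):
    """Decide at the fog node how to handle one sensor reading.

    Returns (local_action, forward_to_cloud). Acting at the edge
    avoids the round trip to the cloud for time-critical events,
    and forwarding only anomalous readings offloads the core network.
    """
    if reading > threshold:
        # Time-critical: act immediately at the edge, and forward
        # the anomaly upstream for longer-term cloud analytics.
        return ("shut_down_equipment", True)
    # Routine reading: handled (and kept) inside the local
    # infrastructure; nothing crosses the core network.
    return (None, False)

print(handle_reading(70.0))   # routine: no action, stays local
print(handle_reading(82.5))   # anomaly: edge action plus upload
```

The threshold and action are placeholders; the design point is that the time-critical branch completes without any cloud dependency.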
Blue Hill believes that organizations that adopt fog computing in combination with a cloud model have the potential to gain deeper and faster insights, leading to increased business agility, improved data safety, faster response times, and higher service levels.