How Will Fog Computing Help IoT Adoption in a Cloud Model?

In an era where industry is focused on the gains to be made from big data and analytics, organizations are making a more concerted effort to tap into insights derived at the edge of the network from the data generated by devices and sensors. The growing Field Service Management (FSM) software segment, for example, is driven by factors such as the use of cloud, analytics, and mobile technologies. Organizations are gathering every byte of information they can find from these sources and pushing it to the cloud, where the data is stored, processed, and analyzed to surface deeper trends and competitive benefits.

The Internet of Things (IoT) is expanding as more smart devices start communicating with each other, but frankly, machine-to-machine (M2M) data transfer is not new and has existed for quite a while. What is new is the driving need to harness this data urgently and put it to use in a larger context.

Despite its power and potential, the cloud computing model remains centralized, and it is difficult to apply in environments where internet connectivity is poor or unstable and operations are time-critical. How can this be fixed? By adding fog computing, which extends the cloud model closer to the objects that produce and act on IoT data.

Latency-Free in the Fog

Fog computing solves the latency problem prevalent in cloud computing by keeping data as close to the ground as possible rather than sending it back and forth through a central cloud location. In doing so, it can address the inadequacies of cloud-only models, which face serious challenges with latency, network bandwidth, geographic focus, reliability, and security. Fog computing is, in effect, the continuum of the cloud toward the edge of the network, and it supports densely distributed data collection points, adding a fourth axis to the often-mentioned big data dimensions (volume, variety, and velocity).

Rather than replacing the cloud, however, fog computing is likely to complement it by reducing the amount of data that must be sent to the cloud for processing. Fog improves the cloud model as a complementary asset, taking on part of the burden that cloud servers currently handle alone. By pre-processing data at the edge, fog frees the cloud to take care of heavier tasks, such as analyzing larger datasets of processed sensor inputs for longitudinal trends. The cloud remains the natural environment for analyzing data drawn from several fog computing sources that are remote from each other.
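
To make that division of labor concrete, here is a minimal Python sketch of the kind of pre-processing a fog node might perform before anything crosses the network. The endpoint URL, field names, and aggregation choices are illustrative assumptions, not any vendor's API.

    # A minimal sketch of fog-style pre-processing; sensor readings,
    # thresholds, and the cloud endpoint are invented for illustration.
    import json
    import statistics
    import urllib.request

    def summarize(readings):
        """Reduce raw sensor readings to a compact summary for the cloud."""
        return {
            "count": len(readings),
            "mean": statistics.mean(readings),
            "max": max(readings),
            "min": min(readings),
        }

    def forward_to_cloud(summary, url="https://cloud.example.com/ingest"):
        """Send only the aggregated summary upstream, not every raw reading."""
        req = urllib.request.Request(
            url,
            data=json.dumps(summary).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # would require a reachable endpoint

    # Raw telemetry stays at the fog node; the cloud sees a fraction of it.
    raw = [21.4, 21.6, 35.2, 21.5, 21.3]
    print(summarize(raw))

The design point is that raw telemetry never leaves the node; only the summary crosses the network, which is where the bandwidth and latency savings come from.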

Blue Hill believes that a combination of cloud computing and fog computing is what enterprises truly need for the rapid adoption of IoT.

Use Cases for Fog Computing

One interesting use case comes from Airbus, which collects information on maintenance, spare parts, and aircraft performance at a regional and local level to optimize time on the ground. Another is smart traffic light systems, which can change their signals based on surveillance of incoming traffic to prevent accidents or reduce congestion, while the same data can also be sent to the cloud for longer-term analytics, as the sketch below illustrates.
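
As a hedged illustration of that traffic-light scenario, the following sketch shows a fog node adjusting signal timing locally while batching counts for a later cloud upload; the threshold and timing values are invented for the example.

    # Illustrative only: a fog node beside the intersection decides signal
    # timing locally and queues aggregate counts for later cloud upload.
    import time

    GREEN_EXTENSION_THRESHOLD = 10  # vehicles; an assumed tuning value

    def adjust_signal(vehicle_count, base_green_seconds=30):
        """Extend the green phase when local sensors detect heavy traffic."""
        if vehicle_count > GREEN_EXTENSION_THRESHOLD:
            return base_green_seconds + 15  # immediate, no cloud round trip
        return base_green_seconds

    hourly_counts = []  # batched and sent to the cloud for long-term analytics

    for count in [4, 12, 8]:  # simulated sensor readings
        green = adjust_signal(count)
        hourly_counts.append({"ts": time.time(), "vehicles": count, "green_s": green})
        print(f"{count} vehicles -> green for {green}s")

The time-critical decision happens at the intersection; only the accumulated counts travel to the cloud for trend analysis.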

Other cases discussed in industry include rail safety; power restoration from a smart grid network; and cybersecurity. There are also use cases in connected cars (for vehicle-to-vehicle and vehicle-to-cloud communication) and in smart city applications such as intelligent lighting and smart parking meters. Fog computing is also used by New York-based renewable energy company Envision, which has obtained a 15 percent productivity improvement from the vast network of wind turbines it operates. The company processes as much as 20 terabytes of data at a time, generated by 3 million sensors installed on the 20,000 turbines it manages. Moving computation to the edge has enabled Envision to cut data analysis time from 10 minutes to mere seconds, providing actionable insights and significant business benefits.
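
The Envision numbers suggest why edge processing wins on time-to-insight: analysis becomes incremental rather than batch. The sketch below illustrates that idea with a rolling-window anomaly check per turbine; the window size, threshold, and readings are invented for illustration and are not Envision's actual method.

    # A sketch of incremental edge analytics: flag outliers per turbine in
    # seconds locally, instead of batching terabytes to the cloud first.
    from collections import deque

    class EdgeTurbineMonitor:
        """Keep a short rolling window per sensor and flag outliers locally."""
        def __init__(self, window=60):
            self.window = deque(maxlen=window)

        def ingest(self, rpm):
            self.window.append(rpm)
            mean = sum(self.window) / len(self.window)
            # Only flagged events need to travel upstream to the cloud.
            return abs(rpm - mean) > 0.2 * mean

    monitor = EdgeTurbineMonitor()
    for rpm in [14.0, 14.2, 13.9, 22.5, 14.1]:
        if monitor.ingest(rpm):
            print(f"anomaly: {rpm} rpm flagged for cloud follow-up")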

Open Fog?

Fog computing can be leveraged to create new applications and services that cannot easily be supported by current host-based and cloud-based application platforms. In that light, a new industry group, the OpenFog Consortium, was formed last November to define and promote fog computing. The consortium, founded by ARM, Cisco, Dell, Intel, Microsoft, and Princeton University, “seeks to create an architecture and approach to fog, edge, and distributed computing”. Now, one year on, the Consortium's 53 members in 15 countries are working, through a series of working groups, on developing an OpenFog architecture, addressing security issues, and planning industry testbeds.

OpenFog Key Pillars

How will this help the use of fog computing for IoT? The Consortium's work “is centered around creating a framework for efficient & reliable networks and intelligent endpoints combined with identifiable, secure, and privacy-friendly information flows between clouds, endpoints, and services based on open standard technologies.” An open framework gives reference points for others in both implementation and development. Best practices and use cases are great incentives for others to follow suit, very much like what has happened with Bitcoin.

How Secure is Your Fog?

Security solutions exist for cloud computing, but due to the underlying differences between cloud computing and fog computing, those solutions may not suit fog devices sitting at the edges of networks. This matters because IoT data is increasingly used for decisions affecting citizen safety and critical infrastructure. In such environments, fog computing devices face threats that do not arise in a well-managed cloud environment, and IoT data needs to be protected both in transit and at rest. This requires monitoring and automated response across the entire attack continuum: before, during, and after an attack.
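
As a rough sketch of those two protections, the example below encrypts a reading before it touches the fog node's local storage and prepares a certificate-verifying TLS context for forwarding data upstream. It assumes the third-party Python cryptography package and simplifies key management far beyond what production requires.

    # A minimal sketch of protecting fog data at rest and in transit;
    # assumes 'pip install cryptography' and is not production key handling.
    import ssl
    from cryptography.fernet import Fernet

    # At rest: encrypt readings before they are written to local disk.
    key = Fernet.generate_key()          # in practice, from a secure key store
    cipher = Fernet(key)
    token = cipher.encrypt(b'{"sensor": "t-17", "temp": 21.4}')
    assert cipher.decrypt(token) == b'{"sensor": "t-17", "temp": 21.4}'

    # In transit: only ever forward data over a verified TLS connection.
    context = ssl.create_default_context()   # verifies server certificates
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    print("encrypted record:", token[:24], "...")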

Concluding Thoughts

Fog computing can increase the benefits of IoT by accelerating the response to events and eliminating a round trip to the cloud for analysis. It has the potential to avoid costly bandwidth additions by offloading traffic from the core network. And when sensitive data (in healthcare, for example) is gathered internally, fog can protect it by analyzing it inside the organization's own infrastructure rather than sending it outside.

Blue Hill believes that organizations that adopt fog computing in combination with a cloud model have the potential to gain deeper and faster insights, leading to increased business agility, improved data safety, faster response times, and higher service levels.

About Dr. Alea Fairchild

Dr. Alea Fairchild is an Entrepreneur-in-Residence at Blue Hill Research. As a technology commentator, she has a broad presence in both traditional media and online. Alea covers the convergence of technology in the cloud, mobile, and social spaces, and helps global enterprises understand the competitive marketplace and profit from digital process redesign. She has expertise in the following industries: industrial automation, computer/networking, telecom, financial services, media, transport logistics, and manufacturing. Her clients include commercial, government/public sector, NGO, and trade association organizations. Dr. Fairchild received her Ph.D. in Applied Economics from Limburgs Universitair Centrum (now Univ. Hasselt) in Belgium, in the area of banking and technology. She has a Master's degree in International Management from Boston University/Vrije Universiteit Brussel, Brussels, Belgium, and a Bachelor's degree in Business Management and Marketing from Cornell University, Ithaca, New York. She is a masters Olympic weightlifter for Belgium, having won many international medals.
Posted on December 9, 2016 by Dr. Alea Fairchild
