January 4, 2017

No tiptoeing around it: Fog is roaring in

by Lynne Canavan, Executive Director, OpenFog Consortium.

The fog comes on little cat feet.  It sits looking over harbor and city on silent haunches and then moves on.  – Carl Sandburg, 1916

One hundred years later, fog is making a much more dramatic and lasting impression in today's digital world.  BI Intelligence forecasts that 5.8 billion IoT devices owned by enterprises and governments will use fog computing in 2020, up from 570 million devices in 2015.  Driven by a specific set of high-velocity digital business problems and growth opportunities, fog computing isn't tiptoeing in.  It's coming at us in a full-speed sprint.  For those involved in the Internet of Things (IoT), 5G, artificial intelligence and virtual reality, fog computing isn't moving on.  It's here, and it's now.

Fog computing distributes compute, communication, control, storage and decision making closer to where the data originates, enabling dramatically faster processing and lower network costs.  At OpenFog, we describe it as bridging the gap in the cloud-to-thing continuum.  To illustrate why it's necessary, let's take a look at two scenarios.

Fog computing is often touted for its ability to slash latency.  Autonomous vehicles, emergency responsiveness, drones and virtual reality are among the dozens of applications that require sub-millisecond reaction time.  In cloud-only environments, there are latency, mobility, geographic, network bandwidth, reliability, security and privacy challenges.  A drone, for example, can travel at 100 miles per hour, or roughly 147 feet per second.  During its journey, it requires continuous software updates and produces massive amounts of data that require computation and communication.  If you consider that the best cloud round-trip latency is around 80 milliseconds, the drone would fly about 12 feet between cloud network messages.  Fog nodes can reduce the latency to such a degree that a drone will only travel about two inches before the next update is delivered.
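The latency arithmetic above is easy to check.  As a minimal sketch (the drone speed and 80 ms cloud round trip are the figures quoted above; the roughly 1 ms fog round trip is an assumption chosen to match the "two inches" claim):

```python
# Back-of-the-envelope check of the drone latency figures.
MPH_TO_FPS = 5280 / 3600  # convert miles per hour to feet per second

def distance_traveled_ft(speed_mph, latency_s):
    """Feet a vehicle covers while waiting one network round trip."""
    return speed_mph * MPH_TO_FPS * latency_s

cloud_ft = distance_traveled_ft(100, 0.080)      # best-case cloud: 80 ms
fog_in = distance_traveled_ft(100, 0.001) * 12   # assumed fog: ~1 ms

print(f"cloud round trip: {cloud_ft:.1f} ft")  # ~11.7 ft, i.e. "about 12 feet"
print(f"fog round trip:   {fog_in:.1f} in")    # ~1.8 in, i.e. "about two inches"
```

At 100 mph the distance covered scales linearly with latency, so every millisecond shaved off the round trip buys the drone almost two inches of reaction distance.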

Fog also supports data-intense, remote operations.  In oil and gas exploration, real-time subsurface imaging and monitoring reduces the drilling of exploratory wells, saving money and minimizing environmental damage.  Thousands of seismic sensors generate the high-resolution imaging required to discover risks and opportunities.  Fog computing manages the energy, bandwidth and computing needed for timely risk and opportunity analysis in a geographically challenged, disruption-prone and data-intensive process.  Instead of collecting data in the cloud for post-processing, fog nodes form mesh networks that stream data processing tasks and communicate with each other to compute the subsurface image within the network.  The fog computing algorithm is resilient to network disruption and adapts to changes in energy and bandwidth.

Fog computing requires rapid, trusted and secure transmissions, based on an open, interoperable architecture.  And that's the work of the OpenFog Consortium.  Our technical teams are finalizing the OpenFog Reference Architecture, to be published in February 2017.  This is a unified framework for providing computing, networking and storage in the cloud-to-thing continuum, and is an important step in creating a common language for fog computing.  There's more – much more – to come on this, as we do a deep dive into the eight foundational pillars: security, scalability, openness, autonomy, RAS (reliability, availability and serviceability), agility, hierarchy, and programmability.  From the silicon layer through to the operating system, OpenFog members are defining and testing functional and component-level interoperability for fog-to-fog communication, by applying the architecture to specific use cases.


Get ready for the fog to roar its way into today’s connected everything world.

About the author:  Lynne Canavan is the Executive Director for the OpenFog Consortium, the global organization founded by ARM, Cisco, Dell, Intel, Microsoft and Princeton University Edge Laboratory in November 2015.  Previously, she was Vice President, Program Management for the Industrial Internet Consortium.  Lynne spent 17 years at IBM in its Global Alliances division and before that ran her own retail technology marketing firm.  She can be reached at
