Fog Computing Aims to Reduce Processing Burden of Cloud Systems

Fog computing brings data processing, networking, storage and analytics closer to the devices and applications operating at the network's edge.


Cisco Systems officials have spent much of the year talking about extending the cloud with "the fog."

In January, the networking giant introduced its fog computing vision, the idea of bringing cloud computing capabilities—from applications and data analytics to processing, networking and storage—to the edge of the network, closer to the rapidly growing number of user devices that are consuming cloud services and generating the increasingly massive amount of data.

Cisco officials also unveiled the company's IOx platform, designed to bring distributed computing to the network edge.

Earlier this month, the vendor rolled out the next phase of IOx, which included adding to the list of Cisco products that support the platform and the list of partners involved.

The billions of mobile devices—such as smartphones and tablets—already being used to generate, receive and send data make a case for putting computing capabilities closer to where these devices are located, rather than having all information flow back over networks to central data centers. The expected growth of the Internet of Things (IoT) only strengthens the argument.

Cloud adoption will only grow in the future, according to Todd Baker, head of IOx product management at Cisco. As it does, the fog's importance will grow as well.

"We really see the cloud and fog as complementary to each other," Baker told eWEEK.

To be sure, Cisco may have coined the term "fog computing," but the company is far from the only tech vendor embracing the idea of putting compute capabilities out to the infrastructure edge. A broad array of industry players—from EMC and VMware to Intel, IBM and Extreme Networks—are rolling out products designed for what many vendors and analysts are calling edge computing.

For example, Extreme Networks this month unveiled its new Summit X460-G2 Gigabit fixed switch and IdentiFi Access Point 3805, both designed to work with the company's NetSight management and Purview analytics platforms as part of its larger Unified Edge portfolio. Days later, Freescale introduced its new ARM-based quad-core QorIQ LS1043A communications processor, a high-performance and energy-efficient system-on-a-chip (SoC) designed for systems at the network's edge.

Another sign of the growing awareness around edge computing is the Fog Computing Conference, held Nov. 19 and 20 in San Jose, Calif. Along with Cisco, vendors that appeared at the show ranged from top-tier tech companies like SAP, Broadcom and IBM to smaller players like Axeda, Aeris and 6fusion to industry organizations like the Linux Foundation. Sponsors included Plat.One and Wi-Next.

All this technology is going to be needed, according to Vernon Turner, a senior vice president at IDC whose research areas include enterprise infrastructure, networks and the IoT.

"We clearly see data and content being created at the edge of the network via the Digital Universe (i.e. if it can create information, it will—be it a human, a car, a house, a factory... sensors)," Turner said in an email to eWEEK. "Quite a lot of this content won’t be sent over the network to be processed by the 'enterprise-based' cloud infrastructure. Rather, you will need cloud computing-like processing at the edge," he said.

"In summary, this is a big deal," said Turner.

Cisco estimates that there currently are 25 billion connected devices worldwide, a number that could jump to 50 billion by 2020. And these smart devices are generating a lot of data, according to the company.

The 46 million smart meters in the United States today generate 1.1 billion data points, while a single consumer packaged goods manufacturing machine generates 13 billion data samples a day. A commercial airliner creates 10 TB of data for every 30 minutes of flight time (and there are more than 25,000 flights per day).
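The airline figure alone illustrates why shipping every byte back to a central cloud is impractical. A rough back-of-envelope calculation—using an assumed average flight duration of two hours, which is not stated in the article—gives a sense of the aggregate daily volume:

```python
# Back-of-envelope estimate of daily commercial-aviation data volume,
# based on the figures cited above (10 TB per 30 minutes of flight,
# 25,000+ flights per day). The 2-hour average flight duration is an
# illustrative assumption, not a figure from the article.

TB_PER_HALF_HOUR = 10          # data generated per 30 minutes of flight
FLIGHTS_PER_DAY = 25_000       # lower bound cited by Cisco
ASSUMED_FLIGHT_HOURS = 2       # hypothetical average flight length

half_hour_segments = ASSUMED_FLIGHT_HOURS * 2
daily_tb = FLIGHTS_PER_DAY * half_hour_segments * TB_PER_HALF_HOUR
daily_eb = daily_tb / 1_000_000  # decimal exabytes (1 EB = 1,000,000 TB)

print(f"Estimated daily volume: {daily_tb:,} TB (~{daily_eb:.1f} EB)")
```

Under these assumptions, commercial aviation alone would produce on the order of an exabyte of raw data per day—far more than it makes sense to backhaul to centralized data centers, which is precisely the case fog proponents are making for processing data near its source.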