Fog Computing Aims to Reduce Processing Burden of Cloud Systems
In all, more than 2 exabytes of data are generated worldwide every day, according to Cisco, and that figure will only grow as the number of smart connected devices and systems increases. That presents a host of challenges for enterprises and tech vendors. Moving all of that data from the devices to a central cloud data center would consume costly bandwidth. Latency also becomes an issue: some applications demand immediate response, and consumers increasingly expect rapid service delivery.

"For some of the real-time apps that are emerging—in manufacturing, in vehicle-to-vehicle communications—the [speed] requirements are much more stringent," Mark Hung, research vice president at Gartner, told eWEEK, pointing to accident-avoidance technologies in cars as an example. "They want to minimize delays in communication."

Enterprises also want to derive as much useful business knowledge as quickly as possible from all the data being collected. They don't want to wait for the data to be sent from the device to a central data center, analyzed there, and the results sent back. They want the data from the sensors to be collected, brought together, analyzed as it comes into the network and managed at the edge.

Bringing those computing capabilities, from storage to processing to development, closer to the infrastructure edge can address the bulk of the issues around cost, latency and big data analytics. "Building out a computing infrastructure to support such a large number of devices falls very much into cloud computing attributes—you need scale, speed and cost management," IDC's Turner said. "To that end, 'fog' computing gives edge processing in an IoT environment the same benefits as public/private cloud computing."

Businesses are beginning to understand the trend.
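The edge-side analysis described above can be illustrated with a minimal sketch. The function and data here are hypothetical, not from any vendor's product: a fog node reduces a window of raw sensor readings to a compact summary, so only the summary, not every reading, has to cross the network to the cloud.

```python
# Hypothetical sketch of edge aggregation: reduce raw readings to a
# small summary before anything leaves the fog node. All names and
# values are illustrative.
from statistics import mean

def summarize_window(readings):
    """Collapse a window of raw readings into a four-field summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": mean(readings),
    }

# A minute of once-per-second temperature readings (60 raw values).
raw = [20.0 + 0.05 * i for i in range(60)]
summary = summarize_window(raw)

# 60 raw values shrink to one small summary dict at the edge.
print(summary)
```

The same pattern scales to whatever statistics the application actually needs; the point is that the reduction happens before the data is sent, which is where the bandwidth and latency savings come from.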
According to a survey of 800 data center professionals conducted this year by Emerson Network Power, 67 percent of respondents expect that by 2025, 60 percent of computing will be done in the cloud. Data centers will be smaller, and Internet switching centers will move closer to end users, according to respondents: 48 percent said they will be as close as the same city, another 31 percent said the neighborhood and 21 percent said the block.

"While the findings suggest enterprise data centers will shrink in size, the overall load of these enterprises will continue to grow," Emerson officials said in the report. "Some may decide to shift their load to larger co-location facilities, while others may place nodes on the thickening edge of the network."

Adam Burns, product marketing director for Intel's IoT Group, said the chip maker is seeing interest in its IoT gateway systems from a range of industries, including transportation, where gateways address such issues as saving fuel; smart homes and buildings; industrial; and retail, where businesses can locally combine, manage and analyze data from such sources as digital signage, RFID readers and point-of-sale systems.

Intel, along with its McAfee security and Wind River software businesses, has created its Gateway Solutions for the Internet of Things program, through which they work with an array of original design manufacturers (ODMs) to build pre-integrated, pre-validated intelligent gateway appliances that sit at the network edge and manage data traffic between the end devices and systems (including legacy machines) and the central cloud environment. The gateways also help decide which data can be kept locally and which needs to be sent to the cloud, Burns told eWEEK. Some sensors collect data every few seconds, he noted; it would make little sense to send all of that data back to the cloud. Gateways such as those created through Intel's program can decide how much of that data is sent to the cloud.
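One common way a gateway can decide which readings are worth forwarding, sketched here with hypothetical names and thresholds rather than any actual Intel gateway logic, is a simple deadband filter: a value goes to the cloud only if it differs enough from the last value that was forwarded.

```python
# Hypothetical sketch of a gateway forwarding rule (deadband filter):
# forward a reading only when it has changed by at least `deadband`
# since the last forwarded reading. Names and values are illustrative.

def filter_for_cloud(readings, deadband=0.5):
    """Return the subset of readings worth sending to the cloud."""
    kept = []
    last = None
    for value in readings:
        if last is None or abs(value - last) >= deadband:
            kept.append(value)
            last = value
    return kept

# Seven raw sensor readings arrive at the gateway...
readings = [20.0, 20.1, 20.2, 20.9, 21.0, 21.6, 21.7]
to_cloud = filter_for_cloud(readings)

# ...but only the readings that changed meaningfully leave the edge.
print(to_cloud)
```

Tuning the deadband is the trade-off Burns describes: a wider band keeps more traffic off the network, a narrower one gives the cloud a more faithful picture.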
"It's a lot of data," Cisco's Baker said. "The problem with all that data is that it's just data, and data by itself is not useful. What's useful is information."