Cisco Systems officials have spent much of the year talking about extending the cloud with “the fog.”
In January, the networking giant introduced its fog computing vision, the idea of bringing cloud computing capabilities—from applications and data analytics to processing, networking and storage—to the edge of the network, closer to the rapidly growing number of user devices that consume cloud services and generate increasingly massive amounts of data.
Cisco officials also unveiled the company’s IOx platform, designed to bring distributed computing to the network edge.
Earlier this month, the vendor rolled out the next phase of IOx, which included adding to the list of Cisco products that support the platform and the list of partners involved.
The billions of mobile devices—such as smartphones and tablets—already being used to generate, receive and send data make a case for putting the computing capabilities closer to where these devices are located, rather than having all information flowing back over networks to central data centers. The expected growth of the Internet of things (IoT) only strengthens the argument.
Cloud adoption will only grow in the future, according to Todd Baker, head of IOx product management at Cisco. As it does, the fog’s importance will grow as well.
“We really see the cloud and fog as complementary to each other,” Baker told eWEEK.
To be sure, Cisco may have coined the term “fog computing,” but the company is far from the only tech vendor embracing the idea of putting compute capabilities out to the infrastructure edge. A broad array of industry players—from EMC and VMware to Intel, IBM and Extreme Networks—are rolling out products designed for what many vendors and analysts are calling edge computing.
For example, Extreme Networks this month unveiled its new Summit X460-G2 Gigabit fixed switch and IdentiFi Access Point 3805, both designed to work with the company’s NetSight management and Purview analytics platforms as part of its larger Unified Edge portfolio. Days later, Freescale introduced its new ARM-based quad-core QorIQ LS1043A communications processor, a high-performance and energy-efficient system-on-a-chip (SoC) designed for systems at the network’s edge.
Another sign of the growing awareness around edge computing is the Fog Computing Conference, held Nov. 19 and 20 in San Jose, Calif. Along with Cisco, other vendors that appeared at the show ranged from top-tier tech companies like SAP, Broadcom, IBM and Salesforce.com to smaller players like Axeda, Aeris and 6fusion to industry organizations like the Linux Foundation. Sponsors included Plat.One and Wi-Next.
All this technology is going to be needed, according to Vernon Turner, a senior vice president at IDC whose research areas include enterprise infrastructure, networks and the IoT.
“We clearly see data and content being created at the edge of the network via the Digital Universe (i.e. if it can create information, it will—be it a human, a car, a house, a factory… sensors),” Turner said in an email to eWEEK. “Quite a lot of this content won’t be sent over the network to be processed by the ‘enterprise-based’ cloud infrastructure. Rather, you will need cloud computing-like processing at the edge,” he said.
“In summary–this is a big deal,” said Turner.
Cisco estimates that there currently are 25 billion connected devices worldwide, a number that could jump to 50 billion by 2020. And these smart devices are generating a lot of data, according to the company.
The 46 million smart meters in the United States today generate 1.1 billion data points, while a single consumer packaged goods manufacturing machine generates 13 billion data samples a day. A commercial airliner creates 10 TB of data for every 30 minutes of flight time (and there are more than 25,000 flights per day).
Fog Computing Aims to Reduce Processing Burden of Cloud Systems
In all, more than 2 exabytes of data are generated worldwide every day, according to Cisco. And that will only grow as the number of smart connected devices and systems increases.
That represents a host of challenges for enterprises and tech vendors. Using all of that expensive bandwidth to run the data from the devices to a central cloud data center would be costly. Latency also becomes an issue. Some applications demand immediate response, and consumers increasingly are looking for rapid service delivery.
“For some of the real-time apps that are emerging—in manufacturing, in vehicle-to-vehicle communications—the [speed] requirements are much more stringent,” Mark Hung, research vice president at Gartner, told eWEEK, pointing to accident avoidance technologies in cars as an example. “They want to minimize delays in communication.”
Enterprises also want to derive as much useful business knowledge as quickly as possible from all the data that’s being collected. They don’t want to wait for the data to be sent from the device to a central data center, where it’s analyzed before the results are sent back. They want the data from the sensors to be collected, brought together, analyzed as it comes into the network and managed at the edge.
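The collect-aggregate-analyze-at-the-edge pattern described above can be sketched with a simple running aggregator that updates statistics as each reading arrives, so results are available locally without shipping raw data to a central data center. This is an illustrative sketch, not any vendor's product:

```python
class EdgeAggregator:
    """Maintain running statistics as sensor readings arrive,
    so analysis happens at the edge rather than in a central
    data center. (Illustrative sketch only.)"""

    def __init__(self):
        self.count = 0
        self.total = 0.0
        self.maximum = float("-inf")

    def update(self, value):
        # Each reading is folded into the aggregate as it comes in.
        self.count += 1
        self.total += value
        self.maximum = max(self.maximum, value)

    @property
    def mean(self):
        return self.total / self.count if self.count else 0.0


# Readings are analyzed as they enter the network, not batched later.
agg = EdgeAggregator()
for reading in [10.0, 12.0, 14.0]:
    agg.update(reading)
```

Only the small aggregate, rather than every raw reading, would then need to travel upstream.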
“It’s a lot of data,” Cisco’s Baker said. “The problem with all that data is that it’s just data, and data by itself is not useful. What’s useful is information.”
Bringing those computing capabilities—from storage to processing to development—closer to the infrastructure edge can address the bulk of the issues around cost, latency and big data analytics.
“Building out a computing infrastructure to support such a large number of devices falls very much into cloud computing attributes—you need scale, speed and cost management,” IDC’s Turner said. “To that end, ‘fog’ computing gives edge processing in an IoT environment the same benefits as public/private cloud computing.”
Businesses are beginning to understand the trend. According to a survey of 800 data center professionals done this year by Emerson Network Power, 67 percent of respondents expect that by 2025, 60 percent of computing will be done in the cloud. Data centers will be smaller, and Internet switching centers will move closer to the end users, according to respondents. Forty-eight percent said they will be as close as the same city, while another 31 percent said the neighborhood and 21 percent said the block.
“While the findings suggest enterprise data centers will shrink in size, the overall load of these enterprises will continue to grow,” Emerson officials said in the report. “Some may decide to shift their load to larger co-location facilities, while others may place nodes on the thickening edge of the network.”
Adam Burns, product marketing director for Intel’s IoT Group, said the chip maker is seeing interest in its IoT gateway systems from a range of industries: transportation, where the issues include saving fuel; smart homes and buildings; and industrial and retail, where businesses can locally combine, manage and analyze data from such sources as digital signage, RFID readers and point-of-sale systems.
Intel, along with its McAfee security and Wind River software businesses, has created its Gateway Solutions for the Internet of Things, where they work with an array of original design manufacturers (ODMs) to build pre-integrated and pre-validated intelligent gateway appliances that are placed at the network edge and manage data traffic between the end devices and systems—including legacy machines—and the central cloud environment.
They also help in deciding what data can be kept locally, and which needs to be sent to the cloud, Burns told eWEEK. Some sensors collect data every few seconds, he noted. It would make little sense to send all of that data back to the cloud. Gateways such as those created via Intel’s program can be used to decide how much of that data is sent to the cloud.
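The triage Burns describes—keeping most raw samples local and forwarding only what the cloud needs—can be sketched as a small filtering function. The function name, summary fields and threshold here are hypothetical, not Intel's gateway API:

```python
from statistics import mean

def summarize_readings(readings, anomaly_threshold):
    """Aggregate raw sensor readings at the gateway; forward only
    a compact summary plus anomalous values to the cloud.
    (Hypothetical sketch, not Intel's actual gateway API.)"""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
        # Only these raw values leave the edge.
        "anomalies": anomalies,
    }


# A window of frequent samples collapses to one small record;
# the outlier (95.2) is preserved for the cloud to inspect.
window = [20.1, 20.3, 19.9, 20.0, 95.2, 20.2]
payload = summarize_readings(window, anomaly_threshold=50.0)
```

A sensor sampling every few seconds produces thousands of readings a day; a gateway applying this kind of policy might send only a handful of summary records upstream.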
“You can’t send all of that data over expensive networks … and you don’t want to store it in the cloud in perpetuity,” Burns said.
Cisco is aggressively building out its fog computing efforts and IOx platform. It not only increased the number of Cisco products that will support the platform to 16—from switches to routers to IP cameras—but also expanded the list of partners for the ecosystem around it.
More than a dozen companies have signed on, including IBM, SAP, Xerox, Siemens, General Electric and Honeywell. In addition, the latest version of the platform makes application integration and management 10 times easier, according to Cisco’s Baker.
Through IOx and APIs, Cisco is putting such computing capabilities as business intelligence and analytics, compute, storage, network devices and control applications in the fog, between the devices and the cloud. In presentations, Cisco officials cite several examples where having these capabilities closer to the devices is proving beneficial.
With railway systems, sensors can detect and immediately respond to equipment failures and monitor the health of the trains in real time. In oil and gas fields, they can proactively monitor pipelines for problems, while in cities, fog computing can help manage traffic congestion.
Bit Stew is a 9-year-old company that offers its Grid Director product to the utility industry. Grid Director gives customers an in-depth and real-time view of their smart grids, offering such capabilities as real-time analytics and dynamic event management.
The company—which is a Cisco partner and the recipient of Cisco funding—is working with several utilities in both the United States and Canada, including BC Hydro, whose service area in British Columbia is about the size of California, Oregon and Washington combined, according to Bit Stew CTO Kai Hui.
BC Hydro has about 1.8 million smart meters in the field that all send back a wide range of information, Hui told eWEEK. The meters can, for example, sense when power goes down somewhere in the grid and send a message to the control center. However, when that happens, officials in the control center must then find out whether the affected area was part of a work order, in which case the alert can be ignored, or whether it is something new that needs to be addressed. Such tasks are time-consuming and not scalable, the CTO said.
That should change next year, when BC Hydro fully embraces IPv6 and can take advantage of fog computing capabilities offered through Cisco and Bit Stew, Hui said. For example, the utility will be able to add an application and install a policy that tells a sensor that if an outage is tied to a work order, there’s no need for an alert.
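The kind of edge policy Hui describes—suppressing alerts for outages already explained by scheduled work—can be sketched as a simple check run at the sensor or router rather than in the control center. The function and data shapes here are hypothetical, not Bit Stew's or Cisco's actual API:

```python
def should_alert(outage_meter_id, work_orders):
    """Edge policy: raise an alert only when an outage is not
    already covered by a scheduled work order. (Hypothetical
    sketch, not Bit Stew's or Cisco's actual API.)"""
    for order in work_orders:
        if outage_meter_id in order["meter_ids"]:
            return False  # Planned work: suppress the alert locally.
    return True


open_orders = [{"id": "WO-17", "meter_ids": {"M-100", "M-101"}}]
planned = should_alert("M-100", open_orders)    # covered by WO-17
unplanned = should_alert("M-200", open_orders)  # genuinely new outage
```

Running the check at the edge means only genuinely unexplained outages reach control-center staff, rather than every meter message.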
In addition—now that Bit Stew can integrate Grid Director into Cisco’s connected grid routers in the network—it can bring the compute capabilities it offers in the utility’s data center (such as analytics and processing) to the edge and better address the concerns of utilities around grid resilience and security.
BC Hydro has about 3,000 connected grid routers in its environment, which Hui said means 3,000 points of fog computing potential.
“It’s pretty powerful,” he said.
Most of the utilities Bit Stew works with are eager to move into edge computing, the CTO said. They see the benefits of better security and resiliency, reduced latency and lower costs.
“The utility industry understands the value of fog and edge computing, but I don’t think they’re all there yet,” Hui said, noting that the industry is ready to adopt distributed monitoring but, for security reasons, more reluctant to embrace distributed control. “It’s a cultural and policy thing. They want to ensure 110 percent security before allowing distributed control. … Eventually they will get there.”