The Internet of Things generates a staggering amount of data. How much? Cisco Systems projects that the total data generated by all people, machines and things will reach 600 zettabytes by 2020. Compared with total internet traffic, which crossed the 1-zettabyte threshold at the end of 2016, IoT data is becoming a beast.
How can a company wrangle value out of all the data traversing its IoT ecosystem?
Companies will need to find the appropriate spot in their IoT workflows to process data in order to reap value from it. As BizTech has explained, organizations have three options for data processing beyond the data center: at the edge, in the fog or in the cloud. How your organization architects its compute environment will ultimately make or break your IoT initiative.
What IoT Data Do You Need?
Organizations need to lay out a vision for their IoT deployments to determine the data they want to extract. “Always start with a problem statement — what in your operations do you need to fix?” suggests Rob Enderle, owner and principal analyst of the Enderle Group. “It’s going to be different for every company. Pick the IoT path that best solves that problem.”
To get at the problem, some organizations are turning to Six Sigma methodology, developed by Bill Smith and Mikel Harry in 1986 while they were working at Motorola. Six Sigma is an approach to improving the overall quality of the output of a process by identifying and removing problems and defects.
“You need to figure out what the key data is,” explains Chet Hullum, general manager of industrial solutions at Intel. “Where can you get the biggest bang, and where are the biggest bottlenecks that are impacting your business? Matching Six Sigma ideas to your IoT deployment, you can see where you should put your compute power and where you will need more horizontal computing.”
Evaluate the Existing IT Environment
As part of this problem-solving process aimed at designing an architecture, organizations need to consider their current environment. There is often legacy equipment in use that is not IP-compatible and thus not IoT-ready.
“The market is pushing manufacturers to take their data and turn it into action. At Intel, we end up working in a lot of ‘brownfield’ environments,” says Hullum. “These are legacy facilities that have a good process in place. We work to figure out how to integrate into that environment.”
Integrating legacy operational technology and information technology is a heavy lift. OT, Gartner notes, “is hardware and software that detects or causes a change through the direct monitoring and/or control of physical devices, processes and events in the enterprise.”
As Link Simpson, the IoT and digital transformation practice lead at CDW, points out, this integration also extends to the professionals who work within these once-separate environments. Roles and responsibilities will be affected by a move to IoT, so smart change management within the organization is key to a successful deployment.
Match Data to the Right Compute Channel: Edge or Cloud
As the architecture is mapped out, understanding the strengths and weaknesses of each approach to IoT computing is important. “Edge computing is for when you need decision-making at the edge, right away,” explains Helder Antunes, senior director of Strategic Innovation Group at Cisco. “Think of it as a miniature data center at the edge. This can be expensive, but good for the right situation.”
How can edge computing be used in the real world? “Pattern matching engines are a good example of an edge computing use case,” suggests Hullum. “Think about machine vision control. If I’m doing quality control on a production line, I’m looking for a defect, using video and compute power to find problems. If I can do compute and pattern matching at the edge, I’m onto something. It’s constant learning.”
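To make Hullum's machine-vision example concrete, here is a minimal sketch of edge-side pattern matching for quality control. It is illustrative only: a real deployment would run a trained vision model, while this hypothetical defect score simply flags frames whose pixel intensities stray from a known-good baseline, so a part can be rejected locally without a round trip to the cloud.

```python
# Assumed, illustrative parameters -- not from any real production system.
BASELINE_MEAN = 128        # expected mean intensity of a good part
DEVIATION_LIMIT = 40       # per-pixel deviation tolerated before counting as defective
DEFECT_RATIO = 0.05        # reject the part if more than 5% of pixels are out of tolerance

def inspect_frame(pixels):
    """Return ('reject', score) or ('pass', score) for one grayscale frame."""
    defective = sum(1 for p in pixels if abs(p - BASELINE_MEAN) > DEVIATION_LIMIT)
    score = defective / len(pixels)
    return ("reject" if score > DEFECT_RATio else "pass", score) if False else \
           ("reject" if score > DEFECT_RATIO else "pass", score)

# The decision happens at the edge; only rejected frames (and their scores)
# need to travel upstream for audit or retraining.
good = [130] * 95 + [125] * 5
bad = [130] * 90 + [255] * 10
print(inspect_frame(good)[0])  # pass
print(inspect_frame(bad)[0])   # reject
```

The point of the pattern matters more than the arithmetic: the latency-critical accept/reject decision never leaves the production line.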
Cloud computing serves a different function for data processing. “Cloud should be reserved for heavy computation,” continues Antunes. “Any use case that requires immediate action doesn’t support the cloud case.”
“That’s the next level north — going to a server [in a data center] or to cloud,” says Hullum. “Then you can relearn that machine at the edge. You go to the cloud and connect more data points, say, multiple factors and different materials on other machines.”
Cloud also offers other benefits. “Going to the cloud makes sense when interfacing with enterprise systems — to get inputs from a wider pool of data,” Hullum says. “Ecosystem elements may dictate when you need cloud input and access to information from outside the plant.”
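The "relearn that machine at the edge" loop Hullum describes can be sketched as a cloud-side step that pools readings from many machines and plants, recomputes a shared parameter, and pushes it back down to edge devices. The names and the simple averaging rule below are hypothetical stand-ins for real model retraining.

```python
# Hypothetical cloud-side aggregation: combine good-part readings from a
# fleet of machines (a wider data pool than any single edge node sees) and
# recompute a baseline that each edge device would then be updated with.

def recompute_baseline(readings_by_machine):
    """Average good-part intensity across all machines (illustrative update rule)."""
    all_readings = [r for readings in readings_by_machine.values() for r in readings]
    return sum(all_readings) / len(all_readings)

fleet = {
    "plant_a/line_1": [126, 129, 131],
    "plant_b/line_4": [133, 127],
}
new_baseline = recompute_baseline(fleet)
print(round(new_baseline, 1))
```

The cloud's role here is breadth, not speed: it connects data points across factories and materials that no individual edge node can observe.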
Combine Compute Options, Including Fog Computing
“Edge, fog, cloud – these are not separate options,” says Antunes, “they are synergetic. Edge is a part of fog computing, encompassed by fog, as is cloud. Based on your requirements, you will use the compute elements to implement the resources you need. And fog is what holds it all together.”
While cloud computing carries data back to a central server for storage and analysis, fog computing enables analytics and other functions to be performed at a network’s edge, right at the data source.
“Amid all of this data transmission, fog computing is the missing link,” continues Antunes, “making sense of what to push to the cloud and what to keep in infrastructure.”
To better explain Cisco’s conception of IoT, especially where fog computing fits into it all, Antunes invokes the idea of a connected airplane. Aircraft collect a lot of data in flight. Some of it is acted upon immediately at the edge, such as data connected to collision avoidance.
Once the plane lands, some of its data is pushed to the cloud, where it is reviewed and analyzed. This data might support adjustments to fuel use and routes for other flights.
Other data is directed by fog computing to additional compute resources on the aircraft to do analytics while the plane is in the air. If anomalies are detected in an instrument or part, that data can be transmitted via satellite to the destination airport where a replacement can be arranged upon the aircraft’s arrival.
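The routing logic in this airplane example can be sketched as a small dispatcher. In this hypothetical version, each sensor reading carries a latency budget, and the router decides whether it is acted on immediately at the edge, analyzed by on-board (fog) compute, or queued for the cloud after landing. Field names and thresholds are illustrative assumptions, not any real avionics interface.

```python
# Illustrative fog-style routing: pick a compute tier per sensor reading.

def route_reading(reading):
    """Choose 'edge', 'fog' or 'cloud' for one reading (assumed schema)."""
    if reading["latency_budget_ms"] < 50:
        return "edge"    # e.g., collision avoidance: act immediately, in place
    if reading["anomaly"]:
        return "fog"     # on-board analytics; could alert the destination airport
    return "cloud"       # post-flight review: fuel use, route optimization

readings = [
    {"sensor": "proximity", "latency_budget_ms": 10, "anomaly": False},
    {"sensor": "engine_vibration", "latency_budget_ms": 5000, "anomaly": True},
    {"sensor": "fuel_flow", "latency_budget_ms": 60000, "anomaly": False},
]
for r in readings:
    print(r["sensor"], "->", route_reading(r))
# proximity -> edge
# engine_vibration -> fog
# fuel_flow -> cloud
```

This is the "missing link" role Antunes describes: fog computing decides what must stay local and what can wait for the cloud.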
“For most everyone, it’s some combination of edge computing, fog computing, on-premises and cloud,” says Enderle. “It’s often a combination of different computing situations. You’ll keep it on-premises for main operations and security computing, go to the cloud for data analysis, use edge computing for real-time computing needs.”