Aug 30 2022

When Should You Opt for Edge Computing Versus the Cloud?

Edge computing processes incoming data at the point where it's created, which enables agility and acts as a buffer against internet connectivity issues. Yet it lacks some of the cloud's advantages.

The rise of the Internet of Things has been astronomical. By 2030, the number of connected IoT devices worldwide is expected to be just over 29 billion. For enterprises that need to monitor and respond to incoming data, edge computing will become as important as cloud migration has been to date.

However, for businesses, decisions about edge or cloud are unlikely to be binary. “It's not an either-or, as you're not going to just have cloud capabilities, and you're not going to just have edge capabilities,” says Michele Pelino, who leads Forrester’s edge computing and IoT research.

Rather, many businesses in a decentralized world will need to carefully assess their needs. “You have to think about the use cases that are going to be helpful in your own organization, the ones you're already deploying, and the ones that may be down the road,” says Pelino. Here are some of the qualities of edge and cloud, along with which option is best suited for which business need.


Edge and Cloud Computing Have Different Levels of Latency

For many enterprises with minimal computing, data handling and local machine needs, the cloud can minimize latency well enough. An organization using, for example, a customer relationship management program is better served by maintaining strong control of data centralized in the cloud.

Edge computing’s extremely low latency becomes necessary for organizations when their data is more distributed (think IoT devices) and they depend on real-time processing and responsiveness. Such use cases include autonomous machines, facial recognition, predictive maintenance, and sensor networks in manufacturing and agriculture. If a self-driving car is about to hit a pedestrian, or a warehouse robot is about to cause a critical error, poor latency can be disastrous.

The cloud, nonetheless, often remains involved. “There is still a need to have information come in centrally,” says Rob Clyde, a former ISACA board chair and board director of IoT cybersecurity company Crypto Quantique. “For example, you may well do some functional capabilities, like processing literally at that edge location, to make a decision right then and there. But you may also then send some of that data back into the cloud environment for additional artificial intelligence and machine learning.”
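The pattern Clyde describes — decide locally, then forward a subset of data to the cloud — can be sketched in a few lines. This is a minimal illustration, not any vendor's API; the function name, thresholds and sensor IDs are all hypothetical.

```python
def process_at_edge(readings, alert_threshold):
    """Split readings into latency-critical local alerts and a batch
    destined for cloud-side analytics. Purely illustrative."""
    local_alerts = []   # acted on immediately, at the edge
    cloud_batch = []    # forwarded later for additional AI/ML processing
    for sensor_id, value in readings:
        if value >= alert_threshold:
            # The time-sensitive decision happens here, without a round
            # trip to the cloud.
            local_alerts.append((sensor_id, value))
        else:
            cloud_batch.append((sensor_id, value))
    return local_alerts, cloud_batch

alerts, batch = process_at_edge(
    [("temp-1", 72), ("temp-2", 104), ("temp-3", 68)],
    alert_threshold=100,
)
```

In this sketch, only the out-of-range reading triggers an immediate edge-side response; everything else is queued for the central environment, matching the hybrid model both analysts describe.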

Handling Cybersecurity in the Cloud Versus at the Edge

Cloud service providers offer businesses considerable peace of mind on cybersecurity because they assume responsibility for protecting a company’s data. Edge computing, however, makes cybersecurity more complicated.

“When you start adding in many types of connected edge devices in more remote locations, you are adding layers of potential attack surface,” says Forrester’s Pelino. As a result, adopting edge computing — and its widely distributed endpoints — requires significant investment and ownership. Creating and executing a rigorous cybersecurity strategy that allows for protection and monitoring demands money, time and the efforts of an internal IT department or external partner.


Edge Computing Can Incur Additional Equipment Costs

Among the cloud’s benefits are the cost savings that result from the elimination of expenses for hardware, IT staff, electricity and more. Everything a business needs is provided by the cloud service provider.

Edge computing reintroduces the need for significant physical infrastructure and management. “You may need actual localized servers and data storage near the edge,” explains Clyde. “You will need beefy computers.” The cloud is no longer enough. An organization has to return to owning, managing and supervising its own equipment, all of which adds expenses back to the bottom line.

That said, there are alternatives. “Equipment management can be outsourced to another vendor, who can provide racks that don’t introduce unwanted latency. They can manage that location, keep it secure and provide remote hands if you need to restart something,” says Clyde. “Now, you're not managing the rack; they are.”

There may also be savings to offset added expenses; notably, by reducing cloud storage needs. “You don't need to have everything go back into the cloud for additional processing,” says Pelino. “You're only going to send specific things back to the cloud. So, you're avoiding some of the additional expenses.”
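Pelino's point about sending back "only specific things" is often implemented by summarizing raw readings at the edge and transmitting the summary instead of every sample. A minimal sketch, with an assumed summary shape (count, min, max, mean) chosen purely for illustration:

```python
def summarize_for_cloud(samples):
    """Reduce a window of raw sensor samples to a compact summary
    before uploading, cutting cloud transfer and storage costs.
    The summary fields here are illustrative, not a standard."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

# A day's worth of raw readings stays at the edge; only four numbers
# travel to the cloud.
summary = summarize_for_cloud([68, 70, 72, 104, 66])
```

Even this toy example shows the economics: five values shrink to one small record, and at IoT scale that reduction compounds into real savings on bandwidth and cloud storage.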

At the end of the day, adding edge computing to existing cloud initiatives will be an investment that pays off. Research firm Gartner has found that by 2025, about 75 percent of business data will be created outside of the cloud or central data centers. The sooner companies jump on board, the better.

