Businesses’ efforts to decentralize their computing resources through hybrid cloud deployments have shined a light on a separate but related tactic: edge computing, in which organizations deploy data center resources on-premises, in remote locations or in colocation facilities.
Two universal principles define edge computing. First, it is distributed, with computing and processing taking place away from a centralized data center or cloud. Second, it is location-specific, with key computing elements physically placed where data is created or used.
Why Deploy Edge Computing?
Typically, four business needs drive IT leaders to make the leap into edge computing, says Dave McCarthy, a research vice president in IDC’s Cloud and Edge Infrastructure Services practice: the need to access data at faster speeds and reduce latency; to improve security and compliance or achieve data sovereignty; to control costs; and to ensure business continuity or resiliency.
Whether looking to control costs or overcome some of the hurdles of moving data around, businesses increasingly opt to shift processing and computing activities to the location where that data is generated. Doing so typically eliminates the costs otherwise associated with consumption-model cloud storage, because data that doesn’t need to be stored can instead be leveraged for immediate or real-time insight, then discarded.
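The process-then-discard pattern described above can be sketched in a few lines. This is a hypothetical illustration, not a real product API: the function names and the summary fields are assumptions, but the shape is the point: raw readings are reduced to a compact insight on the edge node, and only that summary would ever leave the site.

```python
# Hypothetical sketch: an edge node summarizes raw sensor readings locally,
# forwarding only a small aggregate instead of paying to store every sample
# in consumption-model cloud storage. All names here are illustrative.

def summarize(readings):
    """Reduce a batch of raw readings to the few numbers worth keeping."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def process_batch(readings):
    # Real-time insight is computed on-premises, where the data is created.
    summary = summarize(readings)
    # The raw readings fall out of scope here and are simply discarded;
    # only the compact summary would be sent upstream.
    return summary

print(process_batch([21.0, 22.5, 21.7, 23.1]))
```

In a real deployment, the summary would be shipped to a central cloud service on whatever cadence the business requires; the raw samples never accrue storage costs.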
Edge computing also ensures that businesses make more immediate use of their valuable data, rather than simply storing it to make sense of it sometime in the future, which McCarthy equates to “stuffing money in a mattress.” Manufacturing or industrial use cases, especially, require faster data processing when milliseconds matter to quality or safety outcomes.
While edge computing should not be considered a replacement for the cloud, it is a complementary technology or approach that solves some of the limitations of centralized cloud architectures, McCarthy says.
How to Optimize Edge Computing in a Hybrid Cloud Environment
Edge computing architecture encompasses several tiers of infrastructure. A colocation facility, for example, can operate as an edge. Many telecommunications providers are also creating deployment locations to enable edge computing, an example of a provider-managed edge location.
A business may also choose to operate its own data center in a retail store, factory or satellite location. Regardless of the scenario, proper planning and consideration of these three priorities will ensure any infrastructure deployed at the edge can be optimized to perform and deliver against mission-critical priorities.
- Determine the appropriate location for edge assets. Optimizing the edge requires understanding precisely where all required applications will run best within an organization’s security, budget and performance requirements. Many businesses start by experimenting with edge, trying out a few different deployment scenarios, then optimizing key elements based on those initial experiments. The answer won’t always be the same, McCarthy advises.
- Don’t assume a cloud-native application will function in the same way on the edge. Cloud-native applications often operate differently when they run on edge-computing assets. That can lead to unfavorable results. Understand whether the application can be scaled to the edge and whether and how it will optimize data flows. What data must be kept local and what can be sent to the cloud? “Many vendors now understand there needs to be a complement to cloud-native,” McCarthy says. “That might be referred to as edge-native, which uses the same constructs, but perhaps only certain elements or functions within that application will run out on the edge versus back in the cloud.”
- Avoid deploying piecemeal components and custom or bespoke solutions. When edge computing first started to gain traction in business, many users “cobbled together their own Frankenstein solutions,” McCarthy says. “Bundled solutions now include the hardware and the software, and more often are available as a turnkey service.”
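The edge-native data-flow question raised in the second point — what data must be kept local and what can be sent to the cloud — comes down to a routing decision per record. The rules and field names below are purely illustrative assumptions, but they show how the split McCarthy describes might look in practice.

```python
# Hypothetical sketch of an "edge-native" data-flow split: a simple rule
# decides which telemetry records stay on the edge node and which are sent
# back to the cloud. The field names and rules are illustrative only.

def route(record):
    """Return 'local' or 'cloud' for a single telemetry record."""
    if record.get("latency_sensitive"):
        return "local"   # millisecond decisions must happen at the edge
    if record.get("regulated"):
        return "local"   # data-sovereignty rules keep it on-premises
    return "cloud"       # everything else can be aggregated centrally

records = [
    {"id": 1, "latency_sensitive": True},
    {"id": 2, "regulated": True},
    {"id": 3},
]
print([route(r) for r in records])  # → ['local', 'local', 'cloud']
```

An edge-native application would apply rules like these continuously, running only the latency- and sovereignty-sensitive functions at the edge while the rest of the application remains in the cloud.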
Turnkey solutions may be deployed horizontally (where a partner provides the required infrastructure and a business adds its own applications on top, such as with HPE GreenLake) or vertically (featuring an industry-specific application, such as Microsoft Azure for Financial Services).
Businesses’ top goal when deploying edge computing, after all, is to minimize the gap between collecting data and achieving business outcomes, and these latest edge solutions enable that.
Most of the same infrastructure and equipment common to the data center will be deployed as part of an edge computing environment: a server, storage, connectivity and a platform for applications to run on. The chief difference is that the edge infrastructure will be deployed in smaller configurations. Instead of determining how to scale up a cluster in a data center, teams instead will need to figure out how to scale out to hundreds or maybe thousands of locations.
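The scale-out challenge above can be made concrete with a small sketch. The site names and configuration fields are assumptions for illustration: the idea is that rather than sizing one large cluster, an operations team stamps the same small, standardized configuration out to every edge location.

```python
# Hypothetical sketch: scaling OUT one small, identical configuration to
# many edge sites, rather than scaling UP a single data-center cluster.
# The config shape and site names are illustrative assumptions.

BASE_CONFIG = {"servers": 2, "storage_tb": 4, "platform": "containers"}

def scale_out(sites):
    """Produce one identical small deployment per edge location."""
    return {site: dict(BASE_CONFIG, site=site) for site in sites}

# e.g., three retail stores, each getting the same compact footprint
deployments = scale_out([f"store-{n:04d}" for n in range(1, 4)])
for name, cfg in deployments.items():
    print(name, cfg)
```

Extending the site list from three locations to a thousand changes nothing about the per-site configuration, which is precisely what makes the scale-out model manageable.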