Dec 13 2022

Tech Trends 2023: Predictive Analytics, Edge Computing and Zero Trust Will Be Hot

Experts predict ‘evolution, not revolution’ in 2023. That sounds nice, for a change.

If the past several years have proved anything, it’s that trying to predict the future is a fool’s errand.

Prior to 2020, no one could have foreseen a global pandemic that would send workers home and leave businesses scrambling to buy up devices and implement remote work solutions. And who was expecting the kind of brutal supply chain challenges we saw, hampering tech rollouts and driving up costs?

Still, looking back, the organizations that were best prepared for the unexpected were those that had already been adopting emerging technologies that made their environments more agile, secure and user-friendly. Similarly, 2023 is likely to see organizations increase their investments in three areas that forward-looking IT and business leaders have already begun to embrace: cloud computing and edge computing to promote agility, availability and performance; predictive analytics to deliver data-driven business insights; and zero-trust security measures that help protect IT environments in a world where physical perimeters have disappeared.

“We’re expecting to see reasonably straightforward moves ahead along reasonably known lines,” says Mike Bechtel, chief futurist for management consultancy Deloitte. “These are evolutions, not revolutions.”


Edge Computing Takes Its Place Beside the Cloud

It may be a stretch to consider cloud computing an “emerging trend” in 2023, but this could be the year that businesses finally make a decisive shift away from on-premises data center infrastructure — especially for new and expanding workloads.

At the same time, organizations are increasingly embracing edge computing infrastructure that helps to improve application performance and speed even as they continue to move away from centralized data centers.

“To the extent that you’ve got bottlenecks, needs, questions worth asking or feel otherwise stuck, the cure for what ails you is going to be out there somewhere in the form of cloud tools and edge computing,” Bechtel says.

Harmonic, a San Jose, Calif.-based organization that offers streaming and broadband services, is using a mix of cloud and edge computing to deliver services to its customers, many of which are media companies that distribute video content on television and the internet.

Stephane Cloirec, Harmonic’s vice president of video appliance product management, says that the media and entertainment sector is moving away from satellite and toward a hybrid cloud model to deliver content. “We see more and more traction for the cloud when it comes to distribution,” he says. “But that does not take away our customers’ need for edge infrastructure to repurpose the content into formats that are suitable for their own distribution networks.”


Edge computing is especially important, Cloirec says, for use cases where the sheer volume of data makes the cloud impractical from a performance perspective. By having powerful edge infrastructure in place, organizations can decrease the distance that data must travel for certain applications, reducing latency and improving reliability. At the edge, Harmonic has deployed HPE ProLiant servers to its customers’ physical data centers, helping to improve performance and speed for data-intensive applications.
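The latency math behind that argument is straightforward. A back-of-the-envelope sketch (the distances are illustrative, not Harmonic's actual deployments) shows why shortening the path to an edge site matters for data-intensive applications:

```python
# Light travels through optical fiber at roughly 200,000 km/s,
# i.e. about 200 km per millisecond of one-way propagation delay.
C_FIBER_KM_PER_MS = 200

def one_way_delay_ms(distance_km):
    """Best-case one-way propagation delay over fiber, ignoring routing and queuing."""
    return distance_km / C_FIBER_KM_PER_MS

# Hypothetical distances: a far-away cloud region vs. a nearby edge site
print(one_way_delay_ms(4000))  # distant cloud region: 20.0 ms each way
print(one_way_delay_ms(50))    # local edge site: 0.25 ms each way
```

Real round-trip times are higher once routing, queuing and processing are added, but the proportional gap between cloud and edge persists.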

Across industries, Cloirec notes, adoption of the public cloud has rapidly become widespread, with even longtime skeptics finally turning to cloud resources. “The COVID-19 pandemic was a drastic accelerator for cloud adoption,” he says. “We have customers that would not have considered moving their workflows to the cloud beforehand, and that’s no longer the case.”

Cloirec notes that some of Harmonic’s customers could not get their employees back into their facilities to operate their equipment: “That has forced those customers to think differently and explore how to shift to a remote model.”


Predictive Analytics Expected to Emerge

For years, IT and business leaders have been promised predictive analytics tools that use historical data, statistical modeling and machine learning to go beyond mere analysis of present conditions and actually forecast business outcomes. Those tools are finally maturing to the point where businesses are deploying them in earnest.
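At its simplest, that forecasting idea is just fitting a model to historical data and extrapolating. A minimal sketch, using ordinary least squares on hypothetical monthly sales figures:

```python
# Minimal predictive-analytics sketch: fit a trend line to historical
# monthly sales and forecast the next month. Data is illustrative only.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    return a, mean_y - a * mean_x

months = [1, 2, 3, 4, 5, 6]           # month index
sales = [100, 108, 118, 125, 137, 145]  # hypothetical units sold

slope, intercept = fit_line(months, sales)
forecast_month_7 = slope * 7 + intercept
print(round(forecast_month_7))  # ~154 units forecast for month 7
```

Production platforms layer far richer statistical models and machine learning on top, but the workflow — learn from the past, predict the future — is the same.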

“Data science has gotten to a speed and capability where it’s helpful, not pesky,” Bechtel says. “Five years ago, predictive analytics would tell you what you should have done differently — too late to be useful. They were like the Star Wars sidekick C-3PO, chirping over your shoulder. Today’s artificial intelligence and machine learning capabilities are more like Chewbacca, a co-pilot that gets you out of harm’s way.”

Polygon Research, a data science company serving the mortgage industry, uses a predictive analytics platform from Qlik to create tools for its customers that help them to comply with fair lending regulations, uncover hidden geographical patterns and make predictions about business outcomes.

“The water level of data is rising all the time, and there are a lot more things that you can do with all of this data at your fingertips than most organizations are currently doing,” says Polygon Research CTO Greg Oliven. “There’s money to be made, certainly, for businesses. Companies have been focused for 10 to 15 years on monetizing their data, but now there are new ways to bring this data together to answer questions and provide insights.”


Oliven warns that it’s important for organizations to ensure data quality before trying to use that information to make predictions. “As with anything, it’s garbage in, garbage out,” he says. “That’s always the potential pitfall. Having solid processes about your entire data flow is very important.”
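One common way to operationalize that advice is a validation gate that rejects bad records before they reach the model. A hedged sketch — the field names and rules below are hypothetical, not Polygon Research's actual pipeline:

```python
# Data-quality gate: screen records for missing or implausible values
# before they feed a predictive model. Fields and thresholds are invented
# for illustration.

def validate_record(record):
    """Return a list of data-quality issues; an empty list means the record passes."""
    issues = []
    if record.get("loan_amount") is None:
        issues.append("missing loan_amount")
    elif record["loan_amount"] <= 0:
        issues.append("non-positive loan_amount")
    zip_code = record.get("zip_code", "")
    if not (zip_code.isdigit() and len(zip_code) == 5):
        issues.append("malformed zip_code")
    return issues

records = [
    {"loan_amount": 250_000, "zip_code": "53601"},
    {"loan_amount": -5, "zip_code": "ABCDE"},
]
clean = [r for r in records if not validate_record(r)]
print(len(clean))  # only the first record passes
```

Logging the rejected records, rather than silently dropping them, is what turns a filter like this into the kind of solid end-to-end data-flow process Oliven describes.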

Previously, Oliven says, Polygon Research worked with a different data platform provider, but it ran into limitations around complexity and scalability. One thing he appreciates about Qlik is the low- and no-code capabilities for predictive insights.

“The tools are getting more capable, and I think they’re becoming more oriented toward the citizen data scientist, if you will, putting more capabilities into the hands of folks to easily get started and get insights quickly. That’s a big trend we’re seeing right now.”

Zero-Trust Security Becomes Ubiquitous

Even five years ago, the concept of zero-trust security architecture was a reach for all but a handful of organizations. Today, even federal agencies, historically slow to embrace emerging technologies, are engaging in mandated zero-trust programs, and organizations across industries are rapidly moving from planning to implementation.

“In the past, you would build a moat. But then, if someone got in, they had free rein of the whole castle,” Bechtel says. “Zero trust flipped the script and said, ‘We’re going to lock every room, so even if someone gets in, they only have access to that one square meter.’”

Josh Hamit, CIO of Altra Federal Credit Union in Wisconsin and a member of the ISACA Emerging Trends Working Group, says that Altra has deployed a zero-trust platform from vendor Illumio, but he stresses that a zero-trust strategy is an entire set of solutions and practices, not just a single tool. “I don’t necessarily think that you can implement any one product and then say that you’ve got zero trust in place,” Hamit says. “In that way, it may be a bit of an overused phrase. You need to have layered security in your environment to really have zero trust.”


Altra’s zero-trust platform provides network segmentation, which is designed to stop attackers who penetrate an organization’s first layers of defense.
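Conceptually, microsegmentation is a default-deny policy: nothing talks to anything unless a rule explicitly allows it. A minimal sketch (the segment names and rules are hypothetical, not Altra's or Illumio's actual policy):

```python
# Default-deny microsegmentation sketch: a flow is permitted only if the
# (source segment, destination segment, port) tuple is on the allow list.
ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),  # web servers may call the app tier
    ("app-tier", "db-tier", 5432),   # only the app tier may reach the database
}

def is_allowed(src_segment, dst_segment, port):
    """Everything not explicitly allowed is denied."""
    return (src_segment, dst_segment, port) in ALLOWED_FLOWS

print(is_allowed("web-tier", "app-tier", 8443))  # True
print(is_allowed("web-tier", "db-tier", 5432))   # False: web can't reach the database directly
```

An attacker who compromises a web server is thus confined to that segment — Bechtel's "one square meter" — instead of roaming the whole network.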

Altra also relies on tools such as Azure Active Directory Conditional Access and Microsoft Authenticator. The Conditional Access tool denies access to systems and data when it detects obviously suspicious behavior — for example, when an employee’s account shows someone attempting to access a file from the U.S. and then trying again 10 minutes later from a foreign country. Authenticator, meanwhile, provides multifactor authentication.
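That "impossible travel" pattern can be illustrated with a simple speed check: if two sign-ins imply faster-than-airliner travel, deny access. This is a sketch of the idea, not Azure AD Conditional Access's actual detection logic:

```python
# "Impossible travel" sketch: flag a pair of sign-ins whose implied travel
# speed exceeds what any flight could achieve. Threshold is illustrative.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def is_impossible_travel(login_a, login_b, max_speed_kmh=900):
    """Each login is (lat, lon, timestamp_in_hours). True if the implied speed is implausible."""
    dist = haversine_km(login_a[0], login_a[1], login_b[0], login_b[1])
    hours = abs(login_b[2] - login_a[2])
    if hours == 0:
        return dist > 0
    return dist / hours > max_speed_kmh

# A U.S. sign-in, then a sign-in from overseas 10 minutes later
us = (38.9, -77.0, 0.0)            # Washington, D.C.
overseas = (51.5, -0.1, 10 / 60)   # London, 10 minutes later
print(is_impossible_travel(us, overseas))  # True: deny access
```

Real conditional-access policies combine many such signals (device health, IP reputation, sign-in risk scores), but the never-trust, always-verify principle is the same.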

“What’s important isn’t the name or the specific tools,” Hamit says. “What’s important is the principle: Never trust, always verify. Instead of assuming that the bad guys are out on the perimeter, you treat them as if they’ve made it inside your network, and then you put in the appropriate controls to stop them.”

Illustrations by Brian Stauffer