Full-powered desktop PCs and notebooks have long been the engines that drive business users. This hardware takes advantage of processors that become more powerful with every refresh cycle, runs full portfolios of business applications and can house the storage and memory needed to keep most users working efficiently.
However, this tried-and-true approach also has some notable trade-offs. For one, it can be expensive to manage, maintain and secure.
Enter client virtualization, which places stripped-down hardware on desktops and uses corporate networks to connect that hardware with applications and data managed by IT departments in back-end data centers. It’s the same basic idea as the old data terminals that plugged into hulking mainframes to tap their computing power, but with all the innovations you’d expect after the intervening years of the Information Age.
This raises the question: Despite their challenges, desktop PCs have served businesses well for decades, so why mess with a good thing? The short answer is that, for the right applications and users, client virtualization offers the best of both worlds, with the bonus of delivering on the do-more-with-less goals that drive most organizations today.
What’s so good about client virtualization? At the top of the benefits list is the potential for lower IT costs. “Our studies have found that VMware View can reduce desktop total cost of ownership by up to 50 percent,” says Betty Junod, product marketing specialist for end-user computing at VMware.
Costs go down for a number of reasons, starting with lower upfront investment requirements for hardware. Traditional PCs have an expected life span of about three to five years, but in today’s demanding computing environments, that timeframe pushes the hardware to its limits.
“In terms of performance, you never want to be on the wrong end of a four- or five-year PC refresh cycle,” says Kevin Strohmeyer, senior product marketing manager for Citrix Systems’ enterprise desktop and applications group. But because virtualized client devices don’t do the heavy lifting when it comes to running applications and processing data, older PCs — even those past their fifth year — often work just fine.
Alternatively, IT managers may opt for thin- or zero-client computers. Thin clients, typically about the size of a small book, contain only a CPU, some memory and connectors to networks, monitors and keyboards. Noticeably absent, by design, is a hard-disk drive, which is unnecessary because all the data and applications reside on data center servers.
Zero clients are even more minimal. Unlike their thin-client cousins, they lack an embedded operating system, which is instead served up by server hosts. Zero clients range from book-size devices to compact modules that mount on the back of a monitor.
Data center management of IT resources, the key characteristic of client virtualization, is another cost saver. Centralized management gives the IT department a unified view into the status of all enterprise resources via management consoles.
This means technicians can resolve problems without having to visit each affected desktop. The end of “sneaker net” support and other benefits of centralized management can often let businesses double the number of users assigned to each IT administrator, which boosts IT productivity, according to VMware’s Junod.
It also means the IT staff can proactively address problems to increase application uptime and keep everyone working more efficiently. Microsoft, which offers a variety of desktop virtualization options, including Windows Server 2008 R2 Remote Desktop Services and Hyper-V, provides a tool to help companies gauge the total cost of ownership to determine the business benefits of these investments.
Centralization also helps the IT staff roll out resources faster than in traditional desktop PC environments, where you first need to acquire the device, install it on a desktop and load all the necessary software and settings. By contrast, with virtualized clients, when the bare-bones device is connected to the corporate network, the full work environment is available, often within minutes.
Fast deployments are especially important after a large acquisition, Strohmeyer points out. “Suddenly you have 5,000 new employees, and you can light them up as fast as you can enter them into your corporate directory,” he says.
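The directory-driven rollout Strohmeyer describes can be sketched in simplified form. The function and template names below are hypothetical, standing in for whatever management API a given virtualization platform exposes; the point is that each new directory entry maps to a desktop cloned from a standard template, with no per-desk setup.

```python
# Minimal sketch of directory-driven desktop provisioning.
# All names here are illustrative; a real deployment would call the
# virtualization platform's own management API instead.

def provision_desktops(new_users, template="standard-desktop"):
    """Assign each new user a virtual desktop cloned from a template."""
    assignments = {}
    for user in new_users:
        # In practice this step would clone a golden image in the
        # data center; here we simply record the assignment.
        assignments[user] = f"{template}-{user}"
    return assignments

# Two newly acquired employees appear in the corporate directory:
desktops = provision_desktops(["asmith", "bjones"])
```

Because the work environment lives in the data center, adding 5,000 users is a matter of iterating over 5,000 directory entries rather than staging 5,000 physical machines.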
Benefits continue after the initial setup. If end users need to access their applications and data in a conference room, on the shop floor or in someone else’s office, they just log in to the network and their entire workspace appears, regardless of where they are.
[Sidebar graphic — source: CDW Client Virtualization Straw Poll]
“Centralized services mean employees can work when they want, how they want and on the device that best fits their needs,” Junod says. “It also allows IT departments to embrace new device types, such as iPads and Android tablets, because it enables access to Windows desktops and applications on the go.”
Dynamic provisioning is another plus. If a workgroup in the finance department needs extra resources to close the month-end books, for example, IT managers simply allocate the additional power to the desktops by drawing on a pool of virtual servers in the data center. When the crunch is over, the extra processing power goes back into the pool, ready to serve someone else’s deadline crunch.
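The allocate-and-return cycle behind dynamic provisioning can be modeled with a simple pool abstraction. This is an illustrative sketch only — the class and group names are hypothetical — but it captures the mechanics: a workgroup borrows capacity from a shared pool for a deadline crunch and releases it back when the crunch ends.

```python
# Illustrative model of a shared virtual-server resource pool.
# Names are hypothetical; real platforms manage this via their
# own provisioning consoles and APIs.

class ResourcePool:
    def __init__(self, total_cpus):
        self.available = total_cpus   # unallocated capacity
        self.allocations = {}         # group -> CPUs currently borrowed

    def allocate(self, group, cpus):
        """Temporarily assign extra capacity to a workgroup."""
        if cpus > self.available:
            raise ValueError("not enough capacity in the pool")
        self.available -= cpus
        self.allocations[group] = self.allocations.get(group, 0) + cpus

    def release(self, group):
        """Return a workgroup's borrowed capacity to the pool."""
        self.available += self.allocations.pop(group, 0)

pool = ResourcePool(total_cpus=64)
pool.allocate("finance", 16)   # month-end close gets extra power
pool.release("finance")        # capacity goes back for the next crunch
```

The same borrowed capacity is then free to serve the next group’s deadline, which is what makes a centrally managed pool more efficient than sizing every desktop for its peak load.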
Updating user authorizations, installing new security patches and complying with regulatory requirements can be a full-time job in an environment dominated by desktop PCs. Client virtualization relieves those pressures because it concentrates these important activities within the data center, not across dozens or thousands of desktops. And for companies in highly regulated industries, this data-centric approach can keep close tabs on everyone who tries to look at sensitive data.
“Security managers know exactly when a user accessed the information and from what location,” Citrix Systems’ Strohmeyer says.
There’s a lot to like with client virtualization. But as with most IT initiatives, careful upfront planning can mean the difference between success and mixed results.
Strohmeyer cautions IT managers to curb their return-on-investment enthusiasm. “The danger is with the word ‘virtualization,’” he says.
In recent years, server virtualization has become synonymous with relatively quick ROI resulting from money-saving hardware and facility consolidations. The desktop version can also deliver impressive returns, but IT organizations need to be patient.
The reason: Upfront infrastructure investments may be needed to accommodate the new desktop model, and ongoing investments may be required as organizations scale out client virtualization to larger numbers of end users, Strohmeyer says.
To choose the right desktop technology, devices and infrastructure components to support client virtualization, IT managers need to define their business goals. “Clearly identify business triggers that can be impacted by delivering desktops differently,” Junod says.
She also advises businesses to assess the capability of their IT staff early in the process to determine who’s available to help with deployments and what outside expertise, if any, might be needed to augment the IT department.
Next, businesses should conduct a thorough IT assessment, looking at everything from the capacities of the data center servers that will be supporting the new desktop infrastructure to the associated network and storage systems.
“Networks and storage resources warrant particular attention,” says Joe Jessen, director of professional services at the Gotham Technology Group, which runs a virtualization consulting practice. “Because desktop virtualization requires a constant flow of data, applications and images between desktops and data centers, IT managers need to provide the bandwidth necessary to support initial client virtualization launches and ongoing scale-outs.”
LAN and WAN optimization technologies may also be required to accelerate throughput and optimize network performance.
Similarly, the advantages of diskless virtualized devices will materialize only if firms have the right networked storage systems in place to handle hundreds or thousands of simultaneous users. Among today’s best storage choices for client virtualization are storage pools backed by storage area networks (SANs), which use high-speed Fibre Channel or Fibre Channel over Ethernet (FCoE) protocols to provide adequate response times. SANs also enable IT managers to allocate storage capacity on the fly to accommodate users’ dynamic requirements.
IT organizations also need an accurate inventory of the business applications that will be part of the virtualization solution. Included in this assessment should be a close review of software licenses. Desktop virtualization may require changes or updates to current contracts, and discussions with the vendor of each package may be necessary to avoid any subsequent legal issues.
Next, the IT team must decide which applications should be part of the virtualization plan and set an implementation timetable for each, giving priority to those programs that will provide quick benefits with the least disruption to users.
Managing end-user expectations is another important implementation requirement. Some skeptics will balk at giving up their personalized and high-powered desktop PC in favor of a stripped-down economy model managed and controlled by a central IT authority. Worries about “big brother” oversight should be addressed.
But client virtualization veterans say these initial concerns aren’t insurmountable. Education is key. Set aside time and resources to prepare end users for the changes and to outline the anticipated business benefits. Pilot projects will go a long way toward calming concerns that performance will suffer under the strain of an inadequate infrastructure.