Michael Sooley, chief information officer at San Francisco law firm Sedgwick, Detert, Moran & Arnold, was planning a server virtualization project using VMware. Sooley, whose firm has 850 employees at nine U.S. locations, wanted the flexibility to manage server resources on the fly. But during planning, it became clear that the storage needs for the VMware servers and for other storage-intensive projects could all be satisfied by one central storage network.
Sooley says that as his group examined the storage needs of each project, it became clear that isolated storage silos were unnecessary, despite each project's different requirements.
The CIO says some projects, such as virtualization, required reliable, high-speed storage for the VMware servers. Others, such as e-mail archiving and data backups, required high-volume, lower-speed storage. “All we needed to do was find the right storage system for our firm,” he explains.
Sooley says his prior experience with storage area networks led the team to focus on SANs instead of network-attached storage or direct-attached storage.
“With a SAN, I can provision storage to each virtual machine, just as with VMware, I can provision server resources,” he says. He recently purchased eight NetApp systems with a total of 140 terabytes of storage.
Sooley and many other IT professionals find that SANs (made by companies such as Hewlett-Packard, EMC, Brocade and NetApp) simplify administration by presenting a single view of storage usage and capacity. SANs also offer more flexibility and scalability, because administrators can add or subtract the storage allocated to any server on the fly, without physically swapping storage devices. It is also often easier to replace servers that boot from the SAN, because the replacement machine's software doesn't have to be reloaded onto new local storage. And by spanning multiple locations, SANs can be an effective piece of a data replication and disaster recovery architecture.
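The on-the-fly flexibility described above can be pictured as a shared pool from which capacity is granted to, or reclaimed from, any server without touching hardware. The toy model below is illustrative only, not any vendor's API; the class name and figures are assumptions (the 140TB pool echoes the Sedgwick purchase, but the server names and allocations are hypothetical).

```python
# Toy model (not any vendor's API) of on-the-fly SAN allocation:
# a shared pool grants capacity to any server and reclaims it later,
# with no physical drive swaps involved.

class SanPool:
    def __init__(self, capacity_tb: float):
        self.capacity_tb = capacity_tb
        self.allocations: dict[str, float] = {}

    def free_tb(self) -> float:
        # Capacity not yet handed out to any server.
        return self.capacity_tb - sum(self.allocations.values())

    def allocate(self, server: str, tb: float) -> None:
        if tb > self.free_tb():
            raise ValueError("pool exhausted")
        self.allocations[server] = self.allocations.get(server, 0.0) + tb

    def reclaim(self, server: str, tb: float) -> None:
        # Shrink a server's allocation; freed space returns to the pool.
        self.allocations[server] = max(0.0, self.allocations.get(server, 0.0) - tb)

pool = SanPool(capacity_tb=140.0)        # e.g. Sedgwick's 140TB of NetApp storage
pool.allocate("exchange-archive", 40.0)  # hypothetical workloads
pool.allocate("vmware-cluster", 60.0)
pool.reclaim("exchange-archive", 10.0)   # give unused space back to the pool
print(f"Free capacity: {pool.free_tb():.1f} TB")
```

With direct-attached storage, the `reclaim` step has no equivalent: space freed on one server's local disks cannot be reassigned to another server.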
Although shared storage is not new, Bob Laliberte, an analyst at Enterprise Strategy Group, points out that SANs are becoming less costly and are easier to install, making them a reasonable option for companies with small (or no) IT departments. “The complexity level of configuring and managing a SAN has dropped over the last few years,” he says.
Laliberte agrees with Sooley that for many companies, a SAN is part and parcel of the overall movement toward consolidation and server virtualization. “The biggest SAN driver in today’s market is the virtual machine architecture,” he says.
But Laliberte is quick to add that companies do not necessarily have to pair a SAN project with server virtualization.
For example, 28-employee SwervePoint in Danvers, Mass., which markets promotional merchandise with company logos, found itself adding computer resources, including storage, every year.
Kevin Phoenix, the company’s head of operations and administration, says when his company opened its doors five years ago, it ran with a single server. It quickly added three more servers, including a web server, all with local storage. Then Phoenix realized that the need for storage, not processing power, would drive future server growth.
“We came to the conclusion that, rather than throwing more boxes into the closet, we could get a SAN,” Phoenix says. He bought an EMC CLARiiON AX4 storage system with six 400 gigabyte drives and room for six more.
What is the primary driver for your company to deploy a storage area network?
34% Server virtualization
24% Improved data backup
18% We have no plans to deploy.
14% High-speed data access
6% E-mail archiving
The SAN lets Phoenix allocate storage as needed and add capacity for much less than it would cost to add new servers. He points out that, at least at his firm’s present scale, he has no need for virtualization because his server farm is still modest and physically easy to manage.
Even companies that do purchase SANs as part of a virtualization project often find that the technology delivers benefits beyond virtualization itself. Sooley linked the technology to a number of his firm’s IT initiatives, not just virtualization, and was accordingly able to fund it from several different project budgets.
One of the most important of these initiatives is disaster avoidance. He plans to store multiple versions of data locally and replicate data between NetApp SAN devices in different offices, providing the firm with a much more robust backup and recovery capability.
Sooley says the SAN also fits well with his firm’s need to manage its rapidly growing volume of e-mail more efficiently. By moving less frequently used e-mail attachments out of the Exchange storage groups and onto other high-speed storage space on the SAN, he can improve both the performance and the reliability of the Exchange e-mail systems.
ROI: A Moving Target?
Sooley is not a big believer in calculating an exact ROI. He says that at his previous job, a SAN reduced the capacity required by existing applications even as overall storage requirements kept growing, so ROI quickly became a moving target. Still, he does expect to achieve some cost savings by reducing the amount of storage he will need.
“If you are using only 50 percent of a 2TB array, that leaves 1TB that no other server can utilize. With a SAN, you can make use of most of that excess capacity,” Sooley says.
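Sooley's point is simple arithmetic: with direct-attached storage, each server's unused capacity is stranded, while a pooled SAN can be sized close to total demand. The back-of-the-envelope sketch below uses hypothetical per-server figures (not numbers from the article) to make the comparison concrete.

```python
# Back-of-the-envelope comparison of stranded capacity: direct-attached
# arrays vs. one shared SAN pool. All figures are hypothetical.

servers_used_tb = [1.0, 0.3, 0.6, 0.2]   # actual usage per server, in TB
das_array_tb = 2.0                        # each server gets its own 2TB array

# Direct-attached: unused space on each array is stranded there.
stranded = sum(das_array_tb - used for used in servers_used_tb)

# SAN: one shared pool sized to total demand plus modest headroom.
headroom = 0.20
san_pool_tb = sum(servers_used_tb) * (1 + headroom)

print(f"DAS purchased: {das_array_tb * len(servers_used_tb):.1f} TB, "
      f"stranded: {stranded:.1f} TB")
print(f"SAN pool needed: {san_pool_tb:.2f} TB")
```

Under these assumed numbers, the four-server DAS setup buys 8TB but strands 5.9TB of it, while a pooled SAN covers the same demand with roughly 2.5TB.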
Phoenix hopes for a two-year payback. He bought the SAN when he was running out of capacity on a number of servers, so he offset some of the cost of the SAN by not having to purchase new servers. He arrives at a two-year payback by estimating the number of servers he would have had to purchase compared with the cost of adding storage to his SAN.
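Phoenix's payback logic can be sketched as a small calculation: servers avoided each year, minus the cost of growing the SAN, against the up-front SAN purchase. The article reports no dollar figures, so every number below is an illustrative assumption.

```python
# Hypothetical sketch of Phoenix's payback estimate. All dollar figures
# are illustrative assumptions, not figures reported in the article.

san_purchase = 12_000.0                # assumed up-front cost of the SAN
cost_per_avoided_server = 4_000.0      # assumed cost of each server not bought
servers_avoided_per_year = 2           # servers he would otherwise add yearly
extra_san_storage_per_year = 1_500.0   # assumed cost of added SAN drives

annual_savings = (servers_avoided_per_year * cost_per_avoided_server
                  - extra_san_storage_per_year)

payback_years = san_purchase / annual_savings
print(f"Estimated payback: {payback_years:.1f} years")
```

With these assumed inputs the purchase pays for itself in just under two years, consistent with the payback horizon Phoenix describes; the real answer depends entirely on the actual server and drive prices.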