Many IT administrators are facing a new challenge: the incredible shrinking data center.
A variety of trends, led by cloud computing, virtualization and data center consolidation, are reshaping the look and size of traditional data centers. As applications and storage resources migrate to the cloud, compact, purpose-built IT environments are springing up to support the remaining in-house resources still running at headquarters or remote facilities.
But even though these applications and data reserves don’t require the support of a huge data center, they are still essential to the organization. “Smaller doesn’t mean less important. In fact, it could mean the opposite,” says Srdan Mutabdzija, solutions manager at APC by Schneider Electric. “Enterprises need to pay more attention to these sometimes forgotten spaces because they’re often the main connections to the organization’s customers and to its data in the cloud.”
The irony is that these core services may relocate to a network closet or other small room that wasn’t originally designed to house critical IT equipment. This fact often hits home when organizations encounter unscheduled downtime due to inadequate power and cooling capabilities. “Companies can pack an enormous amount of computing resources into a small space. But when it comes to power and cooling these environments, it’s definitely a case where one size does not fit all implementations,” says Charles King, principal analyst at the consulting firm Pund-IT.
Fortunately, downsized versions of proven technologies from traditional data centers can help IT managers provide proper levels of electricity and temperature management in these spaces to achieve high levels of performance and availability. The key is having a strategy geared for compact areas.
Emerson Network Power has a unique perspective on the small-space, distributed IT phenomenon. As a vendor of power and cooling technology, it helps customers design and outfit reliable IT closets and small rooms. In addition, the company recently underwent an internal data center consolidation project that takes advantage of smaller, distributed IT facilities.
First, Emerson consolidated its email system, manufacturing management processes and hosted applications within a central data center. “Then, each one of our divisions in essence began to phone home; they use a small IT closet as the conduit for information flowing to and from the central data center,” says Peter Panfil, vice president of global power at Emerson Network Power. “This illustrates why we believe closets and small data rooms are actually gaining in importance. More frequently, they are becoming lifelines that connect enterprises to a centralized data center, hosted location or cloud facility.”
As a result, the days of seeing these rooms as a place to store some servers, network switches and routers next to the janitor’s mops and brooms are long gone. “IT managers expect the same from their wiring closets or small data rooms as they do from their enterprise data centers,” Panfil says. “They want the resources to be efficient, to be protected and to meet expectations for high availability. After all, if the edge of the network isn’t reliably available, it becomes a weak link.”
Recent research shows how costly IT breakdowns can be. The 2013 Cost of Data Center Outages study, conducted by the Ponemon Institute, found that unplanned data center downtime costs approximately $7,600 per minute, a major increase from $5,600 in 2010. Total unplanned data center outages averaged a recovery time of 119 minutes, equating to about $901,600 in total costs. In addition, the research estimated that partial outages or those limited to certain racks averaged 56 minutes in length, and costs were approximately $350,400.
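As a quick sanity check on these figures, dividing each reported total by its duration gives the effective per-minute cost. The sketch below uses only the dollar amounts and durations quoted above; the simple total-divided-by-duration model is an assumption, since real outage costs also include fixed detection and recovery components.

```python
# Effective per-minute outage cost, derived from the Ponemon study
# totals quoted above. Averaging total cost over duration is a
# simplification; actual outage costs are not purely linear in time.

def per_minute_rate(total_cost: float, minutes: float) -> float:
    """Average cost per minute of an outage."""
    return total_cost / minutes

full = per_minute_rate(901_600, 119)    # total unplanned outages
partial = per_minute_rate(350_400, 56)  # outages limited to certain racks
print(f"Full outage:    ~${full:,.0f} per minute")
print(f"Partial outage: ~${partial:,.0f} per minute")
```

Notably, the study's numbers imply that partial, rack-limited outages are not only shorter but also somewhat cheaper per minute than full outages.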
Problems with power and cooling in spaces not built for IT resources pose serious challenges to uptime. For example, ranking high in the Ponemon Institute’s list of root causes of downtime were uninterruptible power supply (UPS) battery failures, UPS capacity overloads, UPS equipment failures and failures in power distribution units (PDUs) or circuit breakers.
Problems may first surface when IT managers struggle to get enough power into these spaces to adequately run important equipment. For example, wiring closets and small rooms may be served by 20-amp wall outlets that are standard in commercial offices. But the power density of IT equipment may require retrofitting outlets to 30- or 60-amp services, says Graciano Beyhaut, senior product line manager for Eaton Transactional Power Products. “Enterprises may need to bring in a contractor to upsize the breaker and do some rewiring to get the proper outlet at the right frequency and power draw,” he says.
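The capacity question Beyhaut describes can be checked with simple math: a circuit's usable power is its breaker rating times the supply voltage, derated for continuous loads (the NEC's common 80 percent rule). The equipment wattages below are hypothetical, chosen only to illustrate when a standard 20-amp office circuit falls short.

```python
# Sketch: does a closet's IT load fit on a given branch circuit?
# Assumes a 120 V North American supply and the common practice of
# derating a breaker to 80% of its rating for continuous loads.

def usable_watts(breaker_amps: float, volts: float = 120.0,
                 derate: float = 0.8) -> float:
    """Continuous power a branch circuit can safely deliver, in watts."""
    return breaker_amps * volts * derate

# Hypothetical closet load: four servers plus two switches.
load_w = 4 * 500 + 2 * 150   # 2,300 W

for amps in (20, 30):
    cap = usable_watts(amps)
    status = "OK" if load_w <= cap else "OVERLOADED"
    print(f"{amps} A / 120 V circuit: {cap:.0f} W usable vs {load_w} W load -> {status}")
```

In this example the 20-amp circuit's 1,920 usable watts cannot carry the 2,300-watt load, while a 30-amp retrofit can, which mirrors the rewiring scenario Beyhaut describes.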
Cooling these tight, often unventilated spaces also requires close attention. “Some organizations haven’t properly optimized the cooling design, which means these facilities run too hot and too humid,” says Mutabdzija. “Also, we see messy cabling and wires all over the place, which can impair heat dissipation. The conditions inside the rooms often go unmonitored, so people don’t know about problems until something goes wrong with the IT equipment.”
To maintain reliable power supplies and sustain operations during brownouts or long-term disruptions, IT managers need UPSs that not only protect valuable equipment but also accommodate small spaces. UPSs designed for much higher capacity tend to be “big, bulky and heavy, built into a form factor more fitting for a conventional data center,” says Rich Feldhaus, product manager at Tripp Lite. That’s not the case when IT managers use compact wall-mount or open-frame racks to optimize space in closets and small rooms, rather than the conventional full-size racks common in traditional data centers.
Thinner two-post racks present problems when organizations use them to hold full-size UPSs designed for deeper four-post racks. “People sometimes call standard UPSs ‘kneecap busters’ when they’re in a wiring closet because they extend from the rack at kneecap level,” Feldhaus says. As a result, Tripp Lite and other UPS vendors now offer compact rack-mount UPSs with reduced depths. For example, Tripp Lite offers two models that are 16.8 inches deep, or 13 percent shallower than conventional units.
How much UPS capacity is enough? In some cases, organizations need enough battery life to support data backups and equipment shutdowns that take 30 to 60 minutes to complete. “It’s important to accurately calculate exactly how much run time you will need during an outage so you can select a right-sized UPS,” Beyhaut says.
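The run-time calculation Beyhaut recommends can be sketched as below. The battery capacities, load and inverter efficiency are hypothetical, and the linear energy-over-load approximation is an assumption; real UPSs publish run-time curves that account for battery discharge behavior and should be used for actual sizing.

```python
# Rough UPS run-time estimate: usable battery energy divided by load.
# Battery Wh figures and the 90% inverter efficiency are hypothetical;
# consult the vendor's run-time curves for real sizing decisions.

def runtime_minutes(battery_wh: float, load_w: float,
                    inverter_eff: float = 0.9) -> float:
    """Approximate run time in minutes at a steady load."""
    return battery_wh * inverter_eff / load_w * 60

load_w = 800       # measured closet load, watts
needed_min = 45    # window for backups plus clean shutdowns

for battery_wh in (600, 1200):   # two hypothetical UPS models
    rt = runtime_minutes(battery_wh, load_w)
    verdict = "sufficient" if rt >= needed_min else "too small"
    print(f"{battery_wh} Wh battery: ~{rt:.0f} min at {load_w} W -> {verdict}")
```

Here the smaller unit delivers roughly 40 minutes, short of the 45-minute shutdown window, so the larger model would be the right-sized choice.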
These estimates also play a pivotal role in determining the right PDUs to install in these areas, Beyhaut adds.
To accommodate tight physical constraints, many IT managers choose zero-U, vertical PDU designs that provide outlets to the full height of a rack. Even models with the largest housings may measure only about 2.5 inches square, a dimension appropriate for small areas. In addition, the overall design doesn’t use up any rack space.
The best UPSs and PDUs for small spaces provide critical information for administrators of distributed IT environments. “They’re your eyes and ears to what’s going on in that particular closet,” Feldhaus says.
For example, a “smart” UPS can test itself to confirm that it will be available in case of an emergency and signal IT managers when a battery is going bad or when the UPS is nearing its overload threshold. Intelligent PDUs offer a built-in network jack and individually controllable outlets, which let IT managers remotely reboot a network switch, for example, if a problem occurs with equipment in a wiring closet.
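As a purely illustrative sketch of the remote-reboot capability described above, the snippet below builds a request to power-cycle one outlet. The device hostname, REST endpoint and payload are invented for illustration; real intelligent PDUs expose vendor-specific SNMP OIDs or web APIs, so the unit's own documentation governs the actual interface.

```python
# Hypothetical sketch: remotely power-cycling one outlet on an
# intelligent PDU. The host, URL path and JSON body are invented;
# real PDUs use vendor-specific SNMP or web interfaces.
import json
import urllib.request

PDU_HOST = "pdu-closet-3.example.com"   # hypothetical device address

def outlet_command(outlet: int, action: str) -> urllib.request.Request:
    """Build a request to set one outlet's state ('on', 'off', 'cycle')."""
    url = f"https://{PDU_HOST}/api/outlets/{outlet}"
    body = json.dumps({"state": action}).encode()
    return urllib.request.Request(url, data=body, method="PUT",
                                  headers={"Content-Type": "application/json"})

# Reboot the switch plugged into outlet 4 by power-cycling it:
req = outlet_command(4, "cycle")
# urllib.request.urlopen(req)  # not executed here; requires a live PDU
print(req.get_method(), req.full_url)
```

The value of this capability is operational: a hung switch in a remote wiring closet can be recovered without dispatching a technician.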
But true digital intelligence requires interoperable components. “If you get a rack from one vendor, a UPS from a different vendor and PDUs from a third vendor, none of this equipment may be able to talk together,” Mutabdzija says. “Look for solutions that all speak the same language. It’s even better when UPSs and PDUs speak the same language as your IT equipment. This will give you a holistic pane of glass to proactively monitor power usage, temperature and humidity, as well as the status of your IT equipment, whether the facilities are in the same building or across the city, country or globe.”