The Tampa Bay Rays play in the American League East, arguably the toughest division in baseball. In competing with the likes of the Boston Red Sox and New York Yankees with a much smaller payroll, the Rays pride themselves on making smart use of technology. That’s why the team revamped its data center two years ago.
Juan Ramirez, senior director of IT, says the Rays had the same challenges that face many organizations: limited power and space. In addition, running their air conditioners to cool the data center became very expensive.
The Rays deployed the SmartRow intelligent, integrated infrastructure from Emerson Network Power. The SmartRow solution includes system controls, sealed racks, cooling, backup fans, and an integrated fire detection and suppression system. The Rays' equipment is also supported by two Liebert NX uninterruptible power supplies (UPS) for power quality and backup power.
“The cooling system allows us to keep the cold air in the racks,” Ramirez says. “It’s also very quiet; you can spend the entire day there and not know that you’re in a data center.” The Liebert gear also enables the team to use data center space more efficiently.
Ramirez says the integrated suppression system was another big plus for the Rays, saving the team the $150,000 expense of retrofitting a fire system. Overall, he estimates that the Emerson power and cooling equipment reduces the Rays’ electric bill by 25 percent annually. “We strive to give the team the computing resources they need to be competitive,” he says. “We just hope that our efforts pay off and the team is back in the playoffs at the end of the season.”
David Cappuccio, a managing vice president for Gartner and chief of research for the infrastructure teams, says more businesses are moving to in-row cooling. “The idea is to bring the solution to the problem,” he says. “Why cool the entire room when you can focus on cooling the equipment where the heat is coming from? We’ve seen organizations reduce their power and cooling costs by as much as 30 percent.”
Until about a year ago, Rock Springs National Bank in Wyoming experienced frequent UPS failures along with problems with its compressors and air conditioners. "We just had too much downtime because of power and cooling issues," says Lance Laughter, the bank's network administrator.
The solution was a full complement of power and cooling equipment from American Power Conversion. RSNB deployed APC InRow cooling, Symmetra UPSs and Switched Rack power distribution units to support its two locations. “We used to do room-style cooling in a server room,” Laughter says. “Now, instead of cooling the entire room, we remove the heat the equipment is generating.”
The new power and cooling system has cut RSNB’s power requirements by 50 percent, according to Laughter. “The previous cooling systems were not as adaptive,” he says. “We used to keep the same temperature set point at night that we would during the day, but the temperatures would tend to go much higher during the day, causing frequent failures.”
As part of the APC solution, RSNB also uses APC’s InfraStruXure management software. If the temperature exceeds a certain threshold, Laughter receives email alerts of the potential failures. “We get a full report of the temperature, humidity, air flow, power, voltage, amperage and the runtime of the batteries,” he says. “It lets us be much more proactive.”
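The threshold-based alerting Laughter describes can be sketched generically. The metric names, thresholds, and reading function below are hypothetical stand-ins for illustration, not APC's actual InfraStruXure API:

```python
# Sketch of threshold-based environmental alerting, similar in spirit
# to the email alerts described above. Metrics and thresholds here are
# hypothetical assumptions, not APC's real interface.

THRESHOLDS = {
    "temperature_f": 80.0,  # alert above 80 degrees Fahrenheit
    "humidity_pct": 60.0,   # alert above 60 percent relative humidity
}

def check_readings(readings):
    """Return alert messages for any reading over its threshold."""
    alerts = []
    for metric, limit in THRESHOLDS.items():
        value = readings.get(metric)
        if value is not None and value > limit:
            alerts.append(f"{metric} at {value} exceeds threshold {limit}")
    return alerts

# Example: a rack sensor reports a hot spot during the day
alerts = check_readings({"temperature_f": 84.5, "humidity_pct": 45.0})
print(alerts)
```

In a production system, each alert would be handed to a notification channel (email, in this case) rather than printed, and thresholds would be tuned per rack.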
Most data centers must be upgraded while they remain in operation. Gartner's David Cappuccio offers three tips for retrofitting an existing data center.
Break floor space into discrete sections. Clear out a small section of floor space, roughly enough for four racks, for an in-row cooling unit. It could be as small as 60 to 120 square feet and reside on an existing section of raised floor or on a slab.
Depending on the vendor selected, the self-contained rack unit will require power from an existing power distribution unit, or in some cases, a refrigerant or cooling distribution unit. Assume an increase in per-rack space of about 20 percent to take into account additional supporting equipment.
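The sizing arithmetic above can be worked through directly: take the cleared footprint and add Cappuccio's roughly 20 percent allowance for supporting equipment. This is a minimal sketch of that calculation, assuming the overhead applies uniformly to the floor area:

```python
# Rough retrofit sizing: base rack footprint plus a percentage
# allowance for supporting equipment (PDUs, refrigerant or cooling
# distribution units), per the ~20 percent rule of thumb above.

def planned_area(base_sq_ft, overhead_pct=20):
    """Base floor area plus an overhead percentage for support gear."""
    return base_sq_ft * (1 + overhead_pct / 100)

# A four-rack section at each end of the 60-120 square foot range:
print(planned_area(60))   # 72.0 sq ft
print(planned_area(120))  # 144.0 sq ft
```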
Reconfigure and defragment the floor. It’s unlikely that the workloads moved to the new enclosure will all come from the same racks, which means that the older section of the room will now be heavily fragmented. Move workloads out of underutilized racks to free up additional floor space for the next self-contained installation.
This reconfiguration will take time and affect servers, storage and networking components and connections. Much of the activity will need to happen in off-hours or on weekends, so it’s critical to integrate this work into the organization’s change control processes.
Reconfigure again. By implementing a phased retrofit, data center managers can attain significant growth within the facility while reducing power and cooling requirements.
Implementing more efficient cooling can boost equipment density and PDU utilization at the rack level. A more efficient cooling delivery system also requires less overall power to support a given IT load, freeing up additional power for future growth.