When your typical customer is an impatient, tech-savvy 18-to-34-year-old male, it pays to have a plan to prevent website downtime.
Los Angeles–based Break Media caters to that demographic, serving up videos and other content via such websites as Break.com and All Left Turns, which is dedicated to auto racing. The company’s eight sites average 110 million unique visitors per month.
Nick Wilson, Break Media’s chief technology officer, knows what it takes to keep visitors coming back again and again. “A site that is slow to load could prove problematic for a website’s client population, in addition to its users,” he says.
And that would create issues for a couple of Break’s client populations: not just the users, but also the advertisers who are trying to reach those users with services or products. As Wilson notes, “24x7 availability is key.”
Load times are measured in seconds, and post-load video launches within tenths of a second, Wilson says. Whether users are commenting on a video or clicking through a gallery of photos, “we want them to have a predictably good experience.”
To that end, Break Media has deployed about 100 HP ProLiant DL360 and DL380 servers. Whenever the company kicks off a new initiative — anything from Break’s recent venture to tie into Facebook and its “like” button to a new online game with the potential to go viral in a matter of hours — it overbuilds, pressing into service more server power than Wilson and his team think will be needed. (A recently launched property had two extra servers assigned to it “in case things went nuts” on launch day, Wilson says.) Once a new feature is live, site-monitoring software helps the IT infrastructure team keep tabs on any performance dips — and, if necessary, reallocate servers if traffic patterns merit it.
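Wilson's overbuild-then-monitor approach can be sketched in a few lines of code. The headroom factor, latency budget, and function names below are illustrative assumptions for this article, not Break Media's actual tooling.

```python
import math
import statistics

# Illustrative thresholds -- real monitoring tools make these configurable.
LATENCY_BUDGET_MS = 1000   # page should load within about one second
HEADROOM_FACTOR = 1.5      # provision ~50% more capacity than forecast

def servers_to_provision(forecast_peak_rps, per_server_rps):
    """Overbuild: apply the headroom factor, then round up to whole servers."""
    return math.ceil(forecast_peak_rps * HEADROOM_FACTOR / per_server_rps)

def needs_more_capacity(recent_latencies_ms):
    """Flag a performance dip when median latency exceeds the budget."""
    return statistics.median(recent_latencies_ms) > LATENCY_BUDGET_MS

# A launch forecast at 300 requests/sec with servers rated at 100 rps
# gets 5 servers, not the 3 a bare forecast would suggest.
print(servers_to_provision(300, 100))  # 5
```

The point of the headroom factor is exactly Wilson's "in case things went nuts" buffer: capacity is cheap relative to the cost of a launch-day outage.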
A one-second delay in website load time increases customer dissatisfaction by 16%.
SOURCE: Aberdeen Group
Wilson says he always tries to anticipate web server needs years down the road. Here are some tips that he and other small-business IT teams offer to maximize website uptime.
1. Don’t nickel and dime on the back end. The real cost isn’t the hardware required to keep a site live, Wilson says, but rather the man-hours and other resources required to bring an offline site back to life.
“Take a look at how much time you waste trying to eke out an existence with not enough hardware,” he advises.
Once sites are live, Steve Ervolino, senior vice president of information services at Dupaco Community Credit Union in Dubuque, Iowa, recommends investing in load-balancing hardware and software to spread the work around.
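At its simplest, load balancing means spreading incoming requests across a pool of servers. A minimal round-robin sketch (the server names are hypothetical) looks like this:

```python
from itertools import cycle

# Hypothetical pool of web servers behind the balancer.
pool = cycle(["web1", "web2", "web3"])

def route(request):
    """Hand each incoming request to the next server in rotation."""
    return next(pool)

assignments = [route(f"req-{i}") for i in range(6)]
print(assignments)  # ['web1', 'web2', 'web3', 'web1', 'web2', 'web3']
```

Dedicated load-balancing products go well beyond blind rotation, weighing server health and current load before routing, but the round-robin idea is the baseline they all build on.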
2. Don’t rush in. When you’re overhauling a website that’s falling short of expectations, “don’t throw up a site quickly just to say you have one,” says Lynn Thomson, director of information systems for the Water Environment Federation, an Alexandria, Va.-based nonprofit organization focused on water quality and treatment issues.
WEF maintains several websites, including some that are informational and others that offer books for purchase. Before a single page can be refurbished, “you have to do a lot of internal work” across all of the organization’s departments to figure out where the current site’s problems lie, Thomson says. It’s the departments, after all, that will provide the content to keep your sites current and compel visitors to return.
3. Take extra precautions with site upgrades. Ervolino recalls with a shudder the time a faulty patch to the credit union’s account database resulted in a couple of hours of unplanned downtime.
Typically, the credit union site’s uptime hovers in the 99.7 percent range annually — roughly equivalent to 26 hours per year of downtime spread among the credit union’s three sites, two of which are transactional.
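An uptime percentage translates directly into hours of downtime, which makes figures like 99.7 percent easier to reason about:

```python
HOURS_PER_YEAR = 365 * 24  # 8,760

def annual_downtime_hours(uptime_percent):
    """Hours of downtime implied by an annual uptime percentage."""
    return HOURS_PER_YEAR * (1 - uptime_percent / 100)

# 99.7% uptime leaves roughly a day's worth of downtime per year.
print(round(annual_downtime_hours(99.7), 1))  # 26.3
```

Run the same math on 99.9 percent ("three nines") and the budget shrinks to under 9 hours a year, which is why each additional nine gets dramatically more expensive to deliver.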
How far in advance does your company plan for server capacity to support its website?
39% We don’t host our own website
18% Not at all
14% 7 to 12 months
14% More than a year
12% 1 to 6 months
3% Don’t know
SOURCE: CDW poll of 395 BizTech readers
“The only time we seem to experience downtime, knock on wood, is when we do [software] updates,” he says. Although “there is no way around that in our environment,” he continues, updates usually require only minutes to perform and can be scheduled when site traffic is light.
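Picking a light-traffic window for updates can be as simple as finding the quietest hour in recent access logs. The request counts below are made up for illustration:

```python
# Hypothetical requests-per-hour tallied from a day of access logs.
hourly_requests = {2: 40, 3: 35, 9: 400, 14: 520, 20: 610}

# The hour with the fewest requests is the best maintenance window.
quiet_hour = min(hourly_requests, key=hourly_requests.get)
print(quiet_hour)  # 3 -- the 3 a.m. slot sees the least traffic
```

A few minutes of planned downtime at 3 a.m. is a very different proposition from the same outage at the midday peak.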
4. Continuity of operations is a necessity. “It must be part of your operations,” Ervolino says. “We look at every critical system” — including those supporting online financial transactions — “and say, ‘If we can’t duplicate this in our disaster recovery center, we will not consider it.’”
Church Coffee’s site was offline for two weeks when the vendor providing its shopping cart functionality pushed out a faulty upgrade to clients. That kind of downtime can hurt a company’s image, says Brett Bixler, principal of the Baltimore-based coffee roaster and wholesaler, which supplies beans, cups and other coffeehouse accessories to churches and other facilities that want to offer coffee to members and visitors.
“I don’t know how you put a price on public perception and the relationships you’ve built,” Bixler says. No existing customers ceased ordering from Church Coffee, he says, but “there’s no way of knowing how many prospects we lost.”
5. Bank on users wanting new functionality. As more customers come to rely on your sites for more and different kinds of content or transactions, you need to stay several steps ahead of them.
“In 2000, most people were using [our sites] to balance their checkbooks and do checking-to-savings transfers,” Ervolino says. “But nowadays, people perform personal financial management tasks like bill-paying, initiating stop payments on checks” and more. Nearly half of the credit union’s customers signed up to manage their money online, and that number is expected to grow, he adds.