May 29, 2014

International Speedway Corp. Revs Up Its Fan Experience

Racing giant crosses the finish line with a more powerful data center, faster network, unified communications — and a better fan experience.

It’s not just the racers who have a need for speed at the tracks owned and run by International Speedway Corp.

It’s the fans too.

“We have to compete with the living room experience and improve Internet connectivity, provide a better video experience on property, and leverage social media platforms to engage fans and speak to them in a meaningful and relevant way,” says Brandon McNulty, ISC’s chief technology officer.

In all, the company owns 13 motorsports facilities, including its flagship track in Daytona Beach, Fla. ISC facilities host some of the best high-performance cars that American racing fans will likely ever see. Now, the company has rolled out an equally turbocharged IT infrastructure to increase operational efficiencies and productivity and boost the fan experience.

The company began by beefing up its network with new Cisco Systems networking gear and then modernized its data center with a converged infrastructure that unifies blade servers, storage, virtualization and networking equipment. As the third phase of the revamp, the IT department replaced aging telephony with a new Cisco Unified Communications system that provides expanded voice and video capabilities companywide.

Learn more about ISC's IT revamp in this CDW video case study.

The result? A consolidated, energy-efficient, cost-effective systems environment that’s easier for employees to use and for the ISC tech team to manage, McNulty says.

“We commoditized our IT investments, and that frees up the budget for us to concentrate on enhancing the fan experience,” he explains. “We want to entice fans to come out and experience the event. We want to create an emotional connection and make them want to come back to buy another ticket next year.”

The new infrastructure not only enhances business operations and improves disaster recovery, but it also reduces the amount of time the IT staff spends maintaining equipment, resulting in big cost savings, McNulty says. That, in turn, lets the IT department focus on new technology projects that help generate revenue, such as attracting more fans to racing events.

Each year, millions of fans attend ISC’s events — predominantly auto races with some motorcycle racing mixed in. But ISC leaders realize that they need to add amenities to keep their current fan base satisfied, as well as to attract new fans.

To that end, company leaders made a strategic decision to pump funds back into the business and rethink technology services. Because driving up attendance is a chief goal, ISC is investing in more fan-friendly technology, such as wireless Internet connectivity and flat-panel video displays, throughout its venues.

75 minutes

The time it takes ISC to restore critical business applications using its new FlexPod architecture

Turbocharging Network and Data Centers

When it began its network and data center upgrades in 2012, ISC saw the opportunity to create a future-forward technology foundation. The two projects provided an opportunity to get ahead of the technology curve and create a platform that was stable yet scalable for future growth, McNulty says.

As it began virtualizing servers, the IT team realized that much of ISC’s server hardware wasn’t meeting increasing processing demands, recalls Jerry Ballenger, senior director of technology engineering and services.

The company turned to CDW for advice but gained a lot more. First, ISC’s IT staff met with CDW solution architects at CDW headquarters in Vernon Hills, Ill., to discuss their options. Those architects, along with CDW engineers, then helped design and install the solutions that give ISC facilities the technological horsepower they have today.

“CDW gathered our requirements, identified our needs, and educated us on current technologies and what’s going to happen in the future,” Ballenger says. “They helped us think years down the road.”

After evaluating the options for retooling the infrastructure, ISC standardized on FlexPod — an integrated technology architecture that weds Cisco UCS B-Series blade servers and Nexus 5000 Series switches, NetApp networked storage and VMware virtualization software into what are essentially data center building blocks.

Following the initial FlexPod installation, ISC engineers migrated applications from the company’s old servers to the new environment — a process that took about three months, says David Luke, ISC’s director of IT engineering. Today, 300 virtual machines run on 10 blade servers. In all, ISC has virtualized about 75 percent of its servers, Luke says.

Because the converged FlexPod architecture works as an integrated unit, it’s easier to manage, Luke continues. If problems arise, support staff from Cisco, NetApp and VMware jump on a conference call with ISC engineers. “In the past, vendors might get into a finger-pointing game. But with FlexPod, they are on the phone at the same time, and we work through the issues together,” he explains.

Another benefit of the architecture is that the Nexus switches handle both Ethernet traffic and Fibre Channel storage connectivity. Networking is simplified because the number of necessary switches, network adapters and cabling is reduced, says Dave Persuhn, CDW’s principal network engineer, who helped install the equipment.

The technology has dramatically increased staff productivity, letting developers spin up servers in minutes instead of days. It has also let the racing giant establish a more reliable disaster recovery program.

The IT team, which previously used tape for backup, now backs up to disk twice a day, which ensures a quicker restore in case of problems. “We cut recovery time to a fraction,” Ballenger notes, adding that what once took days and weeks now can be achieved in four to 12 hours. “It’s incredible,” he says.

The company’s main data center in Daytona Beach features a two-node, high-availability NetApp FAS3270 cluster with 200 terabytes of storage. ISC replicates its data to a NetApp FAS3250 with 120TB of storage at a secondary data center in Concord, N.C.

Floor IT: ISC's New Network Fuels Productivity

While upgrading the data center, ISC increased Internet throughput from 3 megabits per second to 100Mbps at each of its facilities and bolstered the wide area network with Cisco 3945 and 2911 Integrated Services Routers. Each facility has two routers — one connected to the company’s main Internet service provider and the other to a second provider for redundancy, Luke says.

CDW Principal Consulting Engineer Faruk Azam used network monitoring tools to assess network performance and evaluate the business continuity and scalability of the network design. To boost WAN redundancy, Azam and his colleague, Cisco Network Engineer Marcus Auman, fine-tuned ISC’s Border Gateway Protocol (BGP) implementation, which allows traffic to fail over from the primary router to the secondary router.

“BGP is like an intelligent traffic cop that knows all of the lanes that are available on the Internet,” Azam explains. “It looks at network availability and can detect a failover situation and then switch from Path A to Path B.”
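
As a concrete illustration of that failover behavior, the short Python sketch below models it in simplified form. It is not ISC's router configuration; the path names and local-preference values are invented, and real BGP best-path selection weighs more attributes than this. The rule shown here is simply that traffic follows the highest-preference path that is still being advertised and shifts to the backup when the primary is withdrawn.

from dataclasses import dataclass

@dataclass
class Path:
    name: str          # hypothetical labels, e.g. "ISP-A (primary)"
    local_pref: int    # higher local preference wins, as in BGP best-path selection
    available: bool    # whether the neighbor is still advertising this route

def best_path(paths):
    """Return the available path with the highest local preference, or None."""
    usable = [p for p in paths if p.available]
    return max(usable, key=lambda p: p.local_pref, default=None)

paths = [
    Path("ISP-A (primary)", local_pref=200, available=True),
    Path("ISP-B (secondary)", local_pref=100, available=True),
]

print(best_path(paths).name)   # ISP-A (primary): the primary link carries traffic

# The primary provider stops advertising its routes; traffic fails over.
paths[0].available = False
print(best_path(paths).name)   # ISP-B (secondary): the backup link takes over

In ISC's setup, the routers make this decision on their own as BGP detects a failure, so the switch from Path A to Path B happens without manual intervention.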

For its LAN service, the IT team had previously upgraded the network core with 10-Gigabit Cisco switches. But in the past year, it retooled the devices at the network edge, installing Cisco Catalyst 3850 Series switches at the distribution layer and Cisco Catalyst 2960 and 2960-S Series switches in local wiring closets.

With the network humming, the IT staff then moved on to a much-anticipated Voice over IP migration. Before VoIP, the company had multiple private branch exchange (PBX) systems, which made daily management tricky and inefficient. What’s more, some of these PBX systems were more than 20 years old and starting to fail frequently, Luke says.

So last year, the IT department installed Cisco UC on a virtual server at its main data center. For backup, the IT staff deployed a secondary system at its Kansas City speedway. As a result, says McNulty, “we’re seeing tremendous cost savings and stability.”

Today, IT staff can manage and troubleshoot the Cisco VoIP system centrally from headquarters. ISC is also saving on long-distance charges because office calls are now routed over IP on its network. For employees, the new phone system provides added communications capabilities.

“It suits the needs of our mobile workers,” McNulty says. Everyone will be able to hold video conferences using the Cisco Unified IP phones on their desks and to check voicemail on their smartphones through a visual voicemail feature or their email client.

Letting Fans Take the Steering Wheel

The vast improvements in data processing and network performance resulting from the infrastructure upgrades are already translating into improved customer service for fans, Ballenger says.

Corporate applications run faster, and ISC websites deliver content more quickly. “It’s a much better experience as customers click on our websites to buy something,” he says. “If they are buying over the phone or at a ticket window, the customer service experience is faster. Employees no longer have to say, ‘Hold on; the system is updating.’ ” The speedier IT infrastructure has also expedited fan entry at the speedways.

ISC and its IT staff don’t plan to apply the brakes just yet. Instead, they are focused on advancing the fan experience even further. For starters, engineers have embarked on a Wi-Fi proof-of-concept project and intend to roll out wireless connectivity at Daytona and other venues.

“We wanted to get more auto­mation, flexibility and scalability in our data center, and we’ve done that,” Ballenger says. “Now, our engineers actually have more time to deliver on our business objectives and improve the fan experience.”

Tech Rolls Into Pit Row

NASCAR is a performance sport. To get the most out of their race cars, drivers rely on a lot of technology, says Dr. Eric Warren, director of competition for Richard Childress Racing (RCR).

RCR, owned by former NASCAR driver Richard Childress, fields seven teams, including those of drivers Austin Dillon and Ryan Newman.

On race day, team members specializing in analytics watch the race in the pit area and at headquarters in Welcome, N.C., where they use real-time analytics to help each driver and crew chief make strategic decisions on the fly.

These behind-the-scenes tech crews analyze race footage, audio streams of other racers, historical track data, real-time car performance data and even weather reports in hopes of giving drivers a competitive edge, says Warren, who sits in the pit area to advise the crew chief.

The RCR race team’s 18-wheelers are equipped with a satellite Internet connection, wireless network equipment, servers, notebook computers and tablets. During a race, Warren uses a tablet to access the data and communicates with his staff via Voice over IP and instant messaging.

“There’s a lot of real-time data to analyze,” he says. “You can do some predictions, look at different scenarios and alter strategy” — all with the aim of helping the driver be the first to cross the finish line.
