The success the Arizona Cardinals football team enjoyed in 2008 came as a result of attention to detail and team discipline. The same attributes were critical to building the IP-converged voice and data network for its new stadium, which opened only two years earlier.
According to Mark Feller, vice president of technology for the Cardinals, the team was determined to ensure that the stadium’s IT infrastructure would support myriad revenue-generating uses.
“We were looking to be able to provide cellular access to anybody using the facility, as well as accommodate any Wi-Fi needs we would have inside the stadium,” said Feller. “We wanted to make sure we could manage it and use it for any potential revenue purposes that we wanted.”
The biggest and most obvious requirement was to enable the approximately 64,000 fans who jam into The University of Phoenix Stadium every Sunday to use their mobile devices for any purpose.
But the stadium also hosts more than 280 events per year, from trade shows and conferences to concerts. The football field itself sits on a set of rails and is actually rolled out of the stadium when it’s not in use; the cavity below the field then becomes the trade show or concert floor, 40 feet below ground level.
This underground facility forced Feller and his team to get creative. “Being 40 feet underground, we realized cellular coverage would be very poor,” said Feller.
To maximize that potential, the team had to ensure that the network could support signals from every wireless voice carrier in the nation, as well as Wi-Fi and public safety radio, in just about every area of the facility.
Feller said he wasn’t free to discuss specific opportunities for generating revenue, but said that the Cardinals organization has looked into a variety of ways it can earn income from the network itself.
“ROI was a consideration,” he said. “We paid for the installation, and we wanted to make sure we had a way to get a return on that investment.”
The team has also discussed possible uses of the Wi-Fi and cellular network “for concessions or merchandise, or things we can do with partners.”
Feller decided to implement a DAS (distributed antenna system) that would carry all cellular, Wi-Fi and RF (radio frequency) coverage over a single, shared antenna infrastructure. The DAS also allows Feller’s team to monitor the health of the network, boost signals and resolve frequency conflicts as needed.
The Cardinals employed CSI (Cellular Specialties Inc.), a VAR, to install the DAS, which provides coverage “in 98 percent of the building,” said Feller.
Using the DAS, each carrier sends its signal to a dedicated base station located a mile from the stadium. The signals are then converted to light, combined onto a single fiber cable, transmitted to a main equipment room at the stadium, and then disaggregated and converted back to the RF frequencies used by each carrier.
There is a separate interface at the front end of the DAS for each carrier and an amplifier for each, controlled by the team. Each amplifier has a filter that delineates frequencies very sharply to maintain RF purity, reducing overlaps that cause harmonic conflicts.
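The sharp per-carrier filtering described above amounts to keeping each carrier's frequency band from bleeding into its neighbors'. The sketch below illustrates the idea with a simple overlap check; the carrier names and band allocations are hypothetical, not the stadium's actual spectrum plan.

```python
# Illustrative sketch only: detecting overlapping carrier frequency bands,
# the kind of conflict the DAS filters are meant to prevent. All band
# values below are hypothetical examples, not real allocations.

def find_overlaps(bands):
    """Return pairs of carrier bands whose frequency ranges overlap.

    bands: dict mapping carrier name -> (low_MHz, high_MHz)
    """
    # Sort carriers by the low edge of their band, then compare neighbors.
    names = sorted(bands, key=lambda n: bands[n][0])
    overlaps = []
    for a, b in zip(names, names[1:]):
        # Adjacent (sorted) bands conflict if the next one starts
        # before the previous one ends.
        if bands[b][0] < bands[a][1]:
            overlaps.append((a, b))
    return overlaps

# Hypothetical example allocations in MHz
bands = {
    "Carrier A": (850.0, 894.0),
    "Carrier B": (890.0, 915.0),   # overlaps Carrier A's top end
    "Carrier C": (1930.0, 1990.0),
}
print(find_overlaps(bands))  # [('Carrier A', 'Carrier B')]
```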
The DAS is able to monitor the signals coming in and out, and sends an alarm to a management console in the event of either a malfunction within a signal source feeding the system or fiber damage.
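The monitoring behavior described above can be pictured as a periodic health check: compare each carrier's input level and the fiber path's optical loss against thresholds, and raise an alarm on the console when either is out of range. This is a minimal sketch under assumed thresholds and readings, not the actual management software.

```python
# Illustrative sketch only: the kind of check a DAS management console
# might run. Thresholds and readings below are assumed for the example.

FIBER_LOSS_ALARM_DB = 3.0   # alarm if optical loss exceeds this (assumed)
MIN_SIGNAL_DBM = -90.0      # alarm if a carrier feed falls below this (assumed)

def check_das(carrier_levels_dbm, fiber_loss_db):
    """Return a list of alarm strings for the management console."""
    alarms = []
    # A weak input suggests a malfunction in that carrier's signal source.
    for carrier, level in carrier_levels_dbm.items():
        if level < MIN_SIGNAL_DBM:
            alarms.append(f"ALARM: weak input from {carrier} ({level} dBm)")
    # Excess optical loss on the fiber run suggests fiber damage.
    if fiber_loss_db > FIBER_LOSS_ALARM_DB:
        alarms.append(f"ALARM: possible fiber damage (loss {fiber_loss_db} dB)")
    return alarms

readings = {"Carrier A": -62.5, "Carrier B": -95.1}  # hypothetical readings
for alarm in check_das(readings, fiber_loss_db=4.2):
    print(alarm)
```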
Keeping the base stations away from the stadium means the Cardinals don’t have to deal with technicians from the various carriers, each trying to optimize service for its own company.
“We didn’t want them to have their own antenna systems,” said Feller, because the result would have been chaos behind the scenes. “Every carrier would have had their own preference for antenna type and repeater locations.”
The DAS also allows the Cardinals to limit the visible infrastructure to a single antenna on the stadium roof. “We didn’t want the place looking like it was crawling with porcupines,” said Brian York of Insight Enterprises, which designed and delivered the overall IT solution for the Cardinals in partnership with Cisco Systems.
In addition to providing reliable service, the network had to be flexible and simple enough for a small IT staff to reconfigure on short notice. The turnaround time between football games and other events could be as short as two days.
Another advantage of the DAS: neither Insight nor the Cardinals wanted to perform maintenance in hard-to-reach places. With multiple antennas and multiple hardware locations, “you could mount access in very inconvenient locations,” said York.
The team also installed a 10G Cisco Ethernet ring between the stadium in Glendale and the team’s practice facility to provide full redundancy.
Technicians used an air-blown fiber system from Sumitomo to implement the fiber network in the new stadium. With air-blown fiber, compressed air propels fiber cable through a network of empty tubes pre-installed throughout the building.
This system allows contractors to install fiber as needed, without having to pull access panels and physically tug the cable through the building, reducing initial implementation time from one week to a couple of hours.
“We wouldn’t have been able to do the DAS if we had to do another cable pull because it would have competed with the construction schedule,” said York.
The IP network is embedded in the stadium floor, under the field, with manholes every 20 feet for ease of access. Each manhole cover has four pads, with plugs for high and low voltage as well as the Ethernet cabling.
This layout helps reduce the time and expense for setup and teardown for each event, York said.
Indeed, Insight maintains only two people on-site for each event. “We can rapidly reconfigure the space, and this allows the facility to host many more events than a traditional facility and generate more revenue without incurring significant cost,” said York. “Combined with the wireless and other core network components, we can mix and match and meet any [technology] requirement in the building.”
The IP network includes:
–An IP telephony system composed of 800 Cisco IP-based phones, including touch-screen models for luxury boxes, as well as point-of-sale terminals and video conferencing for team executives
–More than 100 Cisco wireless access points supporting 802.11 data coverage throughout the facility
–An APC modular rack in the main data facility that provides 30 minutes of runtime, enough to cut critical systems over to reserve power generators in the event of an outage, along with an APC NetworkAir FM50 cooling unit
Feller and his team had a firm completion deadline: The stadium had to be fully functional for an exhibition game against the Pittsburgh Steelers on Aug. 12, 2006. However, they could start work on their part of the project only in December 2005.
Moreover, Feller discovered that aluminum cladding had been added to certain sections of the structure, which meant that the telephony closet had to be moved to another location because the metal interferes with RF signals.
“Engineers had to modify the placements of antenna and repeater equipment to make it more dense in areas where they had the aluminum sidewalls,” said Feller.
York noted that the decision to use a DAS was made relatively late in the game, and if the organization hadn’t used an air-blown fiber system, it might not have been able to lay the cable in time.
Almost three years after the fact, Feller is sure he made the right call: “People are used to being able to communicate wherever they go. It’s a utility now. If I were going to build any kind of building, I’d use an antenna system like ours.”