NYSE’s Data Fortress Powering the Financial Cloud
By Rich Miller, Data Center Knowledge
30 June 2011
The long main hallway of the NYSE Euronext data center provides a sense of the immense scale of the 400,000 square foot facility in New Jersey. (Photo: Rich Miller)
There are few data centers in the world that are more difficult to get into than the new NYSE Euronext facility in Mahwah, New Jersey. The huge building features extremely strong access control, a substantial security perimeter, and is built to survive … well, just about anything.
“It is more robust than your typical structure,” said Steve Rubinow, Chief Information Officer at NYSE Euronext, who said the data center “can withstand levels of punishment – both man-made and natural – that other facilities might not withstand.”
The 400,000 square foot data center has been engineered for security, speed and reliability to support its mission as the nerve center for the NYSE’s electronic trading operations. Starting tomorrow it will add a new role: powering the financial industry’s first cloud computing platform, which will let traders lease computing power at the new data center.
Technology Drives Trading Transition
The Mahwah data center, along with a similar facility outside London, provides the horsepower driving a major transition for NYSE Euronext. As the financial markets are being transformed by technology and global trading networks, data centers have become the industry’s new trading floors. NYSE Technologies, the IT arm of NYSE Euronext, set out to build a facility with extraordinary resiliency.
“Every time we conduct a tour, even the hardened IT guys who have seen a lot of data centers universally come away impressed,” said Rubinow.
NYSE Technologies is proud of its facility, but cautious about revealing too many details of its security and reliability features. That includes the power feeds from utility providers, which provide particularly robust support. “On utility related matters, we benefit from features that are rarely found at any site currently operating,” the NYSE said. The utility feeds are backed up by banks of generators and a large supply of on-site diesel fuel.
High Redundancy in Power, Cooling Systems
All of the power and mechanical systems in Mahwah are engineered to at least 2N reliability, with backup equipment available for each piece of critical infrastructure. The mechanical and electrical infrastructure occupies nearly three quarters of the building, leaving 100,000 square feet of space for server rooms. Each data center pod is approximately 20,000 square feet and has dedicated power and cooling systems, so a failure in one pod won’t affect operations in the other data halls.
NYSE Technologies has completed three pods thus far, creating 60,000 square feet of raised floor space to house servers and storage for the exchange, its colocation clients and its new cloud service, the Capital Markets Community Platform. The Mahwah facility has space for two more pods, which will be completed as the existing space nears capacity.
“We made sure the power and cooling systems for each pod were isolated,” said Rubinow. “When we talk about having independent pods, you have to figure out the network and how we’ll keep working in the event of a single pod failure.”
The Need for Speed
NYSE Technologies’ cutting-edge network is a key selling point for its colocation and cloud computing customers. In seeking to create one of the world’s fastest low-latency networks, NYSE Technologies worked closely with networking vendors, including Juniper Networks. By collapsing the multiple switching layers present in traditional network architectures, the NYSE Euronext network design requires fewer devices and interconnections, leading to improved efficiencies in space, power, cooling and management.
“Performance is key,” Rubinow said. “We wanted to know exactly how long it took to get (data) through their equipment. The other issue is how quickly it can scale. We have to scale it in a flexible fashion.”
For the NYSE, scaling isn’t simply a matter of how many customers it hosts. The network must also be able to handle surges in trading volume on its exchanges, a mission-critical bursting challenge.
A “Science” to Scalability
“There’s a science to that, and also an art,” said Rubinow. “We can scale for a new customer, and that’s predictable and pretty easy to manage. The harder part is when our large customers increase their trading activity, because they can begin to fill our pipes very quickly. If there’s an unusual event in the world, their models generate lots and lots of traffic. The key question is how much buffer you put in (for traffic spikes).”
There are also network design challenges related to the NYSE’s proximity hosting services for customers who prize high-speed connectivity. A key goal is ensuring that the colocation environment maintains a level playing field.
“It’s very important that our customers are equal citizens,” said Rubinow. “We design the network so that it provides all of our customers the same access. On the things that make a difference to traders, we make sure the paths are operationally the same.”
Site Selection and Expansion
After years of using leased space in third-party facilities, NYSE Euronext didn’t set out to build a greenfield data center.
“When we started the project, we were looking for a space where we could continue to grow and we could consolidate servers,” said Rubinow. “We looked all up and down the eastern seaboard for existing facilities. After looking for a long time, we didn’t find a physical facility that met our needs. Our requirements eliminate a lot of existing data centers and geographies pretty quickly. If you believe that the attributes of the facility add value, the best way to do it is to control your fate.”
Like many data center operators, NYSE Euronext carefully weighed how much capacity it needed to build. That meant estimating the growth of the market for low-latency colocation services and the amount of space that might be needed to support the exchange’s cloud computing ambitions.
“The data center decision is typically a 15 to 20 year decision,” said Rubinow. “You don’t want to build too much. You might have the kind of growth that (forces expansion), but that’s a high-quality problem. We plan on a 15 to 20 year life for the data centers. In both locations, we have the ability to build a little more space if we need it.”