December 12, 2007 08:00 ET

BlueArc Storage Solution Helps Brookhaven National Lab Chart New Frontiers With Proven Technology

Lab Expects Massive Supercollider Data Will Be No Match for BlueArc Titan Server

SAN JOSE, CA--(Marketwire - December 12, 2007) - BlueArc® Corporation, a leader in scalable, high-performance unified network storage, today announced that Brookhaven National Lab (Brookhaven), a multi-program laboratory operated for the U.S. Department of Energy, has deployed a BlueArc Titan 2200 cluster with nearly 300 terabytes of storage. The BlueArc Titan system serves as a massively scalable and reliable foundation for the fastest available access to data resulting from research today -- and in the future.

"We can't afford to experiment when it comes to storage infrastructure," said Robert Petkus, RHIC/USATLAS Computing Facility, Brookhaven National Laboratory. "As Brookhaven prepares to support some of the world's most important particle physics research next year, we've replaced cutting-edge but inadequate systems with BlueArc Titan 2200 servers that can scale effortlessly and respond consistently to shifts in volume and demand."

Approximately 3,000 scientists, engineers, technicians and support staff, along with 4,000 or more guest researchers each year, depend on data from the Relativistic Heavy Ion Collider (RHIC) computing facility that Brookhaven operates at its U.S. site. Brookhaven also plays a major role in international projects such as the ambitious Large Hadron Collider (LHC) under construction at CERN, the European Organization for Nuclear Research and the world's premier particle physics research lab. Data from RHIC experiments is proliferating at an astounding rate, and Petkus anticipates that by 2012 Brookhaven will have more than 4,000 nodes on its storage area network. With so many users and so many means of accessing data, Petkus sought a unified storage environment and a single vendor to help him retain control over the implementation.

BlueArc offers precisely the combination of record-setting performance and reliability essential to delivering data that tracks the rapid changes of subatomic matter. Petkus and his team have deployed a two-node BlueArc Titan 2200 cluster with six gigabit connections trunked together and 288 terabytes of Fibre Channel disk capacity. Petkus sees a two-fold advantage in the BlueArc Titan solution's distinctive hardware-based architecture: it supports multiple access protocols without requiring modification to Brookhaven's 2,000-node server farm, maximizing the value of the laboratory's technology investments while supporting growth.

"My job is to think ahead as far as I possibly can," said Petkus. "Every node in our storage network is becoming a supercomputer with massive memory and 64-bit architecture. We support huge networks, huge amounts of data and demanding physicists around the world, so I've always got to know what the latest high-performance technologies are and make choices that won't risk our data to unproven systems."

About BlueArc

BlueArc is a leading provider of high-performance unified network storage systems to enterprise markets, as well as data-intensive markets such as electronic discovery, entertainment, federal government, higher education, Internet services, oil and gas, and life sciences. Our products support both network attached storage, or NAS, and storage area network, or SAN, services on a converged network storage platform.

We enable companies to expand the ways they explore, discover, research, create, process and innovate in data-intensive environments. Our products replace complex and performance-limited products with high-performance, scalable and easy-to-use systems capable of handling the most data-intensive applications and environments. Further, we believe that our energy-efficient design and our products' ability to consolidate legacy storage infrastructures dramatically increase storage utilization rates and reduce our customers' total cost of ownership. Information about BlueArc solutions and services can be found at

Contact Information