TY - JOUR
AU - Bockelman, Brian
AB - Hadoop is an open-source data processing framework that includes a scalable, fault-tolerant distributed file system, HDFS. Although HDFS was designed to work in conjunction with Hadoop's job scheduler, we have re-purposed it to serve as a grid storage element by adding GridFTP and SRM servers. We have tested the system thoroughly in order to understand its scalability and fault tolerance. The turn-on of the Large Hadron Collider (LHC) in 2009 poses a significant data management and storage challenge; we have been working to introduce HDFS as a solution for data storage for one LHC experiment, the Compact Muon Solenoid (CMS).
TI - Using Hadoop as a grid storage element
JF - Journal of Physics Conference Series
DO - 10.1088/1742-6596/180/1/012047
DA - 2009-07-01
UR - https://www.deepdyve.com/lp/iop-publishing/using-hadoop-as-a-grid-storage-element-dkhX63AgLx
SP - 012047
VL - 180
IS - 1
DP - DeepDyve
ER -