The following article is Open access

Implementing data placement strategies for the CMS experiment based on a popularity model


Published under licence by IOP Publishing Ltd
Citation: F H Barreiro Megino et al 2012 J. Phys.: Conf. Ser. 396 032047. DOI: 10.1088/1742-6596/396/3/032047


Abstract

During the first two years of data taking, the CMS experiment has collected over 20 petabytes of data and processed and analyzed it on the distributed, multi-tiered computing infrastructure of the Worldwide LHC Computing Grid. Given the increasing data volume that has to be stored and efficiently analyzed, it is a challenge for several LHC experiments to optimize and automate their data placement strategies in order to fully profit from the available network and storage resources and to facilitate daily computing operations. Building on previous experience acquired by ATLAS, we have developed the CMS Popularity Service, which tracks file accesses and user activity on the grid and will serve as the foundation for the evolution of CMS data placement. A fully automated, popularity-based site-cleaning agent has been deployed to scan Tier-2 sites that are reaching their space quota and to suggest obsolete, unused data that can be safely deleted without disrupting analysis activity. Future work will demonstrate dynamic data placement functionality based on this popularity service and integrate it into the data and workload management systems: as a consequence, the pre-placement of data will be minimized and additional replication of hot datasets will be requested automatically. This paper gives an insight into the development, validation and production process and analyzes how the framework has influenced resource optimization and daily operations in CMS.
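The site-cleaning logic described above can be illustrated with a minimal sketch. Everything here is hypothetical (the `Replica` record, the access-count field, the 90% fill target): the actual agent works against the CMS popularity and data-management services, but the core idea is the same — when a site exceeds its quota, rank unaccessed replicas and suggest the least popular ones for deletion until usage drops back under a threshold.

```python
from dataclasses import dataclass


@dataclass
class Replica:
    """Hypothetical record for one dataset replica at a Tier-2 site."""
    dataset: str
    size_tb: float
    recent_accesses: int  # accesses in a recent window (e.g. 3 months)


def suggest_deletions(replicas, used_tb, quota_tb, target_fill=0.9):
    """Suggest unused replicas to delete until site usage falls below
    target_fill * quota_tb. Replicas with recent accesses are never
    suggested, so ongoing analysis activity is not disrupted."""
    if used_tb <= target_fill * quota_tb:
        return []  # site is under the threshold; nothing to clean
    to_free = used_tb - target_fill * quota_tb
    # Candidates: only replicas with no recent accesses, largest first
    # so the fewest deletions free the required space.
    candidates = sorted(
        (r for r in replicas if r.recent_accesses == 0),
        key=lambda r: -r.size_tb,
    )
    suggested, freed = [], 0.0
    for r in candidates:
        if freed >= to_free:
            break
        suggested.append(r.dataset)
        freed += r.size_tb
    return suggested
```

For example, a site with a 100 TB quota holding 98 TB would need to free 8 TB to reach the 90% target; the sketch would pick the largest unaccessed replicas until that much space is recovered, skipping anything still being read by users.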

