High energy physics, and large-scale research in general, has both common and unusual computing requirements. Software must be distributed across a wide range of heterogeneous resources, with a single experiment able to continuously utilise tens of thousands of globally distributed machines. Exploitation of data continues for decades after it is first taken, making reproducibility and stability essential. The use of Nix has been tested within LHCb, one of the four large experiments at the Large Hadron Collider (LHC). In this talk we will discuss the conclusions of this testing and how Nix is suited to the needs of the "big science" community, as well as presenting some of the challenges found while testing Nix.

---

Bio: PhD student in high energy physics at the University of Manchester, UK, and a member of the LHCb Collaboration studying the decays of charm quarks. Interested in utilising tools from outside the high energy physics community to make the use of computing more efficient.