Quarks und Bytes
- Volker Lindenstruth (Frankfurt Institute for Advanced Studies - FIAS)
Abstract
At the Large Hadron Collider (LHC) at CERN in Geneva, experiments of very large scale are conducted. In these experiments, each nucleus-nucleus collision can generate thousands of new elementary particles. Identifying those particles is paramount for understanding the collision dynamics and the underlying physics. The particles are measured with a variety of detectors in order to determine their various properties. In addition, they traverse a large magnetic field so that their momenta can be determined from the curvature of their trajectories. Typically, these detectors measure space points, which have to be associated with the particle trajectories; this association creates a combinatorial background that makes the computing cost prohibitively expensive. Novel algorithms based on cellular automata and Kalman filters have been developed in order to minimize this computational overhead. These algorithms exhibit a linear dependence of the tracking time on the number of space points. Furthermore, the Kalman filter was optimized to be equally fast. To date, these algorithms serve as the baseline for many new detector developments. The ALICE experiment at the LHC has undergone a significant upgrade and will resume beam operation in a few weeks. The data rates exceed 600 GB/s, and all data of the experiment have to be processed online. An appropriate compute farm with 16,000 CPU cores and 2,000 GPUs has been deployed, which is capable of handling this task. The presentation will outline the requirements, the algorithms used, and the particular solution as an example for this research field.
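
The Kalman-filter fit mentioned above can be illustrated with a minimal sketch: a track state (position and slope in one projection) is alternately propagated to the next detector layer and updated with the space point measured there. The code below is not the ALICE implementation; the straight-line track model, the layer positions, the measurement resolution, and the test hits are all invented for illustration, and effects such as magnetic-field curvature and multiple scattering are deliberately omitted.

#include <array>
#include <cstdio>

// Track state in one projection: y position and slope dy/dx,
// with its 2x2 covariance matrix C.
struct KalmanState {
    std::array<double, 2> x{};
    std::array<std::array<double, 2>, 2> C{};
};

// Prediction step: propagate the state by dx along the beam axis with a
// straight-line model, x' = F x, C' = F C F^T, F = [[1, dx], [0, 1]].
void predict(KalmanState& s, double dx) {
    s.x[0] += dx * s.x[1];
    double c00 = s.C[0][0] + dx * (s.C[0][1] + s.C[1][0]) + dx * dx * s.C[1][1];
    double c01 = s.C[0][1] + dx * s.C[1][1];
    double c10 = s.C[1][0] + dx * s.C[1][1];
    s.C[0][0] = c00; s.C[0][1] = c01; s.C[1][0] = c10; // C[1][1] unchanged
}

// Update step: fold in a measured space point y with variance sigma2,
// using the measurement model H = [1, 0].
void update(KalmanState& s, double y, double sigma2) {
    double r  = y - s.x[0];          // residual
    double S  = s.C[0][0] + sigma2;  // residual covariance
    double K0 = s.C[0][0] / S;       // Kalman gain
    double K1 = s.C[1][0] / S;
    s.x[0] += K0 * r;
    s.x[1] += K1 * r;
    double c00 = (1 - K0) * s.C[0][0];
    double c01 = (1 - K0) * s.C[0][1];
    double c10 = s.C[1][0] - K1 * s.C[0][0];
    double c11 = s.C[1][1] - K1 * s.C[0][1];
    s.C[0][0] = c00; s.C[0][1] = c01; s.C[1][0] = c10; s.C[1][1] = c11;
}

int main() {
    // Hypothetical hits of a track y ~ 1.0 + 0.5 x on five detector layers.
    const double layerX[5] = {1, 2, 3, 4, 5};
    const double hits[5]   = {1.52, 1.97, 2.55, 2.99, 3.48};
    const double sigma2    = 0.01;   // assumed measurement variance

    KalmanState s;
    s.C[0][0] = s.C[1][1] = 100.0;   // large initial uncertainty
    double x = 0.0;
    for (int i = 0; i < 5; ++i) {
        predict(s, layerX[i] - x);   // propagate to the next layer
        x = layerX[i];
        update(s, hits[i], sigma2);  // fold in the space point
    }
    std::printf("y at last layer = %.3f, slope = %.3f\n", s.x[0], s.x[1]);
    return 0;
}

Each space point is absorbed in constant time, which is one reason a Kalman-filter fit scales linearly with the number of space points. The combinatorial part of the problem, deciding which space points belong to which track candidate in the first place, is what the cellular automaton addresses before the fit.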