RELIABILITY

Observing new particles and rare processes with particle colliders requires stable service levels at sustainable operation cost over long periods of time. For a research infrastructure comprising up to five interconnected pre-accelerators and a particle collider four times larger than the LHC, identifying the most cost-effective levers to achieve this goal is far from obvious. Scientists are therefore performing a sensitivity analysis of the existing CERN accelerator complex, building a high-fidelity model from actually observed operation data that can indicate the availability achievable for an even larger machine complex at sustainable operation cost.
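The kind of question such a model has to answer can be illustrated with a Monte Carlo sketch of a chain of accelerator subsystems. The sketch below is purely illustrative: the subsystem names and the MTBF/MTTR figures are invented assumptions rather than observed operation data, and a simple series-system approximation stands in for the full simulation tool.

```python
# Minimal Monte Carlo sketch of availability for a chain of subsystems
# (e.g. linac -> booster -> injector -> collider). All names and the
# MTBF/MTTR figures below are illustrative assumptions, not CERN data.
import random

# Hypothetical subsystems with mean time between failures and mean time
# to repair, both in hours (assumed exponentially distributed).
SUBSYSTEMS = {
    "linac":    {"mtbf": 400.0, "mttr": 2.0},
    "booster":  {"mtbf": 300.0, "mttr": 3.0},
    "injector": {"mtbf": 250.0, "mttr": 4.0},
    "collider": {"mtbf": 150.0, "mttr": 6.0},
}

def simulate_availability(hours: float, runs: int = 200) -> float:
    """Estimate the fraction of time the whole chain is usable.

    The chain is treated as a series system: any subsystem fault stops
    physics until it is repaired. Overlapping downtimes are ignored,
    which is a deliberate simplification for this sketch.
    """
    total_up = 0.0
    for _ in range(runs):
        downtime = 0.0
        for params in SUBSYSTEMS.values():
            t = 0.0
            while t < hours:
                t += random.expovariate(1.0 / params["mtbf"])  # next fault
                if t >= hours:
                    break
                repair = random.expovariate(1.0 / params["mttr"])
                downtime += repair
                t += repair
        total_up += max(hours - downtime, 0.0) / hours
    return total_up / runs

if __name__ == "__main__":
    print(f"Estimated availability: {simulate_availability(5000.0):.3f}")

    # Crude sensitivity check: what would halving one repair time buy?
    SUBSYSTEMS["collider"]["mttr"] /= 2
    print(f"With faster collider repairs: {simulate_availability(5000.0):.3f}")
```

The last two lines hint at the sensitivity-analysis idea: re-running the estimate with one repair time halved shows how much availability a single intervention could buy, which is the kind of lever the study aims to rank against its investment and operation cost.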
The approach requires the development of an innovative modeling and simulation tool for complex technical infrastructures, similar to those used for offshore oil rigs, automated manufacturing plants and public transport infrastructures. The physics opportunities and costs associated with availability improvements change over time, as more operation data becomes available, the quality of incident reporting improves and the granularity of repair and maintenance reports becomes finer. The fidelity of the model therefore relies on novel methods to process and annotate sensor data from existing technical systems at high throughput, and on the capability to properly treat incomplete, human-language operation reports.
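As an illustration of that second ingredient, the sketch below shows one simple way such free-text operation reports could be annotated automatically, assuming scikit-learn is available; the example reports, labels and category names are invented for illustration and do not come from CERN logbooks.

```python
# Illustrative sketch (not the project's actual pipeline): classify short,
# free-text operation reports by affected system, so that incomplete
# human-language logs can still feed availability statistics.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented example reports and labels, purely for demonstration.
reports = [
    "klystron trip, RF power lost for 20 min",
    "cryo compressor alarm, cold box pressure unstable",
    "power converter fault on main dipole circuit",
    "RF cavity conditioning required after vacuum spike",
    "quench heater firing, cryogenic recovery ongoing",
    "converter interlock, current readback out of tolerance",
]
labels = ["rf", "cryogenics", "powering", "rf", "cryogenics", "powering"]

# TF-IDF features plus a linear classifier: a deliberately simple baseline
# for tagging each report with the system it concerns.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(reports, labels)

print(model.predict(["cold box pressure dropped, cryo plant restart needed"]))
```

In practice such a baseline would only be a starting point; the challenge described above is precisely to make this kind of annotation reliable enough, at high throughput, to feed availability reporting and forecasting.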
Key Challenges:
  • Develop a tool to reproduce the reliability and availability of large-scale technical systems with high fidelity
  • Develop a tool to reliably predict the effectiveness of reliability and availability improvement actions with respect to their investment and operation costs
  • Exploit innovative data analytics and machine learning techniques to speed up the annotation and processing of raw sensor data for reliability and availability reporting and forecasting
  • Determine a sustainable operation scenario for a future circular collider research infrastructure