The Efficient Particle Accelerator (EPA) project is a groundbreaking initiative that aims to revolutionise the operation of particle accelerators at CERN.
With the impending transition of the Large Hadron Collider (LHC) to its high-luminosity phase, the EPA project seeks to enhance the overall efficiency of not only the LHC but future accelerators as well. By focusing on improving performance, flexibility, reproducibility, and sustainability, the project positions itself as a critical blueprint for the next generation of particle physics research.
As CERN navigates the challenges posed by energy consumption and complex operational demands, the integration of innovative technologies such as automation, artificial intelligence (AI), and machine learning (ML) becomes paramount.
To find out more, The Innovation Platform spoke with Verena Kain, EPA Project Leader, to delve into the project’s objectives, the rationale behind its inception, and the innovative strategies being utilised to achieve its ambitious goals.
Can you provide an overview of the EPA project? What are its main goals and objectives?
The Efficient Particle Accelerator (EPA) project aims to enhance the efficiency of particle accelerators, specifically those at CERN. The hope is that this project will serve as a blueprint for future accelerators, aiming to improve performance, flexibility, reproducibility, and sustainability, all within the framework of improved machine efficiency.
How and why did this project come about?
The main catalyst was another project, the LHC Injectors Upgrade (LIU). The current run of the LHC is scheduled to conclude in 2026, after which the machine will be upgraded to become the High-Luminosity LHC (HL-LHC), the next and final phase of the LHC programme.
Taking place between 2019 and 2021 and involving six particle accelerators, the upgrade of the LHC injectors was necessary to increase the brightness of the injected beams. The LIU project led to a series of workshops in which, for the first time since the LHC was installed, we revisited the real performance metrics of the other facilities. It became apparent that while the machines were functioning adequately, there was significant potential for improvement.
In 2022, amidst an energy crisis, workshops were conducted to assess our machines’ energy consumption. These revealed that the measures used to mitigate hysteresis in our iron-dominated magnets account for a significant share of energy consumption, without even compensating for the effect fully. Hysteresis means that the field of a magnet driven into saturation depends not only on the current provided by the power supply but also on the cycling history: essentially, which fields the magnet played before.
One typical mitigation measure is to severely limit which magnet cycles or programmes can be played one after the other, which significantly reduces flexibility. Another energy-consuming measure is to play certain cycles in full, even when no beam can be injected, to ensure reproducible fields.
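To make this history dependence concrete, the toy Python sketch below implements a simple ‘play’ (backlash) hysteresis operator; the operator, its width, and the two current programmes are purely illustrative assumptions, not CERN’s actual magnet model. Two programmes that end at the same current leave the magnet at different fields.

```python
import numpy as np

def play_operator(currents, width=0.1):
    """Toy 'play' (backlash) hysteresis operator: the output field lags
    the input current by up to `width`, so it depends on the history of
    the current programme, not just its present value."""
    field = 0.0
    fields = []
    for current in currents:
        # The field only follows once the current leaves the dead band.
        if current - field > width:
            field = current - width
        elif current - field < -width:
            field = current + width
        fields.append(field)
    return np.array(fields)

# Two current programmes that end at the same value of 1.0 ...
ramp_up = np.linspace(0.0, 1.0, 50)                     # monotonic ramp
overshoot = np.concatenate([np.linspace(0.0, 1.2, 30),  # ramp past 1.0,
                            np.linspace(1.2, 1.0, 20)]) # then back down

# ... leave the magnet at different fields because of its history.
print(play_operator(ramp_up)[-1])    # ~0.9
print(play_operator(overshoot)[-1])  # ~1.1
```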
Why is automation so crucial in improving particle accelerator efficiency? Can you explain the role of AI and ML in the EPA project?
The focus is on retaining as much of the existing hardware as possible, keeping the magnets and RF cavities, for example, and leveraging software to automate processes.
The current equipment operation is designed for human decision-making, except for real-time scenarios that require split-second decisions. Real-time feedback systems and electronics have been automated for some time now, but there are challenges in automating slower controls or human-in-the-loop processes.
While the existing system works, it may not be suitable for the scale of machines we will be dealing with in the future. Operator training can be quite lengthy, as operators must become proficient in handling the vast array of components within the accelerators. Despite efforts to simplify the process with layers of abstraction, the task remains intricate.
Furthermore, operators vary in their skill sets and experience, leading to differing repair times and outcomes. Machines autonomously managing certain tasks concurrently could significantly enhance efficiency and free operator attention for more abstract decision-making.
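As a minimal, purely illustrative sketch of what handing such a task to the machine can look like, the snippet below lets a generic derivative-free optimiser tune two hypothetical corrector settings against a simulated beam-loss figure of merit; the objective function, setting names, and numbers are all invented stand-ins, not a real CERN interface.

```python
import numpy as np
from scipy.optimize import minimize

def measured_beam_loss(settings):
    """Stand-in for a real measurement: in practice this would apply the
    settings to the hardware, wait for beam, and read a loss monitor."""
    corrector_a, corrector_b = settings
    return (corrector_a - 0.3) ** 2 + (corrector_b + 0.1) ** 2 + 0.01

# The optimiser takes over a task an operator would otherwise do by hand:
# iteratively adjusting two corrector settings to minimise beam loss.
# Nelder-Mead is derivative-free, which suits measured objectives.
result = minimize(measured_beam_loss, x0=np.array([0.0, 0.0]),
                  method="Nelder-Mead")
print("best settings:", result.x)    # ~[0.3, -0.1]
print("residual loss:", result.fun)  # ~0.01
```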
AI and ML are becoming increasingly important in various fields, including at CERN. For example, the LHC, the largest and most complex machine at CERN, requires minimal human intervention thanks to its advanced automation. However, it is not fully autonomous and still requires human oversight.
Traditional automation requires knowledge of the system and the ability to express it symbolically. ML and AI remove this restriction: these technologies can analyse large amounts of data and identify patterns in various data structures without the parametric relations within the data having to be supplied explicitly.
For example, describing symbolically what is visible in an image can be quite challenging. AI can accomplish this task, known as computer vision, without rules having to be defined. This capability provides a significant advantage for learning patterns or mappings that were previously difficult to represent accurately in written or spoken form.
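A minimal sketch of this idea follows, with entirely synthetic data standing in for logged machine measurements: a small neural network learns the mapping between a setting and an observable purely from examples, and the underlying relation is never written into the fitting code.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic data standing in for logged machine measurements. The 'hidden'
# relation between setting and observable appears only here, never in the
# fitting code below.
rng = np.random.default_rng(seed=0)
settings = rng.uniform(-1.0, 1.0, size=(500, 1))
observable = np.sin(3.0 * settings[:, 0]) + 0.05 * rng.normal(size=500)

# The network learns the mapping purely from examples, without any
# symbolic description of the relation being supplied.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                     random_state=0)
model.fit(settings, observable)

print(model.predict([[0.5]]))  # close to sin(1.5) ≈ 0.997
```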
What other innovative approaches are being explored in the EPA project to enhance particle accelerator efficiency? Could you elaborate on the nine work packages?
AI is employed for tasks that lack traditional solutions. To support this, we are establishing the infrastructure requirements and a blueprint for smart equipment, relevant to science labs in general, to facilitate our AI efforts. The aim is to develop AI-ready accelerators by providing hierarchical building blocks within a cohesive system. For example, our scientists primarily use Python for programming but do not want to familiarise themselves with all the intricacies of the underlying control infrastructure.
To address this, systems will be designed to hide the complexity of the control infrastructure while enabling the integration of advanced AI and other algorithms. Additionally, a shared GPU system will be implemented to provide easy access to computing resources without requiring individual purchases. The next generation of equipment will be (more) autonomous, and the idea is that the final accelerator becomes an ensemble of hierarchical autonomous systems. Information distribution will also be revisited.
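A hedged sketch of what such a physics-level Python layer could look like: the classes, register names, and magnet name below are hypothetical, invented for illustration rather than taken from CERN’s actual control system.

```python
class Magnet:
    """Hypothetical physics-level wrapper: users set a field in tesla and
    never see the underlying control calls, units, or cycling logic."""

    def __init__(self, name, device):
        self.name = name
        self._device = device  # stands in for the real control layer

    @property
    def field(self):
        # Translate a raw hardware reading into a physics quantity.
        return self._device.read("B_MEAS")

    @field.setter
    def field(self, tesla):
        # Hide unit conversion, cycling, and safety checks behind one call.
        self._device.write("B_REF", tesla)

class FakeDevice:
    """Stand-in for the real control infrastructure, illustration only."""
    def __init__(self):
        self._registers = {}

    def write(self, register, value):
        self._registers[register] = value
        self._registers["B_MEAS"] = value  # pretend the field tracks the reference

    def read(self, register):
        return self._registers.get(register, 0.0)

dipole = Magnet("MBA.10", FakeDevice())  # hypothetical magnet name
dipole.field = 1.2                       # one physics-level action
print(dipole.field)                      # 1.2
```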
What are the key challenges faced in enhancing the efficiency of particle accelerators?
Many of the tasks undertaken here at CERN have never been attempted before. While there are technical hurdles, they are not insurmountable.
For quite some time, the greatest challenge was convincing people of the need to improve efficiency. These accelerators have been operational for a considerable time, and there was, and perhaps still is, scepticism about the potential for improvement and the benefits of investing in such a project.
However, once we presented the project and demonstrated the minimal initial investment required for the first step, those arguments no longer stood. In fact, people are now anticipating the results. According to our timeline, significant results for several of the work packages should be evident by the end of next year.
Additionally, the entire project is time-limited, with only five years to complete it. This adds an extra layer of challenge and interest, so we must remain focused and avoid getting sidetracked.
Your team comes from a variety of backgrounds across CERN. How important is collaboration for the project?
The nine work packages may appear distinct, but they are all very much interconnected. For any one of them to succeed, they all need to thrive. Collaboration is essential from the outset to define the requirements, and everyone must contribute to defining the necessary infrastructure and expected results. While this level of collaboration may not be typical, it is crucial in our case. The project comprises infrastructure work packages and work packages that implement solutions in the accelerators to address specific issues, but everyone functions as one team.
How do you envision the EPA project impacting the field of particle physics, and what are the next steps?
The idea for this project stemmed from analysing existing accelerators and preparing for their future. The goal is to develop a blueprint of an exploitation model for future accelerators. The team is collaborating closely with the team working on the Future Circular Collider (FCC) study at CERN, a machine envisioned to have a circumference of almost 100km.
The current business-as-usual model would not be affordable for the FCC. Therefore, there is no way around innovating. Our project seeks to establish a new operational and maintenance model for future accelerators, aiming to set a new standard for efficiency and sustainability, as well as a set of guidelines for self-sustaining systems.
Please note, this article will also appear in the 20th edition of our quarterly publication.