CERN is widely recognised as one of the world’s leading laboratories for particle physics. At CERN, physicists and engineers probe the fundamental structure of the universe. To do this, they use the world’s largest and most complex scientific instruments – particle accelerators and detectors. Technologies developed at CERN go on to have a significant impact through their applications in wider society.
CERN has had an important role in the history of computing and networks. The World Wide Web (WWW) was invented at CERN by Sir Tim Berners-Lee. The web was originally conceived and developed to meet the demand for automated information-sharing between scientists at universities and institutes around the world.
Grid computing was also developed at CERN with partners and thanks to funding from the European Commission. The organisation also carries out activities in the areas of cybersecurity, big data, machine learning (ML), artificial intelligence (AI), data preservation, and quantum technology.
Digital policy issues
Artificial intelligence (1)
Through CERN openlab, CERN collaborates with leading information and communications technology (ICT) companies and research institutes. The R&D projects carried out through CERN openlab address topics related to data acquisition, computing platforms, data storage architectures, computer provisioning and management, networks and communication, ML and data analytics, and quantum technologies. CERN researchers use ML techniques as part of their efforts to maximise the potential for discovery and optimise resource usage. ML is used, for instance, to improve the performance of the Large Hadron Collider (LHC) experiments in areas such as particle detection and managing computing resources. Going one step further, at the intersection of AI and quantum computing, CERN openlab is exploring the feasibility of using quantum algorithms to track the particles produced by collisions in the LHC, and is working on developing quantum algorithms to help optimise how data is distributed for storage in the Worldwide LHC Computing Grid (WLCG). This research is part of the CERN Quantum Technology Initiative (QTI) activities, launched in 2020 to shape CERN’s role in the next quantum revolution.
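The kind of signal-versus-background classification that ML enables in particle detection can be illustrated with a toy example. The sketch below trains a minimal logistic-regression classifier on synthetic two-feature "events"; the features, labels, and thresholds are all invented for illustration and bear no relation to actual LHC analyses or CERN openlab code.

```python
import math
import random

random.seed(42)

def make_events(n):
    """Generate synthetic events: label 1 = 'signal', 0 = 'background'.
    The two features are purely illustrative, not real detector quantities."""
    events = []
    for _ in range(n):
        if random.random() < 0.5:
            x = (random.gauss(3.0, 1.0), random.gauss(3.0, 1.0))  # signal-like
            events.append((x, 1))
        else:
            x = (random.gauss(0.0, 1.0), random.gauss(0.0, 1.0))  # background-like
            events.append((x, 0))
    return events

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(events, epochs=200, lr=0.1):
    """Plain stochastic gradient descent on the logistic loss."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in events:
            err = sigmoid(w1 * x1 + w2 * x2 + b) - y
            w1 -= lr * err * x1
            w2 -= lr * err * x2
            b -= lr * err
    return w1, w2, b

def accuracy(events, w1, w2, b):
    hits = sum((sigmoid(w1 * x1 + w2 * x2 + b) >= 0.5) == (y == 1)
               for (x1, x2), y in events)
    return hits / len(events)

train_set, test_set = make_events(400), make_events(100)
w1, w2, b = train(train_set)
print(f"test accuracy: {accuracy(test_set, w1, w2, b):.2f}")
```

Real LHC analyses use far richer inputs (detector hits, calorimeter deposits) and models (boosted decision trees, deep networks), but the underlying idea is the same: learn a decision boundary that separates interesting collisions from background.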
– CERN openlab: a public-private partnership in which CERN collaborates with ICT companies and other research organisations to accelerate the development of cutting-edge solutions for the research community, including ML.
– CERN QTI: a comprehensive R&D, academic, and knowledge-sharing initiative to exploit quantum advantage for high-energy physics and beyond. Given CERN’s increasing ICT and computing demands, as well as the significant national and international interest in quantum-technology activities, it aims to provide dedicated mechanisms for the exchange of both knowledge and innovation.
Cloud computing (2)
The scale and complexity of the data from the LHC, the world’s largest particle accelerator, are unprecedented. This data needs to be stored, easily retrieved, and analysed by physicists worldwide, which requires massive storage facilities, global networking, immense computing power, and funding. CERN did not initially have the computing or financial resources to crunch all of the data on-site, so in 2002 it turned to grid computing to share the burden with computer centres around the world. The WLCG builds on the ideas of grid technology initially proposed in 1999 by Ian Foster and Carl Kesselman. The WLCG relies on a distributed computing infrastructure: data from the collisions of protons or heavy ions are distributed via the internet for processing at data centres worldwide. This distributed approach is based on the same paradigm as cloud computing. Further CERN developments in the field of data processing are expected to continue influencing digital technologies.
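The split-process-merge pattern behind grid computing can be caricatured in a few lines. The sketch below is a loose, single-machine analogy only: batches stand in for data files, threads stand in for computing sites, and the "analysis" is an invented event-selection cut, not anything from the real WLCG software stack.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-site analysis job: count events passing a simple cut.
def analyse_batch(batch):
    return sum(1 for energy in batch if energy > 50.0)

# Fake "collision event" energies; in the real WLCG these would be physics
# data files shipped to computing sites over the network.
events = [float(i % 100) for i in range(10_000)]

# Split the dataset into batches, one per simulated computing site.
n_sites = 4
batches = [events[i::n_sites] for i in range(n_sites)]

# Each site processes its share independently; results are merged centrally.
with ThreadPoolExecutor(max_workers=n_sites) as pool:
    partial_counts = list(pool.map(analyse_batch, batches))

total_passing = sum(partial_counts)
print(f"events passing selection: {total_passing}")
```

Because each batch is analysed independently, the merged result is identical to processing everything in one place, which is what makes the workload easy to spread across hundreds of data centres.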
Telecommunication infrastructure (3)
In the 1970s, CERN developed CERNET, a lab-wide network to access mainframe computers in its data centre. This pioneering network eventually led CERN to become an early European adopter of the Transmission Control Protocol/Internet Protocol (TCP/IP) for connecting systems on site. In 1989, CERN opened its first external TCP/IP connections, and by 1990 it had become the largest internet site in Europe, ready to host the first WWW server. Nowadays, in addition to the WLCG and its distributed computing infrastructure, CERN also hosts the CERN Internet eXchange Point (CIXP), which optimises CERN’s internet connectivity and is also open to interested internet service providers (ISPs).
Digital standards (4)
Ever since releasing the World Wide Web software under an open-source model in 1994, CERN has been a pioneer in the open-source field, supporting open-source hardware (with the CERN Open Hardware Licence), open access (with the Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3)), and open data (with the CERN Open Data Portal). Several CERN technologies are developed with open science in mind, such as Indico, InvenioRDM, REANA, and Zenodo. Open-source software, such as CERNBox, the CERN Tape Archive (CTA), EOS, the File Transfer Service (FTS), Geant4, ROOT, Rucio, and the Service for Web-based Analysis (SWAN), has been developed to handle, distribute, and analyse the huge volumes of data generated by the LHC experiments, and is also made available to wider society.
- Internet Engineering Task Force (IETF) (in the context of the additional work done by IETF on internet standards)
- Pushing the Boundaries of Open Science at CERN: Submission to the UNESCO Open Science Consultation
Data governance (5)
CERN manages vast amounts of data; not only scientific data, but also data in more common formats such as webpages, images and videos, documents, and more. For instance, the CERN Data Centre processes on average one petabyte (one million gigabytes) of data per day. As such, the organisation notes that it faces the challenge of preserving its digital memory. CERN also points to the fact that many of the tools that are used to preserve data generated by the LHC and other scientific projects are also suitable for preserving other types of data and are made available to wider society.
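The quoted figure of roughly one petabyte per day translates into a sustained average rate that a quick back-of-the-envelope calculation makes concrete (using the decimal convention that matches the "one million gigabytes" gloss above):

```python
# Back-of-the-envelope throughput for ~1 PB of data processed per day.
petabyte = 10 ** 15            # bytes, decimal convention (1 PB = 10^6 GB)
seconds_per_day = 24 * 60 * 60

avg_rate_gb_per_s = petabyte / seconds_per_day / 10 ** 9
print(f"average sustained rate: {avg_rate_gb_per_s:.1f} GB/s")
```

That is on the order of 11–12 GB every second, around the clock, which gives a sense of why long-term preservation of this digital memory is a challenge in its own right.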
The CERN Open Data Policy for the scientific experiments at the LHC is essential to making scientific research more reproducible, accessible, and collaborative. It reflects values that have been enshrined in the CERN Convention for more than 60 years and were reaffirmed in the European Strategy for Particle Physics (2020), and it aims to empower the LHC experiments to adopt a consistent approach towards the openness and preservation of experimental data (applying FAIR standards to better share and reuse data).
EOSC Future is an EU-funded Horizon 2020 project implementing the European Open Science Cloud (EOSC), an initiative launched in 2016. EOSC will give European researchers access to a wide web of FAIR data and related services.
CERN joined the EOSC Association upon its formation in 2020. It also currently has a mandate to represent the European intergovernmental research organisations that make up EIROforum.
- DPHEP (Data Preservation in High Energy Physics) (CERN is a founding member)
- The CERN Open Data Policy
- EOSC (European Open Science Cloud) (CERN is a mandated organisation and a member of the EOSC Association)
- Online learning opportunities – through CERN academic training
Future of meetings
More information about ongoing and upcoming events can be found on the events page.
Social media channels
2-Within its work, CERN refers to ‘cloud computing’ as ‘distributed computing’.
3-Within its work, CERN refers to ‘telecommunication infrastructure’ as ‘network infrastructure’.
4-Within its work, CERN addresses ‘web standards’ as ‘open science’.
5-Within its work, CERN refers to ‘data governance’ as ‘data preservation’.