Cross-sectoral digital cooperation:
How the international Geneva ecosystem can bring its contribution towards an enhanced data integration to better achieve SDGs
The session was opened by Jürg Lauber (Ambassador, Permanent Representative of Switzerland to the UN and other international organisations in Geneva), who noted that the journey on the Road to Bern via Geneva led to the discovery of many interesting data-related organisations and projects in Geneva. As the home of the European Organization for Nuclear Research (CERN), one of the world’s largest data processors, and of the World Meteorological Organization (WMO), which operates one of the world’s most diverse data collection systems, Geneva has a very rich data ecosystem. Geneva is also the place where data policies are discussed. Data figures as a public good in the work of many international organisations in Geneva. However, reliable data is still unavailable in the places and areas where it is needed the most, which is why the dialogues have called for more efforts towards data inclusion. The effective use of data is not only a matter of technology; it lies in policies as well: legal frameworks, economic incentives, and fair usage rules that address all the stakeholders involved. The security of data matters, including technical protection and legal immunities for data used and produced by international organisations. Cross-cutting and cross-sectoral standards are needed for collecting, protecting, processing, and presenting data. Only a holistic approach can ensure the effective use of data, and overcoming policy silos remains the main challenge ahead.
Jovan Kurbalija (Founding Director, DiploFoundation; Head, Geneva Internet Platform [GIP]) noted that the Road to Bern, or road to cooperation, is not a single road; there are rather many roads. Echoing Amb. Lauber’s remarks, Kurbalija reaffirmed that some of Geneva’s digital gems were discovered in this process, such as CERN’s data processing capabilities and the WMO’s diverse global data collection networks. Potential for new projects was also spotted during the last eight months. Moving to concrete projects and ideas has been the underlying spirit of this process, and the COVID-19 pandemic accelerated this trend. There are many interesting initiatives and developments striving to identify patterns in the current crisis and find solutions. On this journey, we also realised the importance of policy processes and solutions to data issues. Now more than ever, there is a need for legal solutions and for incentives for individuals, companies, and countries to recognise data as an asset. More importantly, this journey has highlighted the importance of standards and of a holistic approach to data policy and governance. The Geneva scene is very important in addressing one of the underlying challenges of data governance and data policy – moving beyond policy silos, since data itself crosses silos: it is in its nature to be shared and used. The dialogues showed that a limited geographical space with a high concentration of key players in international governance can help frame new standards, policies, and approaches to sharing data across policy silos.
The Road to Bern via Geneva process: A cross-sectoral approach to data governance
The Road to Bern has been instrumental in bringing together Geneva-based partners to discover cross-sectoral solutions that mutually reinforce their collective and individual missions, reiterated Samira Asma (Assistant Director-General, Division of Data, Analytics and Delivery for Impact, World Health Organization [WHO]). The first dialogue in the series sparked ideas on how to use comparative advantages to overcome the challenges of making data timely, reliable, and actionable. Meaningful partnerships have the ability to close the many gaps we face. In the absence of sufficient resources to meet the sustainable development goals (SDGs), organisations will need to collaborate to make a measurable impact on the lives of people and on the planet. There are 59 health indicators, and health boosts all 17 SDGs. Recent predictions state that we will meet the SDGs in 2084. Another epidemic or pandemic is inevitable and we must be prepared for it. By 2021, Geneva-based UN partners could establish a shared mechanism to collect, analyse, communicate, and use data, with a special focus on supporting least developed countries (LDCs) and small island developing states (SIDS). The WHO is developing a new World Health Data Hub that will shorten the path from data collection to predictions and actionable insights. The road to 2021 needs to become a highway without speed limits. The COVID-19 pandemic has forced us to develop principles for data collection across different organisations. The WHO is also trying to fast track its five data principles: (a) treat data as a public good, (b) uphold member state trust in data, (c) support strengthening country data and health information systems, (d) be a responsible data steward, and (e) fill public health data gaps. The WHO has a data sharing policy for health non-emergencies, but it is also looking into a data sharing policy for health emergencies.
The WHO is also working with the Health Data Collaborative, a group of 40 partner organisations and countries to create a unified approach to leveraging resources. The WHO and the UN Statistical Division are co-hosting a secretariat to quantify the global direct and indirect impact of COVID-19. These efforts are guided by the UN Secretary-General’s data strategy.
Petteri Taalas (Secretary-General, WMO) called the WMO ‘the grandfather of big data’, as the organisation, in its different iterations, has been collecting data since 1781. The WMO has been setting standards for observations, which ensures that the quality of data is known worldwide and that the data coming from member states can be trusted. The organisation has also completed a reform that put weather, climate, water, and ocean issues under one umbrella. The WMO also runs 13 global centres, located in different parts of the world, which use supercomputers to process big data sets. Additionally, more than 60 centres worldwide are modelling the future climate. The calculations will form the backbone of the forthcoming Sixth Assessment Report of the United Nations Intergovernmental Panel on Climate Change (IPCC). The WMO promotes free and open access to data, because openness drastically enhances the value of such data. The organisation is also grappling with some data challenges, which will be tackled at the WMO Data Conference in November. These include: (a) enhancing the availability of data, because data gaps have a negative impact on the quality of weather forecasts, (b) dealing with essential data, which is important for safety-related services, and (c) promoting the free exchange of radar data and several other data sets, and making data provided by private data providers globally available.
Ideas which were brought forward in the series of dialogues need to be concretised, noted Dorothy Tembo (Deputy Executive Director, International Trade Centre [ITC]). There is a need to foster greater data cooperation within the UN ecosystem by developing common principles, enhancing exchanges, and addressing the aspects that relate to increasing awareness, ensuring data security, protecting privacy, and investing in capacities for data collection and processing. Data is being used to facilitate many facets of our lives, and a coherent approach towards data is required, as it will subsequently support progress towards the SDGs. Data needs to be accessible, consistent, and transformed into user-friendly intelligence. The ITC supports companies in overcoming information barriers to obtain market intelligence. The ITC considers data to be not only numbers but also data products, such as the Trade Map and the Market Access Map, which have enabled many countries to make informed decisions. Tembo also noted that data is different but complementary, and that partners should work together to create greater synergies for greater impact. She concluded by stating that, triggered by the dialogues, the ITC has re-examined its internal data management processes and noticed it may be operating in silos, which is why it will invest funds into systematic data management.
Exercising the concept of openness is a shared direction of partners, highlighted Eckhard Elsen (Director for Research and Computing, CERN; Member, CERN Directorate). There may be short-term benefits in hiding data, but this approach is not sustainable in the long term. Openness means making known the tools and the open source code used, which helps develop a common understanding worldwide. However, if raw data is simply published to the public, it is of no use. Instead, slightly aggregated data and the tools to analyse it are needed, so that the data can be used further. Scientists might initially hesitate to publish data and instruments, as these represent their intellectual effort; however, if the funding came from public sources, then the data and the instruments should be made available to the public. The organisation is working on a general policy statement which will determine what open data should satisfy and will prescribe how often data needs to be released so it can be used for further analysis. When it comes to modelling data, simulation tools are becoming more complex, and a common interface to easily apply these tools is needed. It would be good to use public funding to apply some of these tools more efficiently, through common standards or shared subsets of calculations. If data remains undisclosed over a longer period, it impedes progress, and we need to develop models in which low latency of release is key to making things open.
Houlin Zhao (Secretary-General, International Telecommunication Union [ITU]) noted that ICT experts will do the basic jobs around collecting, protecting, sharing, and using data. When it comes to data collection, it is important to know whether the data is needed, up to date, live, or obsolete. When it comes to data sharing, interoperability is key. Zhao underlined the importance of the trustworthiness of data collectors. He noted that the WHO and the ITU have collaborated on the WHO/ITU Focus Group on Artificial Intelligence for Health, where scientists are encouraged to present projects for competition. Zhao concluded by stating that an international platform is needed to encourage people to come together and look at these concerns in the same place.
The Geneva Data Cooperation Sandbox
One of the key elements of this cross-sectoral dialogue has been to highlight the importance of concrete and clear principles for digital cooperation, stated Jean-Pierre Reymond (Chargé de mission; Head, Innovation Partnerships, Permanent Mission of Switzerland in Geneva). The mission and the GIP looked at over 50 organisations based in Geneva and compared their guidelines, principles, action plans, and frameworks. Common principles among Geneva-based international organisations are: (a) purpose – the processing of data should be carried out for a specific and explicit purpose; (b) proportionality – the amount of collected data and data processing activities should not exceed the purpose; (c) lawfulness; (d) fairness – data collection should be fair, transparent, and non-discriminatory; (e) confidentiality; (f) privacy; (g) security; (h) accountability – data controllers must comply with key principles and be held responsible and accountable for any harm caused to data and by data; (i) oversight – a mechanism to oversee such compliance, allowing individuals to request information about their data and seek redress if their data are misused, should be available. Reymond concluded that the principles highlight a responsible and data-rights-centred approach, where transparency and accountability in analytical cognitive approaches and in the analysis of the user are key for digital cooperation.
Kurbalija presented the first outline of the pilot project, the Data Sandbox. The Data Sandbox was developed on the basis of Diplo’s Data Engine, an ecosystem that combines collecting and processing data and presenting it in an understandable manner. It activates existing data sets and includes statistical methods such as standard deviation, isolation forests, local outlier factors, and box plots. The key idea is to compare data sets in space and time to find patterns and deviations from patterns. Patterns could indicate correlation, and after identifying a possible correlation, users can undertake policy research to identify causation. Deviations from patterns can be incidental, or they can be analysed further. Currently, the sandbox is based on countries, but it is planned to include local communities as well. After the pilot phase, the project will focus on comparing and identifying patterns in time sequences.
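The pattern/deviation idea described above can be illustrated with a minimal sketch. The sketch below is a hypothetical example, not part of Diplo’s Data Engine: the function name, threshold, and indicator values are all assumptions. It uses a simple standard-deviation (z-score) test, one of the methods mentioned, to flag points in a country’s time series that deviate from that country’s own pattern and thus become candidates for further policy research.

```python
"""Illustrative sketch: flagging deviations from a pattern in a time series.

Hypothetical example only; names and data are invented for illustration.
"""
from statistics import mean, stdev


def flag_deviations(series, threshold=2.0):
    """Return indices of values more than `threshold` standard deviations
    from the series mean. A flat series has no deviations by definition."""
    m = mean(series)
    s = stdev(series)
    if s == 0:
        return []
    return [i for i, v in enumerate(series) if abs(v - m) / s > threshold]


# Hypothetical yearly values of some indicator for one country:
indicator = [3.1, 3.0, 3.2, 3.1, 9.8, 3.0, 2.9]
print(flag_deviations(indicator))  # → [4]: the spike in year 4 stands out
```

Production tools such as isolation forests or local outlier factors (both also named in the presentation) generalise this idea to multivariate data, where a single z-score is no longer sufficient.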
Prepared by the GIP team