DCAF – Geneva Centre for Security Sector Governance

Acronym: DCAF

Established: 2000

Address: Maison de la Paix, Chemin Eugène-Rigot 2D, 1211 Geneva, Switzerland

Website: https://www.dcaf.ch/

DCAF is dedicated to improving the security of states and their people within a framework of democratic governance, the rule of law, respect for human rights, and gender equality. Since its founding in 2000, DCAF has contributed to making peace and development more sustainable by assisting partner states, and international actors supporting these states, to improve the governance of their security sector through inclusive and participatory reforms. It creates innovative knowledge products, promotes norms and good practices, provides legal and policy advice, and supports capacity-building of both state and non-state security sector stakeholders.

Digital activities

Cyberspace and cybersecurity have numerous implications for security provision, management, and oversight, which is why DCAF is engaged in these topics within its work. DCAF has implemented a cycle of policy projects to develop new norms and good practices in cyberspace. At the operational level, cybersecurity governance has become a prominent part of SSR programming.

Digital policy issues

Cybersecurity

Digitalisation and cybersecurity are challenges of today and tomorrow, with an overarching impact on the security sector and on the role of security sector governance and reform (SSG/R) in the digital space. Our recent policy study, SSG/R in the digital space: projections into the future, sheds light on the complex intersection of digitalisation and security sector governance. It examines how security sector actors have adapted to the digital transition and the emergence of new actors within the security ecosystem, and it provides concrete recommendations on how to navigate the complexities of digital technologies and shape ethical technology use and robust digital governance frameworks.

Capacity development

For newcomers to the field, DCAF offers the introductory series SSR Backgrounders, with a special issue on the impact of digitalisation on good governance in the security sector. It is a first-stop resource to understand the challenges and considerations for best policy and practice. 

DCAF implements projects that focus on improving cybersecurity laws and policies, increasing the capacity of cybersecurity actors, and strengthening accountability in cybersecurity. One of our priorities is to strengthen the individual and institutional capacities of national Computer Emergency Response Teams (CERTs). These teams are responsible for effectively and efficiently preventing and responding to attacks on national systems.

We also run the annual Young Faces research and mentoring programme, which helps to develop the next generation of cybersecurity experts in the Western Balkans. Each year, we select around 30 dynamic, forward-thinking young professionals to join the programme that enhances their knowledge of emerging trends in cybersecurity governance.

Research shows that women, girls, and LGBTQ+ people are the most affected by cybersecurity risks. Our publication and podcast series analyses how they have been pushed out of cyberspaces by abuse and discrimination, and what solutions exist to take a human-centred approach that considers everyone’s needs in cybersecurity.

In our Donors’ Talk podcast series, we spoke with DCAF’s Justice Advisor, drawing on her 15 years of experience in justice sector reform to look at success stories, challenges, and what needs to be considered when supporting digitalisation projects related to justice reform.

In Morocco, DCAF supported the National Anti-Corruption Commission with training on the prevention and investigation of cyber-corruption and financial cybercrimes. The commission digitalised its internal processes, resulting in more effective tracking of and response to citizens’ data protection requests.

Digital tools

Legislation databases 

DCAF’s three legal databases gather policies, laws, and decrees governing the security sectors in the Occupied Palestinian Territory, Libya, and Tunisia. Each database covers the main providers of security and justice, the formal supervision and management institutions, and the legislative and regulatory texts covering and authorising the work of informal control actors (political parties, media, NGOs, etc.). 

A resource for legislators, the justice system, academia, and civil society, the databases offer both a current resource and a historical perspective on the evolution of security sector legislation in the respective countries.

Handbook on effective use of social media in cybersecurity awareness-raising campaigns

This handbook provides concise, easy-to-follow guidance and examples for designing content strategies and using social media efficiently to raise public awareness of cybersecurity. It shares the do’s and don’ts of social media and shows how a strategic social media presence can support better cybersecurity.

For more tools and resources on cybersecurity governance and the security sector, visit our website.

Social media channels

Facebook @DCAFgeneva

LinkedIn @DCAF

Spotify @dcaf

X @DCAF_Geneva

YouTube @DCAF Geneva Centre for Security Sector Governance

[Diplo] Policy Meets Tech #3: Cryptography

Event description

Event date: 24 January 2024, 13:00–16:00 CET

The third event in the ‘Policy meets tech’ series will be dedicated to cryptography. It will focus on unpacking cryptographic technology and discussing its policy implications.

The series is organised by Diplo, with the support of the US Permanent Mission to the UN in Geneva, and is dedicated to permanent missions in Geneva. It features informative sessions for diplomats with the primary goals of demystifying the intricate realm of digital technologies, comprehending their capabilities and limitations, and delving into their policy implications in a manner that is both practical and pertinent for diplomats. From internet protocols to quantum computing, from cryptography to algorithms, these discussions provide in-depth insights into the technical underpinnings of these technologies, their real-world applications, and the policy opportunities and challenges they present.

The event is only open to permanent missions in Geneva. For details, please contact Ms Sorina Teleanu, Director of Knowledge, at geneva@diplomacy.edu.

ICT 4 Peace Foundation

[Talk] Cyber operations, armed conflicts, and international law

Event recording

Event description

Event date: 23 June 2022, 17:30–18:30 CEST

In this talk, researchers from the Geneva Academy will shed light on the different examples of cyber operations (e.g. Stuxnet, NotPetya, and SolarWinds) allegedly conducted or sponsored by states to explicate their geopolitical effects and challenges to international law. As the importance of ICT grows in the modern world, cyber operations have become an integral part of state and non-state actors’ strategies against other states and actors. Researchers will present their findings as part of the project on disruptive military technologies.

For more information, and to register, please visit the official page.

[Workshop] Boost your cyber skills: a cybersecurity event for Nonprofit Organizations

Event description

Event date: 22 June 2022, 9:30–15:00 CEST

The International Geneva Welcome Centre (CAGI) and the CyberPeace Institute jointly curated an all-day cyber skills event for NGOs. The uptake of cloud-based technologies and storage of valuable donor and beneficiary data have made NGOs a frequent target of cyberattacks. The reality for NGOs is that they have to safeguard their cybersecurity as much as private businesses do. The event invites local government representatives, cybersecurity experts, and NGOs to partake in testimonial drafting, roundtable discussions, and awareness raising for boosting NGOs’ cyber skills.

For more information, and to register, please visit the official page.

International Electrotechnical Commission

Acronym: IEC

Established: 1906

Address: 3 rue de Varembé, 1211 Geneva 20, Switzerland

Website: https://www.iec.ch/

Stakeholder group: International and regional organisations

The IEC is the world leader in preparing international standards for all electrical, electronic, and related technologies. A global, not-for-profit membership organisation, the IEC provides a neutral and independent institutional framework to over 170 countries, coordinating the work of more than 20,000 experts. We administer four IEC Conformity Assessment Systems, representing the largest working multilateral agreement based on the one-time testing of products globally. The members of each system certify that devices, systems, installations, services, and people perform as required.

IEC International Standards represent a global consensus of state-of-the-art know-how and expertise. Together with conformity assessment, they are foundational for international trade.

IEC Standards incorporate the needs of many stakeholders in every participating country and form the basis for testing and certification. Every member country, with all its stakeholders represented through the IEC National Committees, has one vote and a say in what goes into an IEC International Standard.

Our work is used to verify the safety, performance, and interoperability of electric and electronic devices and systems such as mobile phones, refrigerators, office and medical equipment, and electricity generation. It also helps accelerate digitisation, artificial intelligence (AI), and virtual reality applications; protects information technology (IT) and critical infrastructure systems from cyberattacks; and increases the safety of people and the environment.

Digital activities 

The IEC works to ensure that its activities have a global reach in order to meet all the challenges of digital transformation worldwide. The organisation covers an array of digital policy issues.

Digital policy issues

Artificial intelligence and the internet of things

AI applications are driving digital transformation across diverse industries, including energy, healthcare, smart manufacturing, transport, and other strategic sectors that rely on IEC Standards and Conformity Assessment Systems. AI technologies allow insights and analytics that go far beyond the capabilities of legacy analytic systems.

For example, the digital transformation of the grid enables increased automation, making it more efficient and able to integrate fluctuating renewable energy sources seamlessly. IEC Standards pave the way for the use of a variety of digital technologies relating to intelligent energy, addressing issues such as the integration of renewable energies within the electrical network as well as increased automation.

The IEC’s work in the area of AI takes a three-pronged approach. IEC experts focus on sector-specific needs (vertical standards) and conformity assessment, while the joint IEC and International Organization for Standardization (ISO) technical committee on AI, JTC1/SC 42, brings together technology experts, as well as ethicists, lawyers, social scientists, and others to develop generic and foundational standards (horizontal standards).

In addition, IEC Safety Standards are an essential element of the framework for AI applications in power utilities and smart manufacturing. IEC Conformity Assessment Systems complete the process by ensuring the standards are properly implemented.

SC 42 addresses some concerns about the use and application of AI technologies. For example, data quality standards for machine learning (ML) and analytics are crucial for helping to ensure that applied technologies produce useful insights and eliminate faulty features.

Governance standards in AI and the business process framework for big data analytics address how the technologies can be governed and overseen from a management perspective. International standards in the areas of trustworthiness, ethics, and societal concerns will ensure responsible deployment.

The joint IEC and ISO technical committee also develops foundational standards for the internet of things (IoT) through its subcommittee SC 41. Among other things, SC 41 standards promote interoperability, a common architecture, and a common vocabulary for the IoT.

Hardware

The IEC develops standards for many of the technologies that support digital transformation. Sensors, cloud, and edge computing are examples.

Advances in data acquisition systems are driving the growth of big data and AI use cases. The IEC prepares standards relating to semiconductor devices, including sensors.

Sensors can be certified under the IEC Quality Assessment System for Electronic Components (IECQ), one of the four IEC Conformity Assessment Systems.

Cloud computing and its technologies have also supported the increase of AI applications. The joint IEC and ISO technical committee prepares standards for cloud computing, including distributed platforms and edge devices, which are close to users and data collection points. The publications cover key requirements relating to data storage and recovery.

Building trust

International Standards play an important role in increasing trust in AI and help support public and private decision-making, not least because they are developed by a broad range of stakeholders. This helps to ensure that the IEC’s work strikes the right balance between the desire to deploy AI and other new technologies rapidly and the need to study their ethical implications.

The IEC has been working with a wide range of international, regional, and national organisations to develop new ways to bring stakeholders together to address the challenges of AI. These include the Swiss Federal Department of Foreign Affairs (FDFA) and the standards development organisations ISO and the International Telecommunication Union (ITU).

More than 500 participants followed the AI with Trust conference, in person and online, to hear different stakeholder perspectives on the interplay between legislation, standards, and conformity assessment. They followed use-case sessions on healthcare, sensor technology, and collaborative robots, and heard distinguished experts exchange ideas on how these instruments could interoperate more efficiently to build trust in AI. The conference in Geneva was the first milestone of the AI with Trust initiative.

The IEC is also a founding member of the Open Community for Ethics in Autonomous and Intelligent Systems (OCEANIS). OCEANIS brings together standardisation organisations from around the world to enhance awareness of the role of standards in facilitating innovation and addressing issues related to ethics and values.

Read more

e-tech:

–  IEC and ISO Work on Artificial Intelligence

–  AI for the Last Mile

–  Computational Approaches for AI Systems

IEC Blog:

–  Digital Transformation

Videos:

–  Ian Oppermann (AI with Trust)

–  AI with Trust conference interviews: AI Governance

Network security and critical infrastructure

The IEC develops cybersecurity standards and conformity assessments for IT and operational technology (OT). One of the biggest challenges today is that cybersecurity is often understood only in terms of IT, which leaves critical infrastructure, such as power utilities, transport systems, manufacturing plants and hospitals, vulnerable to cyberattacks.

Cyberattacks on IT and OT systems often have different consequences. The effects of cyberattacks on IT are generally economic, while cyberattacks on critical infrastructure can harm the environment, damage equipment, or even threaten public health and lives.

When implementing a cybersecurity strategy, it is essential to consider the different priorities of cyber-physical and IT systems. The IEC provides relevant and specific guidance via two of the world’s best-known cybersecurity standards: IEC 62443 for cyber-physical systems and ISO/IEC 27001 for IT systems.

Both take a risk-based approach to cybersecurity, founded on the premise that it is neither efficient nor sustainable to try to protect all assets in equal measure. Instead, users must identify what is most valuable and requires the greatest protection, and pinpoint where the vulnerabilities lie.
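The prioritisation logic behind a risk-based approach can be sketched in a few lines: score each asset by impact and likelihood, then direct protection to the highest-scoring assets first. This is an illustrative simplification, not a procedure defined in IEC 62443 or ISO/IEC 27001; the asset names and 1–5 scales below are invented for the example.

```python
# Illustrative sketch of risk-based prioritisation (not an IEC/ISO-defined procedure).
# Each asset gets a 1-5 impact score and a 1-5 likelihood score; risk = impact * likelihood.

def rank_assets(assets):
    """Return assets ordered from highest to lowest risk score."""
    return sorted(assets, key=lambda a: a["impact"] * a["likelihood"], reverse=True)

# Hypothetical asset register for a small utility (all values invented).
assets = [
    {"name": "billing database", "impact": 4, "likelihood": 3},
    {"name": "SCADA controller", "impact": 5, "likelihood": 2},
    {"name": "public website",   "impact": 2, "likelihood": 4},
]

for asset in rank_assets(assets):
    print(asset["name"], asset["impact"] * asset["likelihood"])
```

In practice, a risk assessment under either standard is far richer (threat modelling, existing controls, residual risk), but the core idea is the same: rank, then protect the top of the list first.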

Conformity assessment provides further security by ensuring that the standards are implemented correctly: IECEE certification for IEC 62443 and IECQ for ISO/IEC 27001.

ISO/IEC 27001 for IT

IT security focuses equally on protecting the confidentiality, integrity, and availability of data – the so-called CIA triad. Confidentiality is of paramount importance and information security management systems, such as the one described in ISO/IEC 27001, are designed to protect sensitive data, such as personally identifiable information (PII), intellectual property (IP), or credit card numbers, for example.

Implementing the information security management system (ISMS) described in ISO/IEC 27001 means embedding information security continuity in business continuity management systems. Organisations are shown how to plan and monitor the use of resources to identify attacks earlier and take steps more quickly to mitigate the initial impact.

IEC 62443 for OT

In cyber-physical systems, where IT and OT converge, the goal is to protect safety, integrity, availability, and confidentiality (SIAC). Industrial control and automation systems (ICAS) run in a loop to check continually that everything is functioning correctly.

The IEC 62443 series was developed because IT cybersecurity measures are not always appropriate for ICAS. ICAS are found in an ever-expanding range of domains and industries, including critical infrastructure, such as energy generation, water management, and the healthcare sector.

ICAS must run continuously to check that each component in an operational system is functioning correctly. Compared to IT systems, they have different performance and availability requirements and equipment lifetime.

Conformity assessment: IECEE

Many organisations are applying for the IEC System of Conformity Assessment Schemes for Electrotechnical Equipment and Components (IECEE) conformity assessment certification to verify that the requirements of IEC 62443 have been met.

IECEE provides a framework for assessments in line with IEC 62443, which specifies requirements for security capabilities, whether technical (security mechanisms) or process (human procedures) related. Successful recipients receive the IECEE industrial cybersecurity capability certificate of conformity.

Conformity assessment: IECQ

While certification to ISO/IEC 27001 has existed since the standard was first published in 2005, it is only in recent years that the IEC Quality Assessment System for Electronic Components (IECQ) has set up a true single standardised way of assessing and certifying an ISMS to ISO/IEC 27001.

International standards such as IEC 62443 and ISO/IEC 27001 are based on industry best practices and reached by consensus. Conformity assessment confirms that they have been implemented correctly to ensure a safe and secure digital society.


Digital tools

The IEC has developed a number of online tools and services designed to help users with their daily activities.

Social media channels

Facebook @InternationalElectrotechnicalCommission

LinkedIn @IECStandards

Pinterest @IECStandards

X @IECStandards

YouTube @IECstandards

Office of the United Nations High Commissioner for Human Rights

Acronym: OHCHR

Address: Palais Wilson 52, Rue des Pâquis, 1201 Geneva, Switzerland

Website: https://www.ohchr.org/

Stakeholder group: International and regional organisations

The Office of the United Nations High Commissioner for Human Rights and other related UN human rights entities, namely the United Nations Human Rights Council, the Special Procedures, and the Treaty Bodies, are considered together in this section.

The UN Human Rights Office, headed by the High Commissioner for Human Rights, is the principal UN entity on human rights. Also known as UN Human Rights, it is part of the UN Secretariat. UN Human Rights has been mandated by the UN General Assembly (UNGA) to promote and protect all human rights. As such, it plays a crucial role in supporting the three fundamental pillars of the UN: peace and security, human rights, and development. UN Human Rights provides technical expertise and capacity development in regard to the implementation of human rights, and in this capacity assists governments in fulfilling their obligations.

UN Human Rights is associated with a number of other UN human rights entities. To illustrate, it serves as the secretariat for the UN Human Rights Council (UNHRC) and the Treaty Bodies. The UNHRC is a body of the UN that aims to promote the respect of human rights worldwide. It discusses thematic issues, and in addition to its regular sessions, it can hold special sessions on serious human rights violations and emergencies. The ten Treaty Bodies are committees of independent experts that monitor the implementation of the core international human rights treaties.

The UNHRC established the Special Procedures, which are made up of UN Special Rapporteurs (i.e. independent experts or working groups) working on a variety of human rights thematic issues and country situations to assist the efforts of the UNHRC through regular reporting and advice. The Universal Periodic Review (UPR), under the auspices of the UNHRC, is a unique process that involves a review of the human rights records of all UN member states, providing the opportunity for each state to declare what actions they have taken to improve the human rights situations in their countries. UN Human Rights also serves as the secretariat to the UPR process.

Certain non-governmental organisations (NGOs) and national human rights institutions participate as observers in UNHRC sessions after receiving the necessary accreditation.

Digital activities

Digital issues are increasingly gaining prominence in the work of UN Human Rights, the UNHRC, the Special Procedures, the UPR, and the Treaty Bodies.

A landmark document that provides a blueprint for digital human rights is the UNHRC resolution (A/HRC/20/8) on the promotion, protection, and enjoyment of human rights on the internet, which was first adopted in 2012, starting a string of regular resolutions with the same name addressing a growing number of issues. All resolutions affirm that the same rights that people have offline must also be protected online. Numerous other resolutions and reports from UN human rights entities and experts considered in this overview tackle an ever-growing range of other digital issues including the right to privacy in the digital age; freedom of expression and opinion; freedom of association and peaceful assembly; the rights of older persons; racial discrimination; the rights of women and girls; human rights in the context of violent extremism online; economic, social, and cultural rights; human rights and technical standard-setting; business and human rights; and the safety of journalists.

Digital policy issues

Artificial intelligence

In 2018, the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression presented a report to the UNGA on Artificial Intelligence (AI) Technologies and Implications for the Information Environment. Among other things, the document addresses the role of AI in the enjoyment of freedom of opinion and expression, including ‘access to the rules of the game when it comes to AI-driven platforms and websites’, and therefore urges a human rights-based approach to AI.

For her 2020 thematic report to the Human Rights Council, the UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia, and related intolerance analysed different forms of racial discrimination in the design and use of emerging technologies, including the structural and institutional dimensions of this discrimination. She followed up with reports examining how digital technologies, including AI-driven predictive models, deployed in the context of border enforcement and administration reproduce, reinforce, and compound racial discrimination.

In 2020, the Committee on the Elimination of Racial Discrimination published its General Recommendation No. 36 on preventing and combating racial profiling by law enforcement officials (CERD/C/GC/36), which focuses on algorithmic decision-making and AI in relation to racial profiling by law enforcement officials.

In 2021, UN Human Rights published a report analysing how AI impacts the enjoyment of the right to privacy and other human rights. It clarifies measures that states and businesses should take to ensure that AI is developed and used in ways that benefit human rights and prevent and mitigate harm.

The UN Human Rights B-Tech project is running a Generative AI project that demonstrates the ways in which the UN Guiding Principles on Business and Human Rights should guide more effective understanding, mitigation, and governance of the risks associated with generative AI.

UN Human Rights also weighs in on specific policy and regulatory debates, for example through an open letter concerning the negotiations of the European Union AI Act.

Child safety online

Within the work of the OHCHR, ‘child safety online’ is referred to as ‘rights of the child’ and dealt with as a human rights issue.

The issue of child safety online has garnered the attention of UN human rights entities for some time. A 2016 resolution on Rights of the Child: Information and Communications Technologies and Child Sexual Exploitation adopted by the UNHRC calls on states to ensure ‘full, equal, inclusive, and safe access […] to information and communications technologies by all children and safeguard the protection of children online and offline’, as well as the legal protection of children from sexual abuse and exploitation online. The Special Rapporteur on the sale and sexual exploitation of children, including child prostitution, child pornography, and other child sexual abuse material, mandated by the UNHRC to analyse the root causes of sale and sexual exploitation and promote measures to prevent it, also looks at issues related to child abuse, such as the sexual exploitation of children online, which has been addressed in a report (A/HRC/43/40) published in 2020, but also in earlier reports.

The Committee on the Rights of the Child published its General Comment No. 25 on Children’s Rights in Relation to the Digital Environment (CRC/C/GC/25), which lays out how states parties should implement the convention in relation to the digital environment and provides guidance on relevant legislative, policy, and other measures to ensure full compliance with their obligations under the convention and optional protocols in the light of opportunities, risks, and challenges in promoting, respecting, protecting, and fulfilling all children’s rights in the digital environment.

Data governance

UN Human Rights maintains an online platform consisting of a number of databases on anti-discrimination and jurisprudence, as well as the Universal Human Rights Index (UHRI), which provides access to recommendations issued to countries by Treaty Bodies, Special Procedures, and the UPR of the UNHRC.

UN Human Rights also published a report titled A Human Rights-Based Approach to Data – Leaving no one Behind in the 2030 Agenda for Sustainable Development that specifically focuses on issues of data collection and disaggregation in the context of sustainable development.

UN Human Rights has worked closely with partners across the UN system in contributing to the Secretary-General’s 2020 Data Strategy, and co-leads, with the Office of Legal Affairs and UN Global Pulse, work on the subsequent Data Protection and Privacy Program.

Capacity development

UN Human Rights launched the Guiding Principles in Technology Project (B-Tech Project) to provide guidance and resources to companies operating in the technology space with regard to the implementation of the UN Guiding Principles on Business and Human Rights (UNGPs on BHR). Following the publication of a B-Tech scoping paper in 2019, several foundational papers have delved into a broad range of business-related issues, from business-model-related human rights risks to access to remedies. Multistakeholder engagement lies at the heart of the B-Tech project and informs all of its outputs. The project is also enhancing its engagement in Africa, working with technology companies, investors, and other key digital economy stakeholders, including civil society, in a set of African economies and their tech hubs to raise awareness of implementing the UNGPs on BHR.

Following a multistakeholder consultation held on 7–8 March 2022, the High Commissioner presented her report on UN Guiding Principles on Business and Human Rights and Technology Companies (A/HRC/50/56), which demonstrated the value and practical application of the UNGPs in preventing and addressing adverse human rights impacts by technology companies.

Extreme poverty

Within the work of the OHCHR, ‘extreme poverty’ is dealt with as a human rights issue.

The Special Rapporteur on extreme poverty and human rights has in recent years increased his analysis of human rights issues arising in the context of increased digitisation and automation. His 2017 report to the General Assembly tackled the socio-economic challenges in an emerging world where automation and AI threaten traditional sources of income and analysed the promises and possible pitfalls of introducing a universal basic income. His General Assembly report in 2019 addressed worrying trends in connection with the digitisation of the welfare state. Moreover, in his 2022 report to the UNHRC on non-take-up of rights in the context of social protection, the Special Rapporteur highlighted, among other things, the benefits and considerable risks associated with automation of social protection processes.

Content policy

Geneva-based human rights organisations and mechanisms have consistently addressed content policy questions, in particular in the documents referred to under Freedom of Expression and Freedom of Peaceful Assembly and of Association. Other contexts where content policy plays an important role include Rights of the Child, Gender Rights Online, and Rights of Persons with Disabilities. Moreover, the use of digital technologies in the context of terrorism and violent extremism is closely associated with content policy considerations.

UN Human Rights, at the request of the UNHRC, prepared a compilation report in 2016, which explores, among other issues, aspects related to the prevention and countering of violent extremism online, and underscores that responses to violent extremism that are robustly built on human rights are more effective and sustainable.

Additional efforts were made in 2019 when the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism published a report where she examined the multifaceted impacts of counter-terrorism measures on civic space and the rights of civil society actors and human rights defenders, including measures taken to address vaguely defined terrorist and violent extremist content. In July 2020, she published a report discussing the human rights implications of the use of biometric data to identify terrorists and recommended safeguards that should be taken.

Interdisciplinary approaches

Collaboration within the UN system

UN Human Rights is leading a UN system-wide process to develop human rights due diligence (HRDD) guidance for digital technology, as requested by the Secretary-General’s Roadmap for Digital Cooperation and his Call to Action for Human Rights. The HRDD guidance in development pertains to the application of human rights due diligence and human rights impact assessment to the UN’s design, development, procurement, and use of digital technologies, and is expected to be completed by the end of 2022.

As part of the implementation of the Secretary-General’s Call to Action for Human Rights, UN Human Rights launched the UN Hub for Human Rights and Digital Technology, which provides a central repository of authoritative guidance from various UN human rights mechanisms on the application of human rights norms to the use and governance of digital technologies.

In addition, UN Human Rights is a member of the Legal Identity Agenda Task Force, which promotes solutions for the implementation of SDG target 16.9 (i.e. by 2030, provide legal identity for all, including birth registration). Within the task force, it leads work on exclusion and discrimination in the context of digitised identity systems.

Neurotechnology

Rapid advancements in neurotechnology and neuroscience, while holding promises of medical benefits and scientific breakthroughs, present a number of human rights and ethical challenges. Against this backdrop, UN Human Rights has been contributing significantly to an inter-agency process led by the Executive Office of the Secretary-General to develop a global roadmap for effective and inclusive governance of neurotechnology.

Secretary-General’s Report on Our Common Agenda

Since the adoption of A/RES/76/6 on Our Common Agenda in November 2021, the follow-up by the UN system has been underway. UN Human Rights is co-leading several proposals in collaboration with other entities, notably on the application of the human rights framework in the digital sphere, mitigation and prevention of internet shutdowns, and disinformation.

Smart Cities

“Making Cities Right for Young People” is a participatory research project, supported by the Botnar Foundation, which examines the impact of the digitalisation of cities on the enjoyment of human rights. It also examines strategies to ensure that “smartness” is measured not solely by technological advancements but by the realisation and promotion of inhabitants’ human rights and well-being, and explores ways to promote digital technologies for civic engagement, participation, and the public good, with a focus on meaningful youth participation in decision-making processes. Launched in 2023, this project will survey the current landscape and detail key human rights issues in urban digitalisation. Based on participatory research carried out in three geographically, socially, culturally, and politically diverse cities, it will produce a report with initial findings and develop a roadmap for future human-rights-based work on smart cities.

Migration

In September 2023, UN Human Rights published a study, conducted with the University of Essex, that analyses the far-reaching human rights implications of specific border technologies. It provides recommendations for states and stakeholders on how to take a human-rights-based approach in ensuring the use of digital technologies at borders aligns with international human rights law and standards. The study draws from a collective body of expertise, research, and evidence, as well as extensive interviews and collaborative meetings with experts.

Privacy and data protection

Challenges to the right to privacy in the digital age, such as surveillance, communications interception, and the increased use of data-intensive technologies, are among some of the issues covered by the activities of UN Human Rights. At the request of the UNGA and the UNHRC, the High Commissioner prepared four reports on the right to privacy in the digital age. The first report, presented in 2014, addressed the threat to human rights caused by surveillance by governments, in particular mass surveillance. The ensuing report, published in September 2018, identified key principles, standards, and best practices regarding the promotion and protection of the right to privacy. It outlined minimum standards for data privacy legal frameworks. In September 2021, the High Commissioner presented a ground-breaking report on AI and the right to privacy (A/HRC/48/31), in which she called for a ban on AI applications that are incompatible with international human rights law, and stressed the urgent need for a moratorium on the sale and use of AI systems that pose serious human rights risks until adequate safeguards are put in place. In September 2022, the High Commissioner presented a report focusing on the abuse of spyware by public authorities, the key role of encryption in ensuring the enjoyment of human rights in the digital age, and the widespread monitoring of public spaces.

The UNHRC also tackles online privacy and data protection. Resolutions on the promotion and protection of human rights on the internet have underlined the need to address security concerns on the internet in accordance with international human rights obligations to ensure the protection of all human rights online, including the right to privacy. The UNHRC has also adopted specific resolutions on the right to privacy in the digital age, addressing issues such as mass surveillance, AI, the responsibility of business enterprises, and the key role of the right to privacy as an enabler of other human rights. Resolutions on the safety of journalists have emphasised the importance of encryption and anonymity tools for journalists to freely exercise their work. Two resolutions on new and emerging technologies (2019 and 2021) have further broadened the lens, for example by asking for a report on the human rights implications of technical standard-setting processes.

The UNHRC has also mandated the Special Rapporteur on the right to privacy to address the issue of online privacy in its Resolution on the Right to Privacy in the Digital Age from 2015 (A/HRC/RES/28/16). To illustrate, the Special Rapporteur has addressed the question of privacy from the stance of surveillance in the digital age (A/HRC/34/60), which becomes particularly challenging in the context of cross-border data flows. More recently, specific attention has been given to the privacy of health data, which is produced in ever greater quantities in the age of digitalisation and requires the highest legal and ethical standards (A/HRC/40/63). In this vein, the Special Rapporteur examined data protection and surveillance in relation to COVID-19 and contact tracing in his 2020 preliminary report (A/75/147), followed in 2021 by a more definitive analysis of how pandemics can be managed with respect to the right to privacy (A/76/220).

CyberPeace Institute

Acronym: CyberPeace Institute

Established: 2019

Address: Campus Biotech Innovation Park, 15 avenue de Sécheron, 1202 Geneva, Switzerland

Website: https://cyberpeaceinstitute.org/

Stakeholder group: NGOs and associations

The CyberPeace Institute is an independent and neutral non-governmental organisation (NGO) that strives to reduce the frequency, impact, and scale of cyberattacks, to hold actors accountable for the harm they cause, and to assist vulnerable communities.

The Institute works in close collaboration with relevant partners to reduce the harm from cyberattacks on people’s lives worldwide and to provide assistance. By analysing cyberattacks, it exposes their societal impact and how international laws and norms are being violated, and advances responsible behaviour to enforce cyberpeace.

At the heart of the Institute’s efforts is the recognition that cyberspace is about people. We support providers of essential services to the most vulnerable members of society, such as NGOs and the healthcare sector, ultimately benefitting us all. Attacking them can have a devastating impact on beneficiaries and patients, putting their rights and even lives at risk.

To deliver on this mission, we rely on donations and the generosity of individuals, foundations, companies, and other supporters. This support enables us to assist and support vulnerable communities, including NGOs, to enhance their resilience to cyberattacks.

The Institute also provides evidence-based knowledge and fosters awareness of the impact of cyberattacks on people, giving a voice to victims and empowering them to highlight the harm caused by cyberattacks. It reminds state and non-state actors of the international laws and norms governing responsible behaviour in cyberspace, and advances the rule of law to reduce harm and ensure respect for the rights of people.

Digital activities

Created in 2019, the Institute assesses the impact of cyberattacks from a human perspective, focusing on the rights of people. Its analysis is grounded in evidence and the impact on human well-being: it tells the story of people, links it with the technical reality of cyberattacks, and assesses it against the violation of laws. The Institute advocates for an evidence-based, human-centric approach to the analysis of cyberattacks as essential to the process of redress, repair, and/or justice for victims. It works collaboratively in its research, analysis, assistance, mobilisation, and advocacy, engaging with vulnerable communities to understand their needs and providing them with free and trusted cybersecurity assistance.

The CyberPeace Institute:

  • assists NGOs and other vulnerable communities to prepare for and recover from cyberattacks.
  • investigates cyberattacks targeting vulnerable communities, analysing these attacks to provide alerts and support and for accountability.
  • advocates to advance the rule of law and respect for the rights of people.
  • anticipates threats to people associated with emerging and disruptive technologies.

Examples of operational activities:
  • Assisting humanitarian and other NGOs with free and trusted cybersecurity support.
  • Analysing cyberattacks and highlighting their impact on people and how they violate the rule of law.
  • Documenting violations of international laws and norms and advocating for strengthened legal protection in cyberspace.
  • Offering expertise and support to states and civil society in relation to responsible behaviour in cyberspace.
  • Foreseeing and navigating future trends and threats in cyberspace.

Digital policy issues

Critical infrastructure

Cyberattacks against critical infrastructure have been on the rise, from attacks against hospitals and vaccine supply chains to attacks on the energy sector. When such disruptions occur, access to basic services is at risk. It is therefore vital to increase capacity and resilience to cyberthreats in critical sectors, such as healthcare. The CyberPeace Institute urges stakeholders in diplomatic, policy, operational, and technical areas to increase their capacity and resilience to cyberthreats.

The Institute advocates for capacity building aimed at enabling states to identify and protect national critical infrastructure and to cooperatively safeguard its operation. Such efforts encompass capacity building, the implementation of norms of responsible behaviour, and confidence-building measures. In strengthening efforts to protect critical infrastructure, the Institute calls for the sharing of lessons learned between countries to assist those with less capacity and fewer capabilities.

NGOs in civilian-critical sectors, for example water, food, healthcare, energy, finance, and information, need support and expertise to help them strengthen their cybersecurity capabilities. While these NGOs provide critical services to communities and bridge areas not covered by public and private actors, they lack the resources to protect themselves from cybersecurity threats.

Examples of the Institute’s work in this regard:

  • Calls to governments to take immediate and decisive action to stop all cyberattacks on hospitals and healthcare and medical research facilities, as well as on medical personnel and international public health organisations.
  • Provision of assistance and capacity building to NGOs that may lack technical expertise and resources, as capacity building is essential for achieving cyber preparedness and resilience across sectors and fields.
  • Publication of the strategic analysis report Playing with Lives: Cyberattacks on Healthcare are Attacks on People, and launch of the Cyber Incident Tracer (CIT) #Health platform that bridges the current information gap about cyberattacks on healthcare and their impact on people. This is a valuable source of information for evidence-led operational, policy, and legal decision-makers.
  • Analysis and evaluation of cyberattacks and operations targeting critical infrastructure and civilian objects in the armed conflict between Ukraine and the Russian Federation, through the publicly accessible Cyber Attacks in Times of Conflict Platform #Ukraine and a two-part video series offering a visual representation of key findings, which are further developed in quarterly analytical reports.
  • An interactive platform, The CyberPeace Watch, which expands monitoring to other contexts, including other situations of armed conflict, and to the application of relevant laws and norms. This informs policy and legal processes and developments, the preparedness and protection of critical infrastructure, and cyber capacity building.
  • Participation in the INFINITY project to transform the traditional idea of criminal investigation and analysis. INFINITY received funding from the European Union’s Horizon 2020 programme. Its concept is based around four core research and technical innovations that, together, will provide a revolutionary approach to converting data into actionable intelligence.
  • Participation in the UnderServed project, an EU-funded initiative to address the lack of adequate cybersecurity measures for vulnerable sectors, including humanitarian, development, and peace non-governmental organisations (NGOs). The primary objective of the project is to establish a comprehensive platform for reporting and analysing cyberthreats, tailor-made for NGOs vulnerable to cyberattacks, which often lack the resources to effectively mitigate such threats.

Network security

NGOs play a critical role in ensuring the delivery of critical services, such as the provision of healthcare, access to food, micro-loans, information, and the protection of human rights.

Malicious actors are already targeting NGOs in an effort to get ransoms and exfiltrate data. Often these NGOs do not have the budget, know-how, or time to effectively secure their infrastructures and develop a robust incident response to manage and overcome sophisticated attacks.

With this in mind, the Institute launched its CyberPeace Builders programme in 2021, a unique network of corporate volunteers providing free pre- and post-incident assistance to NGOs supporting vulnerable populations.

This initiative brings support to NGOs in critical sectors at a level that is unequalled in terms of staff, tools, and capabilities. It assists NGOs with cybersecurity whether they work locally or globally, and supports them in crisis-affected areas across the globe.

Capacity development

The Institute believes that meaningful change can occur when a diversity of perspectives, sectors, and industries work together. To address the complex challenges related to ensuring cyberpeace, it works with a wide range of actors at the global level including governments, the private sector, civil society, academia, philanthropies, policymaking institutions, and other organisations. The Institute contributes by providing evidence-led knowledge, emphasising the need to integrate a genuine human-centric approach in both technical and policy-related projects and processes, and by highlighting the civil society perspective to support and amplify existing initiatives.

Training

The CyberPeace Institute provides comprehensive training for NGO boards and staff, foundations, and volunteers, designed to empower organisations with vital tools for safeguarding their missions.

We recently launched a Cyber School, in partnership with Microsoft, offering a unique, free eight-week virtual course for anyone interested in taking a first step into a new career path.

Interdisciplinary approaches

To contribute to closing the accountability gap in cyberspace, the Institute seeks to advance the role of international law and norms.

It reminds state and non-state actors of the international law and norms governing responsible behaviour in cyberspace, and contributes to advancing the rule of law to reduce harm and ensure respect for the rights of people.

Contribution to UN processes

  • In 2021–2022, the Institute contributed to and commented on various UN-led processes (notably the United Nations Group of Governmental Experts on Advancing responsible state behaviour in cyberspace in the context of international security (UN GGE) and the Working Group (WG) on the use of mercenaries as a means of violating human rights and impeding the exercise of the rights of peoples to self-determination).
  • Since its inception, the Institute has closely followed the work of the UN Open-Ended Working Group (UN OEWG) on developments in the field of information and telecommunications in the context of international security, advocating recognition of the healthcare sector as a critical infrastructure and raising concerns about the lack of commitment towards an actionable and genuine human-centric approach.
  • In the Open-Ended Working Group on security of and in the use of information and communications technologies 2021–2025 (OEWG II), the Institute set out three key action areas and related recommendations, and is contributing its expertise in relation to the protection of humanitarian and development organisations from cyberattacks.
  • The Institute issued a statement at the Ad Hoc Committee to Elaborate a Comprehensive International Convention on Countering the Use of Information and Communications Technologies for Criminal Purposes (Cybercrime Convention).
  • Moreover, the Institute sought to advance the Cyber Programme of Action (PoA) by offering recommendations concerning the range, organisation, and approaches for stakeholder participation.
  • Also, the Institute welcomed the call for civil society organisations to contribute to the Global Digital Compact and provided a set of recommendations.

Participation in international initiatives: The Paris Call Working Groups

The Paris Call for Trust and Security in Cyberspace is a multistakeholder initiative launched by the French government at the Paris Peace Forum in November 2018. The Call itself sets out nine principles promoting and ensuring the security of cyberspace and the safer use of information and communications technology (ICT).

At the World Economic Forum meeting in Davos, in May 2022, the CyberPeace Institute joined Access Now, the Office of the High Commissioner for Human Rights (OHCHR), Human Rights Watch (HRW), Amnesty International, the International Trade Union Confederation (ITUC), and Consumers International to call on decision-makers to take action and initiate a moratorium limiting the sale, transfer, and use of abusive spyware until people’s rights are safeguarded under international human rights law.

This is in addition to a call made in 2021, in which the Institute joined more than 100 civil society organisations calling for a global moratorium on the sale and transfer of surveillance technology until rigorous human rights safeguards are adopted to regulate such practices and guarantee that governments and non-state actors do not abuse these capabilities.

EU Processes

At the Institute, we evaluate best practices in implementing EU regulations, focusing on their evolution and development to ensure effective execution. Simultaneously, we analyse EU mechanisms such as the EU Cyber Diplomacy Toolbox, aimed at countering malicious cyber activities and bolstering resilience, while providing targeted observations and recommendations.

Conflict mediation

Digital technology plays an important role in conflict mediation and global peacebuilding. It can extend inclusion, allowing more women or people from marginalised groups to take part in or follow a mediation process. It can make mediation faster and more efficient and can allow mediators to draw on resources from around the world.

However, digital technology brings risks, too. It can increase polarisation, for example, and allow disinformation to spread to more people, more quickly. It can increase vulnerability to malicious actors, spying, and data breaches. These risks can undermine trust in the process.

Mediators work in low-trust, volatile contexts and do not always have the knowledge to assess the risks posed by digital technology. A new online platform helps raise awareness of those risks and offers training on how to deal with them. The Digital Risk Management E-Learning Platform for Mediators was created in 2021 by the CyberPeace Institute, CMI – Martti Ahtisaari Peace Foundation, and the Mediation Support Unit of the UN Department of Political and Peacebuilding Affairs (UNDPPA).

As part of its integration and engagement with the stakeholder ecosystem in Geneva, the Institute is a member of the Geneva Chamber of Commerce, Industry and Services (CCIG). Various academic collaborations are ongoing through participation in conferences, workshops, and lectures, namely with the Centre for Digital Trust (C4DT) at the Ecole Polytechnique Fédérale de Lausanne (EPFL), the University of Geneva (UNIGE), and the Graduate Institute (IHEID). In 2020, the Institute formed a strategic partnership with the SwissTrust Valley for Digital Transformation and Cybersecurity.

The Institute and its staff have received several awards for innovative and continuous efforts promoting cyberpeace, including the Geneva Centre for Security Policy (GCSP) second prize for Innovation in Global Security in 2020 and the Prix de l’Economie from the CCIG in 2021.

Social media channels

The Institute maintains a website providing alerts, blogs, articles, and publications on key issues related to its mission for cyberpeace, and shares video materials and discussion recordings on its YouTube channel.

The latest news and developments are shared via:

Facebook @CyberpeaceInstitute

Instagram @cyberpeaceinst

LinkedIn @cyberpeace-institute

X @CyberpeaceInst

Sign up for the monthly newsletter to receive updates about what’s happening at the Institute, as well as news about cyberpeace.
