21 February 2025
Digital platforms empower civic engagement and activism, but also pose serious risks, such as government surveillance, targeted cyberattacks, and sophisticated disinformation tactics. Ransomware attacks on healthcare systems, government networks, and infrastructure illustrate how cyber threats can disrupt essential services and national security. Disinformation campaigns, amplified by AI-generated deepfakes and bot-driven misinformation, have been used to shape political narratives, weaken trust in democratic institutions, and incite social divisions.
Our latest research brief ‘Behind the Lens: Exploring the Problematic Intersection of Surveillance, Cyber Targeting, and Disinformation’ examines the complex relationship between digital technologies and their misuse in surveillance, cyberattacks, and disinformation campaigns. This joint study, written by Erica Harper, Jonathan Andrew, Florence Foster, Joshua Niyo, Beatrice Meretti and Catherine Sturgess, details how the increasing reliance on digital systems has made them both primary targets and tools for controlling societies, with deep implications for human rights, human agency and global security.
Using global examples, the authors highlight the role of technology companies in regulating these threats and emphasize the need for a balanced approach that preserves digital freedoms while implementing safeguards. The research brief concludes by outlining policy recommendations for governments to enforce rights-based regulations, for private companies to enhance transparency and ethical oversight, and for civil society to advocate for digital rights.
This report is part of the Academy’s broader work related to new technologies, digitalization, and big data. Our research in this domain explores whether these new developments are compatible with existing rules and whether international human rights law and IHL continue to provide the level of protection they should.
Our research brief 'Neurotechnology - Integrating Human Rights in Regulation' examines the human rights challenges posed by the rapid development of neurotechnology.
The Geneva Academy’s latest publication explores how cities, municipalities, and regional authorities are becoming key players in global human rights governance.
This Human Rights Conversation will explore how cross-mandate cooperation can be enhanced, and how academia can play a more strategic and aligned role in supporting mandate holders.
This interactive, two-part workshop will explore how modern data-science tools – including machine learning and AI – can be leveraged to support the United Nations in promoting and protecting human rights.
This training course will explore the origin and evolution of the Universal Periodic Review (UPR), its functioning in Geneva, and how UPR recommendations are implemented at the national level.
Participants in this training course will gain practical insights into UN human rights mechanisms and their role in environmental protection, learn how to address the interplay between international human rights and environmental law, and explore environmental litigation paths.
This research aims to mainstream the right to a clean, healthy and sustainable environment, and the protection it affords, in the work of the UN Human Rights Council, its Special Procedures and Universal Periodic Review, as well as in the work of the UN General Assembly and UN treaty bodies.
This project addresses the human rights implications stemming from the development of neurotechnology for commercial, non-therapeutic ends, and is based on a partnership between the Geneva Academy, the Geneva University Neurocentre and the UN Human Rights Council Advisory Committee.