21 February 2025
Digital platforms empower civic engagement and activism, but also pose serious risks, such as government surveillance, targeted cyberattacks, and sophisticated disinformation tactics. Ransomware attacks on healthcare systems, government networks, and infrastructure illustrate how cyber threats can disrupt essential services and national security. Disinformation campaigns, amplified by AI-generated deepfakes and bot-driven misinformation, have been used to shape political narratives, weaken trust in democratic institutions, and incite social divisions.
Our latest research brief, ‘Behind the Lens: Exploring the Problematic Intersection of Surveillance, Cyber Targeting, and Disinformation’, examines the complex relationship between digital technologies and their misuse in surveillance, cyberattacks, and disinformation campaigns. This joint study, written by Erica Harper, Jonathan Andrew, Florence Foster, Joshua Niyo, Beatrice Meretti and Catherine Sturgess, details how the increasing reliance on digital systems has made them primary targets and tools for controlling societies, with deep implications for human rights, human agency and global security.
Using global examples, the authors highlight the role of technology companies in regulating these threats and emphasize the need for a balanced approach that preserves digital freedoms while implementing safeguards. The research brief concludes by outlining policy recommendations for governments to enforce rights-based regulations, for private companies to enhance transparency and ethical oversight, and for civil society to advocate for digital rights.
This report is part of the Academy’s broader work related to new technologies, digitalization, and big data. Our research in this domain explores whether these new developments are compatible with existing rules and whether international human rights law and IHL continue to provide the level of protection they should.
Our recent research brief, Neurodata: Navigating GDPR and AI Act Compliance in the Context of Neurotechnology, examines how effectively GDPR addresses the unique risks posed by neurodata.
Our research brief, Neurotechnology and Human Rights: An Audit of Risks, Regulatory Challenges, and Opportunities, examines the human rights implications of neurotechnology in both therapeutic and commercial applications.
Co-hosted with the ICRC, this event aims to enhance the capacity of academics to teach and research international humanitarian law, while also equipping policymakers with an in-depth understanding of ongoing legal debates.
The event, part of the AI for Good Summit 2025, will explore how AI tools can support faster data analysis, help uncover patterns in large datasets, and expand the reach of human rights work.
This training course, specifically designed for staff of city and regional governments, will explore how local and regional governments can engage with international human rights bodies and integrate their recommendations into concrete work at the local level.
Participants in this training course will gain practical insights into UN human rights mechanisms and their role in environmental protection, learn how to address the interplay between international human rights and environmental law, and explore environmental litigation paths.