
Safety Showcase: Reimagine Gender In Technology


“We’re going to lose female and queer voices – and their amazing ideas – in the public service because we didn’t create an environment to keep them safe.” 

-Hera Hussain, Founder and CEO of Chayn


Those steeped in the work of technology-facilitated gender-based violence (TFGBV) know the statistics and the seemingly Sisyphean battle all too well. Nearly 4 in 10 self-identified women on the Internet have been personally affected by violence online, almost 9 in 10 women have witnessed online violence against another woman, and 6 in 10 overall have experienced some form of online harm. The technologies we all rely on each day to work, learn, express ourselves, and engage in civic life are designed and driven by those who may not be aware of the dangers that technology poses to women and girls in all their diversity. These dangers compound for those from lower-resourced environments, ethnic and cultural minorities, LGBTQI+ communities, and other marginalized groups.


Fortunately, there are those among us working every day to chart a new course for technology: one that is designed not only for safe expression, but which seeks to realize technology as a pathway to empowerment, prosperity, and joy. On 11 March 2025, the Safety Showcase: Reimagine Gender in Technology gathered innovative designers, entrepreneurs, and thought leaders to share a sample of what that world can look like.


The Safety Showcase is a joint programme initiated by UNFPA, the UK's Foreign, Commonwealth and Development Office, and Numun Fund, with the support of Australia's eSafety Commissioner, to amplify innovators who share a passion for safe and ethical technology that places gender equality, inclusion, and the lived experiences of women and girls at the heart of the design and development process.


True and lasting change requires shifts across the field – from education and digital literacy initiatives to early career opportunities, training for organizations and advocates, policy reform across companies and institutions, legislative change, exemplars of leadership, and of course, better-designed technology. That is why, in this first iteration of the Safety Showcase, partners issued an open call for tech products and tools that place gender at the centre of their design, and sought to share a snapshot of global solutions, potential in the field, and gaps that we can work together to close.


The open call ran for almost three weeks and drew more than 100 applicants from fifty countries and regions, demonstrating the first insight into the field: products and solutions for TFGBV are global and growing. Trends across the applications included:


  • A demonstrated need for a welcoming interface for those who have been subject to TFGBV and are seeking services, from taking abusive content offline to finding mental health support to pursuing legal recourse.


  • When legal support is offered, survivors rarely wish to pursue a criminal case against either the perpetrator or the platform. They instead prefer to have harmful content removed and to prioritize their own wellbeing.


  • Many organizations wish to use chatbots to scale their services and reach more people, and many were already using technology tools to provide advice and access to reporting.


  • Of the applications that were in the early stages of development and therefore not eligible for the Safety Showcase, most focused on training, developing taxonomies, and collecting survivor stories to share.


All organizations invited to the Safety Showcase demonstrated a deep commitment to participatory design, with the target audience involved in the development, feedback, and iteration of the products. Standout applications recognized the centrality of data privacy, either by not collecting or storing identifiable data or, when data is collected, by taking precautionary measures to secure it and protect those who may be at higher risk of data violations or other harms stemming from intimate partner violence (IPV). We also noted that applications engaging with AI never used a tool out of the box, but always modified and adapted it to a specific user base.


The Safety Showcase highlighted the current landscape of technologies designed with gender at the centre as a key mitigation strategy to prevent and respond to TFGBV. It offered a broad representation of the types of tools being used and the stakeholders being engaged – from capacity building for first responders to initiatives targeting university students.


Reflecting the evolving nature of the field, each tech product or tool featured is in a state of continual improvement. The products selected through the open call and identified below met the criteria outlined in our Expression of Interest and adhered to principles stemming from our research, including UNFPA's Guidance on the Safe and Ethical Use of Technology to Address Gender-based Violence and Harmful Practices; the eSafety Commissioner's guide on Technology, Gendered Violence and Safety by Design and its Safety by Design Principles; the Feminist Principles of the Internet; and standards including Orbits and The Feminist Tech Principles. They also adhered to the criteria of the UNFPA Assessment Tool for TFGBV Safety, and represented a breadth of geographies, audiences, and technological interventions.



Products in the Safety Showcase:


Zuzi AI

Country of Origin and Scope: South Africa with plans to scale

Founder: Leonora Tima

Company: GRIT

Zuzi AI is an innovative, multilingual chatbot dedicated to supporting survivors of gender-based violence (GBV) in South Africa. Developed by Gender Rights in Tech (GRIT), Zuzi provides crucial assistance by offering legal guidance, emotional support, and sexual and reproductive health (SRH) information. Designed with a feminist ethical AI approach, Zuzi ensures accessibility and cultural sensitivity, catering to diverse communities, including youth, sex workers, and LGBTQIA+ individuals. By integrating user feedback, Zuzi adapts to real-life challenges faced by survivors, making justice and care more accessible. Operating via WhatsApp, Facebook, and the web, Zuzi bridges the gap between survivors and legal resources, empowering users with knowledge and connections to legal professionals and support services. As a game-changer in tech-driven social justice, Zuzi AI represents the power of ethical AI to drive systemic change, ensuring survivors receive timely, empathetic, and relevant support.


BullyID

Country of Origin and Scope: Indonesia

Founder: Agita Pasaribu


Bullyid App, a pioneering tech charity in Indonesia, combats online gender-based violence, cyber harassment, and digital threats through an integrated platform offering confidential legal and mental health support. Utilizing AI-powered tools, professional counseling, legal aid, and reporting systems, it empowers victims—particularly women and children—to navigate the digital world safely. The charity advocates for robust online safety policies and corporate accountability. Having impacted more than 2 million individuals, Bullyid App trains educators and parents in cyber safety, enhances digital literacy, and prevents abuse. Collaborating with law enforcement, universities, and tech firms, it advances AI-driven safety measures and fights online exploitation. By blending legal, psychological, and technological expertise, Bullyid App strives to create safer, more inclusive online spaces free from harassment, making it a vital resource for digital well-being and protection.


Lizzy

Country of Origin and Scope: Germany with plans to scale to EU

Company: Frontline

Co-Founder: Babs Williams


Lizzy is an AI-powered domestic abuse risk assessment tool designed to help frontline services accurately assess the risk of repeat violence and determine the need for support. Lizzy was developed using nationally representative data from Germany and validated on UK-wide data. Currently, it is being tested in over 29 counselling services and shelters across Germany as a GDPR-compliant, user-friendly digital web app. Lizzy assesses multiple forms of abuse—including emotional, digital, and financial abuse—and provides insights into both ongoing and future risks. The output includes a detailed risk profile by abuse type, along with a map of the frequency and severity of reported abuse.


CampusPal

Country of Origin and Scope: Nigeria

Company: Gender Mobile 

Founder: Omowumi Ogunrotimi 


CampusPal is a tech-enabled tool strengthening both online and offline sexual violence prevention and response mechanisms in higher education institutions through survivor-centred reporting, access to policy resources and a support community, effective case management, and community-led bystander intervention. It is a product of extensive community engagement that embodies survivors' vision of justice. It enhances institutional accountability through case tracking, democratizes access to resources through a learning centre, promotes data autonomy through confidential reporting and customized privacy settings, aids impact measurement through climate surveys, and is defined by feminist principles of power diffusion, solidarity, and support. The CampusPal tool is part of an emerging movement toward higher education institutions where women and girls in all their diversity have safe, healthy, and equitable access to education, free from gender-based violence.


Reliabl

Country of Origin and Scope: USA with partners in the U.S., Nigeria, and Kenya including TechWorker Community Africa

Founders: Annie Brown of Reliabl and Mophat Okinyi of TechWorker Community Africa


Reliabl provides B2B software for data annotation and ML fine-tuning that helps customers build more accurate, less biased models with context-rich, user-driven insights. The company has developed the first AI classifier capable of labeling socially complex data, unlocking new levels of nuance and fairness in AI decision-making. Its collaborative annotation framework embeds intersectionality, ensuring AI models recognize race, class, disability, and gender in online harm prevention. By making AI participatory and user-driven, Reliabl reshapes the structures that produce bias rather than simply reducing harm, and actively disrupts oppression by embedding feminist principles into metadata and machine learning frameworks.


TechWorker Community Africa (TCA) is a pioneering organization dedicated to advocating for the rights and well-being of African tech workers, including content moderators, data labelers, and AI trainers. TCA seeks to bridge the gap between technology, labor rights, and social impact by providing education, legal support, and mental health resources. The organization empowers workers through training programs, awareness campaigns, and policy advocacy, ensuring that African tech labor is valued and protected in the global AI ecosystem. TCA also collaborates with unions, researchers, and policymakers to push for ethical labor standards in the tech industry. By fostering a strong, informed community, TCA is shaping a future where African tech workers are recognized as key contributors to AI development, rather than undervalued or exploited labor. Its mission is to create a sustainable, equitable digital economy that benefits workers and society as a whole.


Euki

Country of Origin and Scope: USA with users in Germany and Latin America

Founder: Ana Ramirez


#EukiApp puts privacy first. It's the only free mobile period tracker that connects users to evidence-based information on censored or stigmatized sexual and reproductive health experiences like abortion, contraception, menstruation, and sexuality. Euki also helps users safely search for the sexual and reproductive health care they want and need—all without collecting a single bit of personal data. Validated by privacy experts like the Mozilla Foundation and Vagina Privacy Network, Euki reduces risks from digital and over-the-shoulder surveillance through optional PIN protection, no phone number or email sign-up requirements, and local data storage (all data entered into the app is stored only on your device, so only you have access to it).

Co-created with a diverse User Advisory Team, Euki is designed to be inclusive of ALL sexual and reproductive health experiences. Because when we center the people who face the greatest barriers, we build a product that benefits everybody.


Shhor

Country of Origin and Scope: India

Founder: Aindriya Barua


Shhor AI is a SaaS API for online platforms, designed to detect and take tangible action against hate speech and doxxing across India's diverse linguistic landscape. Trained on an in-house dataset of 45,000 tagged data points, Shhor AI recognizes eight hate types: queerphobia, communalism, casteism, sexism, electoral hate, racism, ableism, and general violence. Shhor AI is unique in its ability to understand code-mixed language, which is common in the Indian context but overlooked by existing content moderation tools designed for the Global North. By combating hate speech across online platforms, Shhor AI helps ensure the safety of marginalized communities, and it uses art and social media to make AI and digital rights more accessible to the masses.



The challenges for technology that centres gender as a key risk mitigation strategy for TFGBV are immense. The Safety Showcase provided an opportunity not only to identify and highlight best practices, but also a space to share and understand the challenges and to consider collaborative opportunities to overcome them. During the development and execution of the open call, a range of key concerns were identified, including:


  • There is a systemic reticence to invest in women-led organizations or in gender-centred technology. There is an unfounded perception that women-led businesses and women-focused causes are riskier investments, and that any prioritization of women belongs within a CSR or social impact portfolio. However, with high-profile demonstrations of products that are thriving and capturing greater market share, we anticipate and will work toward greater growth in funding for gender-centred technology.


  • The growth of technology platforms fueled by the venture capital industry supports a mindset that demands exponential growth, and the timelines for delivery can be incompatible with the work required to support gender-centred technology, including AI tools. Nearly all open-source AI models are trained on data that is biased against women, gender-diverse people, and other population groups at the intersection of multiple forms of discrimination, requiring additional investment and training data to improve their accuracy. The meticulous work, on extended timelines, required to create AI models that do not rely on biased data is often not understood as a necessity by funders and investors.


  • When grant funding is secured, the timelines to obtain and distribute grants cannot compete with venture-driven funding. Funds sought through grants, corporate social responsibility programmes, or other sources designed for social impact (and therefore receptive to gender-centred technology) move slowly compared to venture capital, meaning design ideas can be outpaced by the market before a project is set to commence. Furthermore, financing the compute required to run an AI system is often misunderstood or unaccounted for by social impact organizations and funders.


  • The enthusiasm to integrate artificial intelligence (AI) into all aspects of technology pushes a solution onto problems it often does not fit. The drive to integrate AI into any product has meant great expenditure of time, funding, and environmental resources on AI's computational needs, even when neither AI nor any machine learning technique is required for a meaningful and sustainable intervention.


Going forward, we aim to expand our network and pursue facets of the field that need improvement: data, policies, legislation, leadership, and beyond. We invite those involved in this work to join as contributors and collaborators, and as always, we welcome opportunities to learn more from those active on the frontlines. 


Thank you to all who continue to build technology that is inclusive, safe and grounded in the needs of all users.



 
 

Let’s Work Together to Promote Safety for Women and Girls Online

© 2024 by Safety Showcase. Read our Privacy Policy.
