Since the UN Security Council adopted the Women, Peace, and Security (WPS) Agenda in 2000, governments have worked to facilitate women’s participation in peace and security processes while protecting them from gender-based violence. But technology-facilitated gender-based violence (TFGBV) is undermining these advancements. An October report by UN Women focusing on TFGBV highlighted an intensification of online misogyny and hate speech targeting women.
Australia should address this global phenomenon with a new bill and lead international efforts to improve transparency and accountability on major digital platforms. Proposed legislation to fight disinformation presents an opportunity to do this. The country should also pursue further measures, such as promoting digital literacy and increasing women’s participation in policymaking on TFGBV.
AI-enabled TFGBV, including doctored image-sharing, disinformation, trolling and slander campaigns, affected 88 percent of women surveyed in UN Women’s report, with those in public-facing jobs systematically targeted. If no steps are taken, women may be deterred from meaningfully participating in public discussions and decision-making processes, a phenomenon known as the ‘chilling effect’. This is especially true for female politicians, journalists and human rights defenders, who often face politically motivated or coordinated attacks.
Domestically, Australia has seen a positive trend in female representation in politics, with participation in state and territory parliaments increasing from 22 percent in 2001 to 39 percent in 2022. However, in a global ranking comparing the percentage of women in national parliaments, Australia has fallen from 27th place to 57th over the last 25 years.
TFGBV poses a direct threat to women’s participation in Australian politics. A 2022 study by Gender Equity Victoria revealed the prevalence of violent rhetoric and material mostly directed at women and gender diverse people. Prominent female politicians such as Julia Gillard, Penny Wong, Sarah Hanson-Young and Mehreen Faruqi have faced relentless online abuse, often involving implied threats of offline physical harm.
The impact of such abuse is often compounded for women who are religious or culturally and linguistically diverse, discouraging women from marginalised and intersectional backgrounds from participating in democratic processes.
Such harassment is not only deeply personal but widely damaging to Australian democracy. When women face a greater risk of gendered online harassment, fewer will pursue public office or meaningfully engage in political discourse. TFGBV is effectively forcing women out of key decision-making spaces and processes, risking the regression of women’s rights and freedom of speech.
The Albanese government’s recently tabled Combatting Misinformation and Disinformation Bill would grant the Australian Communications and Media Authority (ACMA) new powers to regulate digital platforms, with the aim of addressing harmful content while safeguarding freedom of speech. This aligns with the WPS pillar of protecting the rights of women and girls.
The bill presents an opportunity to tackle TFGBV as part of a broader approach to digital safety, especially as the growth of female representation in government may be at stake. By empowering ACMA to clamp down on disinformation campaigns that disproportionately target women, the bill could provide a crucial pathway for women to engage in public life without fear of TFGBV.
Additionally, the government must build resilience against TFGBV by establishing digital literacy programs that address online safety. These programs should educate the public on identifying digital threats, navigating online harassment and reporting abuse. Integrating TFGBV prevention into educational curricula and workplace policies could equip women to protect themselves online and encourage safer, more secure digital environments.
To address TFGBV proactively, the government also needs to increase female representation in cybersecurity, policymaking and technology governance. It should invest in initiatives that offer scholarships, mentorship programs and career development for women in STEM, with a focus on digital security. This would empower more women to participate in developing cyber policies and gender mainstreaming strategies and ensure that the gendered dimensions of digital security are fully considered.
Since TFGBV also stems from AI algorithm bias on social media platforms, Australia should lead international efforts to establish transparency and accountability standards for AI applications, including through the UN Secretary-General’s Advisory Body on AI, in which Australia currently has no representation. These standards should require digital platforms to disclose their use of AI, detect harmful content and protect users’ data. Additionally, Australia should advocate for measures that ensure algorithms do not inadvertently target women with harmful content or amplify misogynistic narratives.
In leading these initiatives, Australia can build on its WPS National Action Plan 2021-2031, which serves as a framework for efforts to enhance women’s participation in peace and security processes.
Given the borderless nature of digital spaces, Australia needs to collaborate with like-minded partners to address TFGBV. Regional partnerships could involve information-sharing agreements, joint training on addressing TFGBV, and collaborative research on trends in AI-driven gendered harms and how to counter them. A united stance by Australia and its partners would bolster digital security for women, fostering a safer environment for women in public roles.