ADM+S Early-Career Researchers Convene Gendered Online Harms Workshop

Author Dr Rosalie Gillett
Date 1 April 2022

In late 2021, a group of ADM+S early-career researchers convened a half-day virtual workshop on gendered online harms, supported by seed funding from the Australian Research Council Centre of Excellence for Automated Decision-Making and Society (ADM+S).

Led by Dr Rosalie Gillett (Queensland University of Technology (QUT)) and Dr Anjalee de Silva (University of Melbourne), the research team also included ADM+S researchers Dr Zahra Stardust, Dr Ariadna Matamoros-Fernández, Dr Aaron Snoswell, Louisa Bartolo, and Nick Dowse from the Centre’s QUT node, as well as Dr Lyndal Sleep from Central Queensland University.

The workshop brought together 20 stakeholders from civil society, industry, and academia to better understand how women experience gendered online harms, what digital platforms are doing to tackle these harms, and how platform interventions could be improved.

Dr Anjalee de Silva highlighted the impetus for the event, explaining: “For those of us in the research team, the workshop and broader project are born of a deep desire to see gendered online harms adequately and appropriately addressed, as well as a deep disappointment with what we see to be significant failures in this regard.”

Dr Mehreen Faruqi, Greens’ Senator for New South Wales, and a panel of civil society representatives launched the workshop by describing their experiences of gendered online harms through their work. Following this, the 20 participants were divided into four breakout rooms on Zoom to discuss gendered online harms and possible ‘solutions.’

In March 2022, Anjalee, Ariadna, and Rosalie presented preliminary findings from the event at the ADM+S ‘Data Harms’ workshop, a new program of work within the Centre led by Prof Megan Richardson.

Participants in the workshop described the problems associated with how platforms self-regulate their services. These discussions focused primarily on the failings of content moderation practices. In particular, participants observed that gendered online harms often result from related and reinforcing patterns of behaviour that women are subjected to online, rather than from discrete incidents or pieces of content. This means that platforms’ approach of moderating individual pieces of content is insufficient to disrupt the culture that enables systemic online harms against women.

The workshop participants also emphasised that the automated content moderation systems that platforms employ to detect and remove content struggle to understand context. Because of this, platforms can remove important critical and harmless content, while failing to effectively moderate harmful content.

In light of these findings, the research team argues that platforms need better frameworks for defining what constitutes harm, and better mechanisms for guaranteeing women’s safety online.

At the data harms workshop, Dr Rosalie Gillett questioned how platforms might reimagine the regulation of their services: “What if we investigated ways to rehabilitate those who have caused harm, transform communities, centre victim-survivors, and enable people to feel safe online? How can platforms use automation to create tools that cultivate a sense of community and community accountability that can mitigate harm? And what should platforms do instead where this won’t work?”

She continued: “These are crucial questions that, if answered, could help platforms cultivate safe and inclusive digital environments for all.”

In recognition of the significant contribution made by members of civil society organisations, most of the ADM+S seed funding was used to compensate the civil society participants for attending and contributing to the workshop. Civil society organisations with first-hand experience advocating for women’s safety online were essential to fulfilling the aims of the project.

The research team is now working to write up the findings for publication in an academic journal.