Blogs

#AccelerateAction for a safer digital world

  • Published Mar 14, 2025

Last Saturday (8 March) was International Women’s Day. The theme for 2025 was #AccelerateAction, a campaign calling for a faster rate of progress toward gender parity. International Women’s Day urges organisations to implement effective strategies that address the systemic biases faced by women and move the dial on gender equality faster. These strategies include tackling gender-based violence and discrimination, leveraging technology to improve inclusion, and strengthening legal and policy reforms.

Against this backdrop, Ofcom’s publication of its latest draft guidance, ‘A Safer Life Online for Women and Girls’, at the end of February is timely. The draft is open for consultation until 23 May. It was developed under Section 54 of the Online Safety Act (2023), which requires Ofcom to provide industry guidance to digital service providers on how they should tackle harmful activities that ‘disproportionately affect women and girls’.

The Act, which is intended to protect children and adults online by placing new obligations on digital service providers to make them responsible for user safety, also requires Ofcom to produce Codes of Practice on how providers must act against illegal content and content harmful to children. These Codes of Practice are legally binding and come into force this year, and the Online Safety Act explicitly states that the guidance on creating a safer life online for women and girls may refer to provisions contained in these Codes of Practice. The guidance may also, under the Act, contain examples of best practice for assessing and mitigating risks of harm to women and girls.

 The draft Guidance does both, referring to the legally binding Codes of Practice and setting out best practice measures, which are voluntary. It does so by dividing its measures into ‘foundational steps’ (steps which providers can implement to fulfil their duties under the Codes of Practice – duties which service providers are legally required to fulfil) and ‘good practice steps’ (steps which include practical details of how providers can go beyond these obligations and implement good practice). These ‘good practice steps’ are voluntary, and service providers aren’t obligated to implement them. In this way, the Guidance, beyond reiterating the requirements of the Codes of Practice, is not legally binding. Ofcom does, though, explain that the regulator expects digital service providers to act in three areas: (1) taking responsibility, (2) preventing harm, and (3) supporting women and girls on their services.

How does this relate to International Women’s Day?

Ofcom’s evidence demonstrates that these steps are needed: its Online Experiences Tracker reports that women and teenage girls are “more likely than men and teenage boys to report being negatively impacted by the harms they experience online (24% and 29% vs. 11% and 19%, respectively).” Existing gender-based online harms have also been exacerbated by the increasing use of generative AI, with a helpline for victims reporting a more-than-100% increase in reports in 2023 compared with the previous year. Digital service providers need to act now.

This is especially important given recent events and a worrying shift away from commitments to diversity, equity, and inclusion. For example, Meta’s CEO, Mark Zuckerberg, recently called for more ‘masculine energy’ in the workplace, and the company has rolled back its content moderation provisions, with the Centre for Countering Digital Hate recently reporting that Meta has dropped its policies on “gender identity and gender”. This has the effect of allowing misogyny and online gender-based harms to spread.

Meta’s platforms are among the largest digital services in the world, and the proliferation of misogyny on platforms of this scale could increase harm to women by shifting socially accepted norms. Instead of accelerating action, the failure of service providers to combat these online harms sets us back. This is, at least in part, why the International Women’s Day campaign identifies enforcing and strengthening laws that promote gender equality and protect against gender-based violence as the top priority for accelerating progress. Much more needs to be done at the level of both legislation and policy to address gender-based violence and inequality. Does this guidance from Ofcom go far enough to deliver on this objective?

What are people saying about it?

Critics say it doesn’t. End Violence Against Women, a coalition of organisations working to end all forms of violence towards women, has said that while it welcomes Ofcom’s guidance, “it remains the case that Ofcom is hamstrung by the fact that the proposals are voluntary only, with no actual requirement on tech companies to put in place any of the recommended good practices.” Turning the guidance into a Code of Practice would make it legally binding and place additional requirements on companies to comply with what Ofcom has set out.

 Another means of putting additional legal weight behind at least one of the issues Ofcom is seeking to address through its Guidance would be the introduction of criminal offences for certain forms of online gender-based violence. For example, it’s been suggested that a stronger legal framework is needed to address non-consensual intimate image abuse, which involves intimate images or videos being produced, published, or reproduced online without consent and which is one of Ofcom’s areas of focus in its Guidance.

A new report from the Women and Equalities Committee, a House of Commons select committee, points out that while the Online Safety Act created criminal offences for individuals concerning this form of abuse and placed obligations on tech companies to remove such content from their services, additional measures are needed to fully address this form of online harm. The Committee draws attention to the fact that non-consensual intimate image abuse is “a deeply gendered threat” and that “in 2023, 71% of reports received by the Revenge Porn Helpline were made by women (where the client’s gender was known).” The Committee is clear that more must be done to combat this form of online gender-based harm.

The Committee notes that Ofcom’s enforcement action may be too slow to effectively combat the scale of the problem and mitigate harm to individuals; Ofcom is not, at this stage, able to respond to or adjudicate on individual complaints. The Committee argues that while the regulator’s enforcement powers are welcome, more provision must be made to enable individual victims to have abusive images of themselves removed from non-compliant websites.

The need for redress

 Part of the reason behind this issue is that the approach the Online Safety Act and Ofcom have taken is a preventative approach targeted at systems and processes rather than supporting individuals’ right to redress. This ‘systems and processes’ approach is intended to tackle the root causes of harm online by working toward systemic improvements that reduce risk at scale. It’s a way of capturing the interactions of various risk factors and how they influence one another to fully understand (and consequently mitigate) complex risks and harms.

While this is the right approach to addressing online harms at scale, protecting users’ rights, and ensuring that service providers rather than users bear the burden of responsibility for online harms, it doesn’t empower users to exercise their right to redress outside of the criminal justice system, through civil remedies. Redress in this context refers to seeking justice through the remedying of the harm suffered by victims of image-based abuse and online gender-based harms. The Women and Equalities Committee’s report strongly recommends that the Government introduce a civil process that would empower individuals to take action to have their non-consensual intimate images removed or blocked from a service.

While the introduction of such a process is currently outside the scope of Ofcom’s powers, Ofcom’s ‘A Safer Life Online for Women and Girls’ draft guidance contains several references to redress. For example, in its ‘good practice steps’, Ofcom suggests to digital service providers how they can “create safer experiences for women and girls, give their users more autonomy, and provide assurance that users can seek appropriate redress for any harm that does occur.”

 In a case study on good practice about ensuring that governance and accountability processes address online gender-based harm, Ofcom advocates for digital service providers’ use of external oversight mechanisms for the handling of appeals against content moderation decisions. Doing so would “introduce accountability and oversight for bias in decisions and policies” and, by providing feedback to service providers on their policies, enable better implementation. More consistent, less-biased content moderation should lead to improved systems and processes and reduced risk of harm to women and girls.

While engaging with an external appeals body following a content moderation decision is not the same as establishing a civil process for legal remedy, both measures aim to empower users, particularly those impacted by online gender-based harms, to exercise their right to redress. A gap in the regulatory framework remains. Ensuring that women and girls have access to a safer digital landscape requires more than voluntary commitments from digital service providers: it demands strengthened enforcement mechanisms, expanded legal protections, and access to meaningful redress. Taking these steps will empower women and girls and hold perpetrators accountable. To build a safer digital world, prioritising effective avenues for redress must remain central to efforts to #AccelerateAction.


To learn more about International Women’s Day, see here.

To read Ofcom’s A Safer Life Online for Women and Girls draft Guidance, see here. To contribute to Ofcom’s consultation on the draft Guidance, see here.


Want to learn more? We’re happy to help.