Invisible Chains: Sex Workers Lead a Critical Dialogue on AI & Human Rights at RightsCon 2025

In the bustling tech hub of Taipei, Taiwan, RightsCon 2025 emerged as a critical platform for addressing the most pressing digital human rights challenges of our time. Among the conference's most powerful and thought-provoking sessions was a workshop that brought into sharp focus the often-overlooked impacts of artificial intelligence technologies on sex workers' labour and human rights.

The ESWA session, "Invisible Chains: Exploring the Negative Impacts of AI on Sex Workers' Labour and Human Rights," took place on Tuesday, February 25th, drawing a full room of 58 attendees eager to understand the complex intersections of technology, labour rights, and human dignity.

The workshop featured an impressive panel of experts who brought both lived experience and professional and academic expertise to the discussion:

  • Kali Sudhra: Sex educator, performer, anti-racist activist and Chair of the Board of the European Sex Workers' Rights Alliance
  • Yigit Aydinalp: Senior Programme Officer at ESWA and PhD student
  • Sabrina Sanchez: Sex worker, trans rights activist, trade unionist, and Executive Director of ESWA

Real Stories, Real Impacts: Case Studies from the Field

What made this session particularly powerful was its grounding in real-life experiences. The presenters shared three compelling case studies that brought the abstract threats of AI into sharp relief.

Algorithmic Moderation: Reinforcing Existing Biases

The panel highlighted the story of Roxy, a Black sex worker and advocate living in Spain who has worked many years in the adult industry. Roxy uses her social media platforms to share educational content about sexual health, safety, and labour rights for sex workers. However, her content is disproportionately flagged, restricted, and removed by algorithmic content moderation systems.

This case study illuminated how AI-powered content moderation often embodies and reinforces societal biases, disproportionately targeting content created by sex workers, particularly those who are also racial minorities or LGBTQ+.

Deepfakes: A Threat to Livelihood and Safety

The panel also discussed the case of Angel, a cisgender woman and sex worker living in the United Kingdom. As a single mother of two and the sole provider for her household, Angel works part-time as a hotel receptionist while also engaging in sex work, providing outcall services to clients when her children are at school.

Recently, Angel discovered deepfake pornographic content created using her images without her consent. This not only violated her autonomy but threatened her custody of her children and her part-time conventional employment. The case highlighted how emerging AI technologies can have devastating consequences for sex workers who often carefully separate their professional and personal lives.

Biometric Age Verification: Exclusion and Discrimination

The third case study focused on Shirley, a Thai trans sex worker who migrated to the Netherlands. After obtaining a Dutch residence card, she began working in Amsterdam's red light district. Due to her genetic background, Shirley appears younger than her actual age, which has led to persistent problems with biometric age verification systems.

These systems, increasingly used on online platforms and at physical borders, frequently misidentify Shirley as underage, blocking her access to work platforms and creating legal complications. This case highlighted how seemingly "neutral" AI technologies can have discriminatory impacts based on racial and genetic characteristics.

Collaborative Problem-Solving: Breaking Down the Workshop

In a powerful demonstration of participatory education, the 58 attendees were divided into three working groups, each focusing on one of the case studies presented. The groups tackled these critical questions:

  1. What is the specific harm faced by sex workers in each case study?
  2. Are there other communities that experience similar harm from the same technology?
  3. Who is responsible for addressing this problem? (Sex workers? Technologists? Policymakers? Civil society?)
  4. What immediate and long-term strategies could mitigate this harm?

This structure allowed participants to move beyond passive listening to active engagement with the complex issues at hand. The format recognised that solutions must come from collaborative efforts between affected communities, technologists, and policymakers.

A Vision for a Better Future: When Society Listens to Sex Workers

The final portion of the workshop shifted from present challenges to future possibilities. The presenters invited participants to imagine a world where sex workers' expertise is centred in technological and policy development. This reimagining painted a compelling picture of what could be possible:

Technology That Protects Rather Than Exposes

In a world where sex workers' needs and insights drive technological development, we could see:

  • Consent-based identity verification systems that protect privacy while providing necessary security measures
  • Content moderation algorithms trained on datasets that don't reproduce biases against sex, sex workers, sexual education, LGBTQ+ content, or racialised people
  • Platform policies developed with direct input from sex worker-led organisations, creating systems that protect without excluding

Legal Frameworks That Empower Rather Than Criminalise

With sex workers at the policy table, legal systems could evolve to:

  • Recognise sex work as legitimate labour, bringing it under the protection of labour laws rather than criminal codes
  • Create specialised data protection frameworks that address the unique privacy needs of stigmatised professions
  • Develop nuanced approaches to content regulation that protect against exploitation without censoring consensual adult content or educational materials

A Tech Industry That Collaborates Rather Than Dictates

When tech companies truly listen to sex workers, the industry could:

  • Employ sex workers as consultants in the development of safety features and content policies
  • Create accessible appeals processes that recognise the power imbalance between platforms and users
  • Design technologies with the most marginalised users in mind, creating systems that work for everyone rather than just privileged majorities

As one workshop participant noted: "When you design for the margins, you create better systems for everyone. The solutions that protect sex workers from AI harms will ultimately create a safer digital world for all users."

Communities That Stand in Solidarity

Perhaps most powerfully, a future where society listens to sex workers would be one where:

  • Intersectional coalitions form between sex workers and other marginalised communities facing similar technological threats
  • Digital rights advocates recognise sex workers' rights as fundamental to broader human rights frameworks
  • The general public understands that technological harms to sex workers often foreshadow broader social threats

From Invisibility to Leadership: A Path Forward

As the workshop concluded, there was a palpable sense of possibility amid the challenges. The presenters were clear: sex workers are not merely victims of technological harm but visionaries for more equitable digital futures.

Sabrina Sanchez captured this sentiment in her closing remarks: "We are not asking to be saved. We are demanding to be heard. When sex workers lead the conversation about technology and rights, we create possibilities not just for ourselves but for everyone who values dignity, privacy, and autonomy in digital spaces."

The path from the present challenges to this hopeful future is neither simple nor guaranteed. It requires sustained commitment from technologists, policymakers, civil society, and the public. Yet the RightsCon workshop demonstrated that the expertise needed to navigate this path already exists within sex worker communities.

Conclusion: From Invisible Chains to Collective Liberation

The "Invisible Chains" workshop at RightsCon 2025 was more than an exposé of technological harms. It was a powerful demonstration of the transformative potential of centring marginalised expertise in our approach to technology and society.

As attendees filed out of the packed session room in Taipei, many carried with them not just a deeper understanding of AI's impacts on sex workers, but a broader vision of how technology could serve rather than oppress, unite rather than divide, and liberate rather than constrain.

In the words of Kali Sudhra: "The future of technology should be written by all of us, especially those who have been written out of the story until now. When sex workers speak about AI, we're not just fighting for our rights - we're imagining new possibilities for everyone's digital dignity."

This vision - of a world where technology is shaped by those most vulnerable to its harms - may be our best hope for ensuring that the digital future serves human flourishing rather than undermining it. And it begins with the simple but revolutionary act of listening to those who have been silenced for too long.

You can email our Senior Programme Officer Yigit Aydinalp at [email protected] for questions and other inquiries.

