In this dialogue, we explore how online violence—intensified by AI—is impacting liberatory expression, and reflect on what feminist digital infrastructures could look like.

1. In what ways do mainstream digital platforms fail users facing misogynistic, casteist, and other kinds of backlash online?
NOOPUR: The reason I never say digital platforms are “failing” marginalised groups or survivors is that it would dignify the owners of these platforms too much, as if they care about honouring anything but their own quest for power and profit.
BISHAKHA: The real problem with platforms is that they refuse to be held accountable to their users, meaning more than 5 billion people online. Most of these platforms are corporate, and they think they are accountable only to their shareholders for the profits they make. So they fail us completely when we face problems, refusing to be the first port of redress. For instance, if I face hate speech or slurs on a social media platform, I want the platform to do something about it. I don't want to have to go to the cops for something like this. But the platform does nothing, or it has reporting mechanisms that we've known for the last 10 years don't work at all. And misogyny, casteism, queerphobia and transphobia are growing on these platforms at a rate that is making them a really toxic environment for many, many of us.
NOOPUR: Yes. And part of the problem is that people continue to worship tech billionaires behind proprietary media. Elon Musk has held an iconic status, shaping dominant tech culture. We now know that he wrote to Jeffrey Epstein, asking, “What day/night will be the wildest party on your island?” after publicly stating that he had refused Epstein's invitations. Expecting accountability from these mega-rich tech bros seems pretty futile. We have to find new ways to hack the system.
2. What structural gaps—legal, technological, and in platform-governance—make feminist or liberatory creators especially vulnerable to coordinated digital harassment?
NOOPUR: We are in the era of vectoralism. Scholar McKenzie Wark describes it as a system within which those who control digital platforms hold disproportionate power over hackers (by which she doesn’t mean computer hackers or tech bros, but anyone like us who creates content, new knowledge, or culture in digital spaces). The patriarchal monopolies are getting bigger, and the state protects their interests. That’s the big structural issue. Besides, structural power imbalances are reinscribed in digital spaces, which don’t exist outside of these systems. If anything, they provide easy access to viral mobs, exacerbating the violence against marginalised bodies.
BISHAKHA: This is a great question. I think the issue is that online harassment is not seen as segmented. For example, when one person harasses you continuously, that's one form of harassment. When a lynch mob or cyber troops descend on you online, with coordinated attacks by hundreds of accounts, that's a different kind of violence. And I'm not building a hierarchy between the two, because both feel awful to the person experiencing them. But what I do want to say is that the law, software developers, as well as platforms, could think of structural solutions, where the law defines certain kinds of violence differently and has different remedies for them. And platforms in particular are in a position where they can quickly gauge at the back end that a coordinated attack is happening. Frankly, we know that it wouldn't take much for them to put in place technological features to immediately block these kinds of attacks. So these are actually really structural failures.
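To make that back-end point concrete, here is a minimal sketch, in Python, of one cheap signal a platform could compute: the number of distinct accounts targeting a single user within a short sliding window. Everything here is hypothetical, the class name, the thresholds, the event shape; it describes no platform's actual system.

```python
from collections import defaultdict, deque
from time import time

# Hypothetical thresholds: flag when 50+ distinct accounts target
# the same user within a 10-minute sliding window.
WINDOW_SECONDS = 600
DISTINCT_SENDER_THRESHOLD = 50

class PileOnDetector:
    """Sliding-window count of distinct senders per target."""

    def __init__(self):
        # target_id -> deque of (timestamp, sender_id) events
        self.events = defaultdict(deque)

    def record_mention(self, target_id: str, sender_id: str,
                       now: float | None = None) -> bool:
        """Record one hostile mention or reply; return True if the
        target appears to be under a coordinated attack."""
        now = time() if now is None else now
        q = self.events[target_id]
        q.append((now, sender_id))
        # Evict events that have fallen out of the window.
        while q and now - q[0][0] > WINDOW_SECONDS:
            q.popleft()
        distinct_senders = {sender for _, sender in q}
        return len(distinct_senders) >= DISTINCT_SENDER_THRESHOLD
```

A real deployment would combine a signal like this with content classifiers and account-age features before acting, for instance by rate-limiting replies or letting the target temporarily lock their mentions. The point is only that the attack pattern Bishakha describes is detectable with very little machinery.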
3. What would a feminist or liberatory digital infrastructure look like in practice?
NOOPUR: A liberatory approach to creating digital infrastructure would mean decentralising it, stepping away from surveillance and extraction, and reimagining it as something that can be harnessed for community care and interconnectedness. These would be things too drab for vectoralists obsessed with colonising Mars and creating a post-human species.
At Smashboard, we may have been awarded the title of a “global breakthrough digital innovation with the potential of profound and lasting impact on digital society”, yet we face constant barriers. We are perceived as threatening because backing a platform that empowers survivors would mean admitting there is violence, which would mean there are perpetrators inflicting it. That would be an admission of guilt, an admission that patriarchal structures exist. They feel less threatened when we keep using their platforms, which most of us do, given the limited choices and resources.
Sexist tech bros wielding expertise and patriarchal investors wanting to blunt our political edge are everywhere. And tech isn’t cheap to build or maintain because it is meant for private ownership rather than the commons.
BISHAKHA: This question gives me so much joy, and it's so timely, because I think we are at a moment where imagination is turning out to be a big political tool that we really need to take seriously. And the feminist imaginary of digital spaces is vastly different from the white tech-bro billionaire imagination of digital spaces. At Point of View, we recently did consultations with groups in Mumbai, Bangalore, Goa and Delhi on what social media should look like, and we came up with some brilliant insights. One is that everybody said safety needs to be flagged right up front in digital infrastructure, so that you don't have to go looking for it everywhere. It should be safety by design, which also means privacy and consent are part of safety by design. And instead of signing the legal gobbledygook that constitutes the terms and conditions, you should be signing something where you agree to certain behaviours vis-a-vis other users. We also talked about imagining digital infrastructure, digital apps and sites being programmed for joy, for pleasure. The other very, very important thing, of course, is imagining an infrastructure where we have bodily integrity in phygital (physical and digital) spaces: regaining control over our bodies online, just as we, as feminists, want bodily integrity in physical spaces. We need more choice and options regarding what we may agree to and when we can revoke consent. This is especially important in the context of third-party data usage and data scraping for AI.
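One way to read the revocable-consent point in implementation terms is a consent ledger: every grant is explicit, scoped (say, to third-party sharing or AI training), and withdrawable, and access checks fail closed once consent is revoked. The sketch below is a hypothetical illustration, with all class and field names invented, not any existing platform's data model.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentGrant:
    """One explicit, scoped, revocable consent record."""
    user_id: str
    scope: str                       # e.g. "third_party_sharing", "ai_training"
    granted_at: datetime
    revoked_at: datetime | None = None

    def revoke(self) -> None:
        self.revoked_at = datetime.now(timezone.utc)

    def is_active(self) -> bool:
        return self.revoked_at is None

class ConsentLedger:
    """Append-only store; consent defaults to absent, never assumed."""

    def __init__(self):
        self._grants: list[ConsentGrant] = []

    def grant(self, user_id: str, scope: str) -> ConsentGrant:
        g = ConsentGrant(user_id, scope, datetime.now(timezone.utc))
        self._grants.append(g)
        return g

    def allows(self, user_id: str, scope: str) -> bool:
        # Fails closed: no active grant for this scope means no access.
        return any(
            g.user_id == user_id and g.scope == scope and g.is_active()
            for g in self._grants
        )
```

The design choice worth noting is the default: absence of a record means no consent, so revocation takes effect the moment the grant is marked withdrawn, rather than requiring the user to hunt down every downstream use.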
4. What does the current state of digital activism reveal about the limits of visibility as a strategy for social change?
NOOPUR: Whether in the digital space or offline, creating a hierarchy of individual leaders and followers is complicated because it opens up room for corruption. When we are enamoured with personal brands, we undermine the collective. This is not to say that initiative is unwelcome. It just becomes tricky when we rely on those chasing attention in a space where the algorithm offers rewards for performativity, outrage, and sensationalism much faster than in the offline world. Not to forget how systemic privilege gives some access to increased visibility.
BISHAKHA: What I'm seeing is that for a lot of people who are queer, in sex work, or porn performers, visibility is really a double-edged sword. On the one hand, of course, it validates and builds community, and it confirms one's presence in the world. It counts us in. But on the other hand, it sets us up as targets for the kind of hate that marginalised persons face online. So that's a tough one.
5. How are digital publics conditioned to treat feminist and liberatory content creators, especially women and folks from marginalised groups?
NOOPUR: We are all aligned with the view that the sexist and misogynist public is predictably oppressive, as are the platforms—and we are all battling with that. What is also becoming important to name is that feminists and radical activists in the digital space have not yet broken free from the culture of perfectionism, where punishment remains the organising principle of the collective body. Too often, we move at lightning speed to “like” and amplify content that aims to attack and hurt individuals who don’t meet our standards, stripping them of the safety, dignity, and belonging that every human being deserves. That’s not transformation, that’s wielding “power over” rather than “power with.”
BISHAKHA: Digital publics actually take their cues from the powers that be. And part of the problem is that the world is moving towards the right, and governments, politicians, corporations, billionaires, business people, etc., tend to subscribe to a worldview in which women and others are seen as subordinate to men. And when this worldview gets cemented, bought into, or, let's say, coded by digital platforms in their policies and practices, we essentially have a situation where the structures of power are influencing digital infrastructures to treat women, gender-diverse persons, and queer and trans persons as less than cis and heterosexual people, and less than men. This is actually a huge problem.
6. You mentioned, Noopur, that feminist spaces themselves can reproduce perfectionism and punishment. And this question is for both of you—what conditions make this policing so intense in digital environments?
NOOPUR: We are embodied very differently in conflicts outside the digital space, in the physical world. I like to look at this through the filter of kinesics and somatics, which look at communication between people through movement, gestures, and nervous-system cues. We might be more attuned, or even sensitive, to cues from the person we attack, cues that go missing when we are adding fuel to the fire in the comments. It's also easier to tune out of a conflict in the digital space: you block someone, you unfollow, you delete the app.
BISHAKHA: I would add a third ‘p’ to the mix: performativity. What happens in our digital spaces is that, because we are public all the time, we tend to unconsciously perform. We signal our virtue by intensifying our punitive political gaze, through cancel culture and many other means. So I think the key really lies in the publicness of digital spaces and in what that publicness makes us do, whether we want to do it or not.
7. How can feminist communities cultivate accountability without slipping into punitive or dehumanising practices?
NOOPUR: The culture of punishment has seeped deep inside us. When we label and ostracise people to assert our moral or ethical authority, we feel in control. But we need to see how we might be reproducing cycles of harm. An anti-carceral, abolitionist politics requires us to step away from this culture. Instead of being fixated on punishment, we can move toward generative forms of conflict within liberatory networks, where discord is seen as an opportunity to learn and grow with compassion for each other, rather than to blame and shame.
BISHAKHA: There is now a range of examples of restorative justice and other practices that we as feminists can look to as we try to tackle the scourge of online violence and harassment. I always think of South Africa's Truth and Reconciliation Commission. I also think that instead of always relying on criminal law and carcerality, we might want to look to civil law, to economic fines or penalties.
8. What emerging forms of digital violence are most concerning for you (deepfakes, sexualised intimidation, targeted disinformation)?
NOOPUR: All forms of breach of privacy and consent are concerning because they are extreme tools of psychological and physical harm. They cut deep. And usually, marginalised bodies are the worst hit. AI is at the service of those who have access to it, with time and resources, and that includes perpetrators of patriarchal violence. I find deepfakes particularly unsettling because they mess with reality in ways that are deeply intimidating. At the evolutionary level, our brains rely on an accurate perception of what is real versus fake for survival, so the deception of deepfakes robs us of agency. Moreover, if we can't control how the world sees us, it can trigger deep terror and a real risk of being kicked out of the community. That said, other forms of digital violence can also be very traumatic.
BISHAKHA: As we saw recently with Grok's undressing features, I think we're really entering an era where it's going to be very, very hard to tell what is real and what is fake, and where tools like AI are going to be used to create new forms of online gender-based violence. And it's going to be really difficult to keep up with these and deal with them.
9. What immediate platform or policy-level reforms could meaningfully reduce harm for feminist and liberatory creators?
NOOPUR: Digital spaces must be community-owned, and monopolies must be eradicated. We really have to keep building political momentum for something more radical than relying on proprietary apps or lawmakers to bring about change. That can’t wait anymore.
BISHAKHA: There are so many small technical features that could be introduced into existing platforms, and into new platforms being created, that build in safety from the outset. At a larger policy level, we need antitrust measures, breaking up monopolies, really trying to reduce the power of big tech. At the same time, it is possible for platforms being created by a different set of creators, including feminists, to have different imaginaries and to start building in some of these features. Features that come from lived experience are far easier to understand, and they almost create a new imaginary of platform accountability.
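As one illustration of the kind of small technical feature that builds safety in from the outset, consider safety-first defaults: a new account starts fully protected and the user relaxes settings, rather than shipping open defaults the user must discover and tighten. This is a minimal sketch under that assumption, with every name hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SafetySettings:
    """Safety by design: protective values are the defaults,
    so users opt *out* of protection rather than hunting for it."""
    dms_from_strangers: bool = False           # closed unless opened
    mentions_from_new_accounts: bool = False   # muted unless allowed
    media_from_unverified_senders: bool = False
    discoverable_by_phone_number: bool = False

def settings_for_new_user() -> SafetySettings:
    # Every new account starts fully protected; nothing to configure.
    return SafetySettings()
```

The design choice is simply where the burden sits: with defaults like these, the person at risk does nothing to be safe, and only deliberate action opens them up to strangers.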