Navigating the surveillance state for trans liberation
by Jean Linis-Dinco, PhD

The 2024 Paris Olympics was meant to be a celebration of athletic might, yet it became a political battleground where the politics of identity were as fiercely contested as the sports themselves. The Games showcased an unsettling moment when Algerian boxer Imane Khelif faced a torrent of bigotry for allegedly not appearing woman enough. The likes of JK Rowling, Logan Paul and tech libertarian Elon Musk were amongst the many who amplified the disinformation campaign against Khelif, which then spiralled into a public spectacle that not only cast doubt on Khelif's identity but also painted a vivid picture of the broader systemic issues at play. Khelif is not trans. Let us put that out there. Yet the furore at the Paris Olympics over her gender identity vis-à-vis her gender expression mirrors the disturbing ideologies that trans people have long endured at the hands of transvestigators, the very same people who intrusively scrutinise appearances in their witch hunt to "expose" the next trans individual. Even pop star Taylor Swift did not escape scrutiny, as people zoomed in on her ‘bulge’ under her navy blue bathers. It's called the mons pubis, people.


This is an example of how Visage Technologies classifies whether a face is feminine or masculine. Screenshot from https://visagetechnologies.com/gender-detection/

And this is precisely what Automated Gender Recognition (AGR) seeks to automate. AGR attempts to correlate physical attributes with gender identity, a premise that has already been challenged by contemporary research. An article published in the journal Nature argues strongly that anatomy does not definitively determine someone's gender. The author further highlights the complexity of sex and gender as spectrums shaped by a variety of biological, psychological, and cultural factors. Relying solely on physiological traits to define someone’s gender can therefore lead to inaccuracies and harm, as gender identity encompasses more than just visible or genetic characteristics. Another paper worth mentioning is that of Daphna Joel, who finds that human brains exhibit a "mosaic" of features, some more common in females than in males and vice versa. This undermines the notion that there are distinctly male or female brains, highlighting the complexity and variability of brain characteristics across genders.
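
To make that reductiveness concrete, here is a deliberately minimal, purely illustrative sketch of the logic such a system encodes. It is not Visage Technologies' actual pipeline or any vendor's real model; the feature names, weights, and threshold below are invented for illustration. The structural point is what matters: whatever face goes in, the classifier can only ever emit one of two labels.

```python
import math

# Hypothetical facial measurements an AGR system might extract
# (jaw width, brow prominence, ...). Real systems use learned
# embeddings, but the reductive structure is the same.
FEATURE_WEIGHTS = {
    "jaw_width": 1.4,          # weights invented for illustration
    "brow_prominence": 1.1,
    "lip_fullness": -0.9,
    "cheekbone_height": -0.7,
}
BIAS = 0.05

def agr_classify(features: dict[str, float]) -> tuple[str, float]:
    """Map facial measurements to one of exactly two labels.

    However ambiguous or non-conforming the face, this function has
    only two possible outputs: the binary assumption is baked in.
    """
    score = BIAS + sum(FEATURE_WEIGHTS[k] * v for k, v in features.items())
    p_masculine = 1.0 / (1.0 + math.exp(-score))   # logistic squashing
    label = "masculine" if p_masculine >= 0.5 else "feminine"
    return label, p_masculine

# A face that sits near the decision boundary is still forced into a bin.
ambiguous_face = {
    "jaw_width": 0.1,
    "brow_prominence": -0.05,
    "lip_fullness": 0.02,
    "cheekbone_height": 0.08,
}
print(agr_classify(ambiguous_face))  # e.g. ('masculine', 0.515) -- never "neither"
```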

By embedding these flawed assumptions into its algorithms, AGR technology institutionalises the discriminatory practices that sparked such controversy at the Olympics. Not only does it systematically enforce a flawed understanding of gender, it also leverages surface-level data to make profound decisions about people’s identities. This obsession has real-life repercussions: at best it perpetuates bias and exclusion, and at worst it costs lives. At airports, AGR’s potential to out transgender individuals could be especially harrowing, particularly for those living in countries where their existence can mean capital punishment. When travelling in the United States, for instance, travellers have their gender selected for them by TSA agents based on how they present. While that sounds workable in theory, it fails to accommodate individuals whose gender identity does not conform to traditional binary norms, as well as transgender individuals who have not undergone gender-confirming surgeries. If a pre-op trans woman, for instance, goes through the female screening setting, the resulting scan will show a square highlighting the groin area. This will then be followed by a pat-down by an agent of the same gender as the traveller. However, if the pat-down does not resolve the security concern to the satisfaction of the TSA agents, the traveller may be subjected to a more invasive search in a private room.

While TSA checkpoint procedures do not utilise AGR, the issues they reveal are emblematic of broader challenges that could escalate with technological advancement. The problems inherent in such subjective assessments provide a cautionary tale as we consider the future of security technologies, and with the rapid pace of technological change, the adoption of AGR is not far-fetched. This shift towards automation poses a real danger that these technologies could be appropriated by governments and individuals who use transgender people as scapegoats to foster fear or to justify discriminatory policies for political gain. We are beyond the issue of privacy here, and even past merely acknowledging risk. We are venturing into realms where the very existence of transgender people is at stake.


Outside airport security, the reach of AGR could extend into everyday spaces such as public restrooms. And this is not just speculative: we are already seeing AGR employed in various contexts, such as the Giggles for Girls app, the dating app L’App, and even a restaurant in Oslo that targets ads based on gender, showing men pizza and women salad. In areas with deeply conservative values, like the American Bible Belt, the deployment of AGR systems in these spaces to enforce gender norms is a real possibility. The city of Odessa, Texas, has already introduced a $10,000 bounty for reporting transgender individuals who use bathrooms that correspond to their gender identity. This illustrates a troubling trend in which AGR could be used to enforce discriminatory laws and stoke fear among transgender communities.

The application of AGR in public security systems, online platforms, and even everyday consumer technologies frames trans bodies as subjects of suspicion and scrutiny. This invasive oversight is driven by the marriage of the state and capital, which seeks to monitor and control societal norms, including rigid adherence to the gender binary. In this setting, trans people are perceived as deviations, bugs in the system if you will, and consequently treated as threats to social or public order, as seen in the arrests of transgender people in many countries, including Malaysia, Indonesia, India and the Philippines.


Transgender people, by their very existence, challenge these rigid gender norms and, by extension, the division of labour that underpins many economic and social policies. Capitalism relies heavily on the gender binary to sustain the nuclear family model, which in turn supports the reproduction of existing power structures. Angela Davis wrote a strong critique of this in her book ‘Women, Race and Class’. She highlights that Black women were subjected to a dual exploitation, both as labourers and as reproducers of an enslaved workforce, which was crucial to the perpetuation of the slave economy. This exploitation was not just a byproduct of slavery but a deliberate effort to uphold and benefit from racist and sexist economic structures. The implications of this history are vast and echo in the many ways modern capitalist societies continue to exploit bodies considered 'other', whether through racial, gender, or sexual discrimination. The surveillance and control of trans bodies through technologies like AGR are a continuation of this legacy, in which certain bodies are monitored and regulated more strictly to conform to social norms that benefit the capitalist system.


Surveillance becomes a tool to ‘rectify’ deviations or bugs within the system. It allows those in power to determine whose lives are deemed livable and whose are not, enforcing a normative standard from which deviation must be monitored and controlled. For trans people, this can mean the difference between visibility and erasure, between recognition and death. The utilisation of surveillance technologies not only strips individuals of their agency but also places them at an increased risk of violence and discrimination. When the state and societal institutions possess the tools to 'watch' and 'correct,' they wield a significant power that transforms surveillance from a simple security measure into a tool of social control.


We are past the point of reform. Surveillance, as it exists at the very depths of the capitalist inferno, is and always has been a tool for subjugating one class by another. Any reform to surveillance that appears to arise from below, even one that merely replaces the face of the oppressor, is still fundamentally rooted in power dynamics. As Paulo Freire poignantly noted, "When education is not liberating, the dream of the oppressed is to become the oppressor." No amount of diversifying the faces of those who operate the surveillance apparatus will alter its intrinsic function as an instrument of control. It is a tool designed not just to watch but to maintain and enforce the status quo, to keep existing power structures intact. Replacing the operators of capitalism with more palatable identities does not change the fundamental operation of the system itself, which is to reduce complex identities into manageable data points that can be controlled and manipulated. The answer to surveillance is not the absence of data on marginalised minorities. The answer to surveillance is its abolition.


And as we reflect on this reality today, on Transgender Day of Remembrance, we are compelled not only to honour the memory of those who have suffered and died under the weight of such oppressive mechanisms, but also to act upon these ideas. The time for idealism is over. Let this day serve as a call to action to build a more just and equitable society, one that truly honours the diverse tapestry of human experience and fiercely protects it from the corrosive effects of unwarranted surveillance. Remember, not one of us can be free until all of us are free.
