‘You will have to choose one of two options’ — Being Nonbinary in a Binary World

In Western contexts, a fixed binary notion of gender is deeply ingrained — something I experience personally on a nigh-daily basis. With recent legal changes in, e.g., Austria, the Netherlands and Germany, nonbinary genders are, at least in some areas within Europe, finally recognised within legal systems. This change makes evident just how much gender is encoded in multiple infrastructures, most prominently in technological forms across educational, commercial, medical and governmental contexts, which often mandate the disclosure of gender limited to binary options.

Curiously, the binary does not even hold in the most traditional of gendered titles (consider Mrs., Ms. and Mr.), and yet implementing titles that reflect non-binary genders appears to require an insurmountable amount of work, as I found out when challenging the infrastructures I encountered after receiving my own legal status as non-binary. Concretely, to illustrate the pervasiveness of (binary) gendered digital infrastructures and the impact they can have on non-binary individuals encountering them, I conducted an autoethnography. For more than a year, starting when I received my legal non-binary status, I documented the range of technological systems that did not allow me to register my gender correctly and/or appropriately, including cases from university systems, online shopping platforms, health insurance, and government services.

To understand how this comes to be, we first have to understand how databases comprise not just a type of text, but a form of speech. The way they articulate gender is subtle but constantly present. As the backbone of any kind of digitised data intake (e.g., through web or paper forms), when faced with gendered information, they (literally) encode gender in return. Even in cases where options for self-selected gender might be more multi-faceted on the surface, the backend might still rely on a binary gendered representation. Predominantly, this is achieved through the use of a Boolean, an exclusively binary data type that can take on one of two states: 1 (true) or 0 (false). In textbooks on databases (and in how databases were taught to me personally), gender is often modelled as a construct so fundamental to identifying a person that gendered data is forced along a presumed binary even in cases where this is entirely unnecessary. This is particularly problematic for non-binary people, who often have to decide which flavour of wrong address we/they will have to endure every single time we/they interact with the entity we/they just submitted our/their data to (e.g., a web shop).
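The pattern described above can be sketched in a few lines. This is a hypothetical illustration, not code from any real system; the names (`PersonRecord`, `is_female`, `encode_form_input`) are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch of the pattern described above: whatever the form
# collects, the backend stores gender in a Boolean field, so only two
# states are representable. All names here are invented for illustration.

@dataclass
class PersonRecord:
    name: str
    is_female: bool  # Boolean column: exactly two possible values

def encode_form_input(name: str, gender: str) -> PersonRecord:
    """Collapse the form's gender field into the binary backend type."""
    if gender == "female":
        return PersonRecord(name, is_female=True)
    if gender == "male":
        return PersonRecord(name, is_female=False)
    # A non-binary entry simply has no faithful Boolean representation:
    raise ValueError(f"gender {gender!r} cannot be stored in a Boolean")

record = encode_form_input("A. Example", "female")  # stored as is_female=True
try:
    encode_form_input("B. Example", "nonbinary")
except ValueError as err:
    print(err)  # the system rejects (or silently mis-encodes) the entry
```

Nothing about the form itself forces this design; the exclusion happens in the backend's choice of data type.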

Between August 2019 and January 2021, I documented about 55 cases in which I contacted institutions that did not allow me to record my gender in any kind of non-binary fashion. Most of these occurred before March 2020, when the global COVID-19 pandemic hit Europe — largely due to the emotional labour required on my end to process the sustained exclusion. In fact, in only nine cases was systemic change implemented, and five further cases allowed for an individual solution for me personally. That means that in 75% of the cases (41/55), I was either ignored, told that changes might be coming at some undetermined point in the future, referred to external constraints and the complexities of databases (even though recipients could see from my signature that I am a computer scientist), or told to be patient while the institution figured out how to handle my case. Subsequently, I became more and more exhausted, frustrated and angry over the course of the project, as every time I contacted a service I would not know whether I would be persistently and deliberately misgendered, dismissed as irrelevant, or acknowledged as a human being (all of which happened).

However, these infrastructures are not just relevant for acknowledging non-binary people in our data; failing to do so has material consequences that fundamentally exclude us/them from core services. For example, since 2019 I have not been able to buy any insurance, as all the risk models appear to be calculated exclusively along binary genders — meaning that certain risks are now my own to cover. In another context, a train company suggested that I use single cabins for night trains (which cost three times as much), as they only provide shared cabins for binary genders. Systematically failing to account for non-binary people in our infrastructures hence has the material effect that being recognised in one's gender becomes a privilege, one that carries the risk of significant financial consequences.

Changing our database designs and the way we model a person is just the first step here, albeit a fundamental one — starting with not requiring essentially private, individual and personal information across so many instances of digitised interaction. However, what my work additionally shows is that computing, and especially a critical analysis of its normativising and excluding effects, requires marginalised people to research their own interactions with such technologies. Instead of focusing on notions of generalisation, I encourage us to consider the particular and the specific. Instead of discarding the outliers, we need to understand how they ended up where they are and how such exclusions might operate on a scale larger than a given study. Otherwise we risk narrowing the already slim margins that define our encoded norms.
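As a minimal sketch of what that first step in database design could look like (the names here are invented for illustration), gender can be modelled as optional, self-described text rather than a mandatory Boolean, and simply left uncollected where it is not needed:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: gender as optional, self-described text.
# It is stored only when a person chooses to provide it, exactly as
# they entered it, and nothing downstream assumes a binary.

@dataclass
class Person:
    name: str
    gender: Optional[str] = None  # None = not collected / not disclosed

a = Person("A. Example")                      # gender never requested
b = Person("B. Example", gender="nonbinary")  # stored verbatim
```

The design choice is twofold: the field is free text, so no fixed option list constrains how a person describes themselves, and it defaults to absent, so systems that have no genuine need for gendered data never ask for it.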