I want to talk about race conditions.
And I don't just mean race conditions in the software sense. I mean the issue of bias based on race and other factors in software, and how these human issues can manifest in anything from something as small as a logical error to something as big as an entire application adversely affecting groups of people.
See, race conditions (in the software sense) and racial bias in technology are both rooted in faulty assumptions. We assume that entities, whether they're humans, external services, or other points of communication, will behave in a certain, specific way. Quite often, our assumptions are wrong. And when they're wrong, the software no longer behaves optimally for its users.
It's well-understood that machine learning models are only as good as their data. When the data is biased, so are the models. Recently, there was a federal study of top facial recognition algorithms that found "empirical evidence of bias", leading to greater inaccuracies based on race, as well as gender and age.
This is known as algorithmic bias, and it should be frightening to you as someone who (I assume) works in technology. The bias is encoded into the software that you work with. You might say that you're not biased, but the applications and software that you're developing might very well be. And you know as well as I do that software in production doesn't change quickly, especially if we deem the bias not to be a priority, or not time-critical to the "acceptable" everyday operation of the software.
But hopefully, with these changing times, we're realizing that addressing bias, whether it's based on race, gender, age, or other factors that affect large groups of the humans who use our software, should be prioritized.
So what does this have to do with the software you write every day? Well, you and I both have what is called implicit bias:
Also known as implicit social cognition, implicit bias refers to the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner.
Our natural implicit bias is what causes us to code race conditions without even knowing it. Like I said before, we create race conditions by assuming how entities will behave with our software.
But we're only human! There is always a non-zero chance that we will miss something, whether it's unexpected orderings of events, various branches of nested if-statements, cases in switch-statements, or states that should be impossible. If you've ever written this comment in your code:
// This should never happen
...then you know exactly what I'm talking about. If we assume that events will always come in A, B, C order, then race conditions are very likely to occur whenever events come in a different order, such as B, C, A or A, C, B. And we need to account for all of those potential orderings.
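To make this concrete, here's a minimal TypeScript sketch. The event names and state names are hypothetical, not from any real app; the point is the contrast between a handler that silently assumes A, B, C ordering and one that explicitly accounts for every ordering:

```typescript
type AppEvent = "A" | "B" | "C";

// Naive handler: assumes events always arrive in A, B, C order.
// If B arrives first, we jump straight to "afterB", the
// "impossible" state we swore could never happen.
function naiveTransition(state: string, event: AppEvent): string {
  if (event === "A") return "afterA";
  if (event === "B") return "afterB";
  return "done";
}

// Defensive handler: each state explicitly lists the events it
// accepts; anything unexpected is ignored instead of assumed away.
const transitions: Record<string, Partial<Record<AppEvent, string>>> = {
  idle: { A: "afterA" },
  afterA: { B: "afterB" },
  afterB: { C: "done" },
  done: {},
};

function safeTransition(state: string, event: AppEvent): string {
  // Unexpected event in this state? Stay put.
  return transitions[state]?.[event] ?? state;
}
```

The naive version happily transitions on B even when A never happened; the defensive version treats out-of-order events as no-ops, so unexpected orderings can never land you in an impossible state.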
On a lighter note, let me illustrate race conditions with a funny example. Back before a global pandemic made cruises a bit less enjoyable and more worrisome, I took a cruise trip with my family. If you've ever been on a cruise, you know that on-board WiFi is 💸 expensive. But I needed to work (yeah yeah, I know, even on vacation), so I needed the WiFi.
The flow for getting WiFi was interesting. First, there was a login screen. After logging in, there was a "checking status" screen. Supposedly, it was making a network request to see if there was already a device registered. After a few seconds, it transitioned to a "connecting" screen where it connected to the WiFi network, and eventually, I was connected.
The problem is that this only worked for one device, and I wanted WiFi on both my phone and my laptop, without having to pay for two separate devices.
Let's think through that flow:
Obviously, no luck connecting to 2 devices on WiFi, since there is logic to check if a device is already connected. But then I thought... whoever programmed this is making the assumption that only one device would be checked at a time.
Are you thinking what I'm thinking? Let's try a different flow:
Lo and behold, that worked! The secret was to try connecting with both devices at almost exactly the same time. There is that "race condition period" between "checking status" and "connected" where no device is connected yet, so we can trick the logic and connect multiple devices at the same time.
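Here's a tiny TypeScript simulation of that check-then-act flow. The function names and timing are my own invention, not the actual WiFi portal's code; the awaited delay stands in for the "checking status" network round-trip, and that gap is exactly the race condition period:

```typescript
const delay = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

let registeredDevices = 0;

async function connect(device: string): Promise<boolean> {
  // Check: is some device already registered?
  const alreadyRegistered = registeredDevices > 0;
  await delay(50); // "checking status...": the race condition period
  if (alreadyRegistered) return false; // reject the second device
  // Act: register this device. By now, the check above may be stale.
  registeredDevices += 1;
  return true;
}

// One at a time, the check works: the second device is rejected.
// At (almost) the same time, both checks see zero registered
// devices, and both devices slip through the gap.
async function demo(): Promise<boolean[]> {
  registeredDevices = 0;
  return Promise.all([connect("phone"), connect("laptop")]);
}
```

The fix, from the portal's perspective, would be to make the check and the registration a single atomic step on the server (for example, an insert that fails if a device record already exists), so there's no window in between for a second device to sneak through.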
In this case, the race condition worked to the benefit of the user who exploited it (me), but in most cases, race conditions are encountered unintentionally and are generally detrimental to the user experience.
So how can we become better developers (much less better humans), avoid race conditions, and prevent implicit bias from leaking into software that affects so many other people?
The answer is simple but not easy: Strive to eliminate all implicit bias. This is a conscious decision, not one where you can just think "oh I won't be racist/sexist/etc." but one where you must actively work on eliminating those biases.
Let's face it. Racism is prevalent in software, from the code that's written to the people who write the code, and the people who hire those people, and on and on. I personally know many people who have faced racism in tech (myself included), and it's one of the many ways that bias can adversely affect large groups of people quickly. And especially now, it's extremely important for us to tackle.
By being consciously aware of your implicit bias, you can become a better programmer, and a better human.
Many open-source projects, like React, Formik, Redux, XState, MobX, and more, have been taking great initiative to bring awareness to fighting racism. See the showcase at blacklivesmatter.dev of other tech companies, projects, and individuals taking the initiative.
You can support the Equal Justice Initiative here.
I would also strongly recommend supporting Black Girls Code, whose mission is:
... to increase the number of women of color in the digital space by empowering girls of color ages 7 to 17 to become innovators in STEM fields, leaders in their communities, and builders of their own futures through exposure to computer science and technology.
Let's be better humans.