Implicit bias workshop

A couple of weeks ago I gave a workshop on implicit bias for a group of about 50 physicists at Heidelberg University. I’m not a social scientist or a psychologist, but a computer scientist, and it was the first time I had done something like this.

Having a bias means having a preference for a social group. Having an implicit bias means being unaware of such preferences, which are based on stereotypes rather than on our conscious knowledge. For me personally, learning about implicit bias helped me understand why people (including myself) act and react in certain ways, and it helped me adjust my own behaviour. Over the past few years, my interest in bias studies has kept growing. This interest was one of the things that motivated me to join the Homeward Bound program (see my previous post on why I joined Homeward Bound).

A friend of mine regularly participates in the “women in STEM” lunch at Heidelberg University. This month, they decided to invite male colleagues and were looking for a topic to focus on during the meeting. I suggested implicit bias, and they invited me to run the session.

I started with the old puzzle: A father and his daughter are in a minor car accident. Nothing serious, but the child needs to be taken to the hospital for an examination. At the hospital, the doctor examines the girl and says: “This girl needs a minor operation, nothing serious. But I cannot do it, because she is my daughter.” How is it possible that both the father and the doctor are the girl’s parents?

A couple of people who already knew the puzzle, as well as one person who claimed to have got it immediately, were asked to stay quiet. It took the rest about five minutes to figure it out. The usual alternative answers were proposed: a gay couple, a biological vs. a foster father, a transgender parent, the same person, time travel, etc. Then we talked about why people come up with these much more exotic scenarios instead of thinking of the mother.

After a short introduction to the topic of implicit bias, everyone took out their laptop and did the Harvard bias test on gender and career (the project offers more tests, focusing on race, LGBT topics, etc.). As expected, most participants showed a man-career preference of varying strength. Surprisingly, several people who considered themselves well aware of the bias phenomenon showed a strong man-career bias. Four participants showed a woman-career preference. One of them had a particular family situation that shaped their attitude towards gender. Two others were the only participants from India. This made me think of a couple of papers I had read while preparing for the workshop.

The first study explores gender differences in maths performance. It shows that in more gender-neutral countries, such as Iceland and Sweden, the maths gender gap disappears, whereas in countries with a stronger gender bias, e.g. Turkey, boys significantly outperform girls. Another study shows, however, that countries with traditionally less gender equality paradoxically have a larger percentage of women choosing STEM careers than more gender-neutral countries (see also a more specific study performed in Sweden). The proposed explanation is that women in countries with higher gender inequality seek the clearest possible path to financial independence, and that path often leads through a STEM career.

The last part of the workshop was an open discussion. My primary goal was to create a space where the participants could freely express their thoughts, especially if they were sceptical or critical about the topic or about the Harvard test. Bias and discrimination is a sensitive subject and is sometimes perceived negatively, especially by people belonging to privileged social groups. People in a privileged position often first encounter the topic in a conflict situation, which can make them develop a negative attitude towards it. That was something I really wanted to avoid.

A couple of participants were sceptical about the test. The main criticism concerned its training part, in which participants were trained to associate a) women with career and men with family, and b) women with family and men with career. The order of the training modules (first a and then b, or vice versa) was chosen randomly for each participant. Some people thought that this order had influenced their results. Interestingly, I also realized that some participants had not understood the concept of implicit bias: “I don’t believe that women shouldn’t make a career. Why did I get a strong bias? There must be something wrong with the test!”

Then somebody asked if there was a way to reduce one’s bias. “Absolutely,” I said, and went on to talk about how awareness can change unconscious behaviour. Each time we make a decision or experience an emotion, we can ask ourselves whether it was triggered by our biases. Once we become aware of the reasons for our actions and reactions, we can consciously change them over time. That is what I said. Big mistake. A second later, another participant pointed out that the Harvard test website states there is no scientific evidence that implicit bias can be reduced. I mumbled something about my previous statement being a personal opinion rather than a scientific claim, but my reputation was already damaged.

Right after the workshop I was nevertheless quite euphoric and convinced that everything had gone great. The workshop was entertaining and engaging, I had meaningful conversations during the break, and I met a few interesting people. The organizers of the “women in STEM” lunch seemed to love it. Later, however, I received some feedback in private and realized there were things I could have done better. Here is the list:

  • stick to the facts instead of expressing an opinion, unless someone explicitly asks for it
  • during the discussion, don’t comment on what people said, unless the comment is about a related fact or unless someone explicitly asks for the comment
  • during the test, don’t look at people’s screens
  • after the test, don’t show excitement about participants getting a neutral or non-typical (e.g. woman-career) preference
  • don’t go into the “quotas for women” topic in the discussion

Next time it will definitely go perfectly, right? 🙂
