Removing Racist Cops: From Implicit Bias Training to Hiring the “Unbiased Brain”
By Kate Webb
[Image courtesy of Wikimedia]
Following the unjust murders of George Floyd and Breonna Taylor, the United States entered a period of civil unrest reminiscent of the civil rights movement in the 1950s and 60s. Mr. Floyd’s murder, at the hands of four police officers, ignited months of protests across the world. Community activists in nearly all major cities announced (or re-announced) demands for an overhaul of the American policing system. Although there have been calls for police reform after nearly every wrongful police killing, major transformation of criminal justice systems has usually been obstructed.
For decades, the data on police violence, though limited, have described policing institutions as riddled with systemic racism.1-3 Early published reports indicate that concerns about police violence disproportionately impacting marginalized communities began to rise in the 1960s4-6; however, a surge of public interest in recent years has drawn more research attention to this area.
In 2014, a study analyzing federal data found that young Black men were 21 times more likely to be killed by police than their White counterparts.7 More recently, researchers documented that police use of force is a leading cause of death for young men of color, with Black men facing a roughly 1 in 1,000 lifetime risk of being killed by police.8 As the public increasingly demands more just and unprejudiced policing, many police academies have adopted implicit bias training programs.9
What If Implicit Bias Training Is Not Enough?
Psychologists divide bias into two categories: implicit and explicit. While most people can recognize when someone is exhibiting explicit bias, such as when a co-worker uses a racial slur, implicit biases are much harder to spot.
Implicit bias refers to subconscious associations and underlying beliefs that may cause an individual to act or feel a certain way.10 The goal of implicit bias training is to expose these “blind spots.” In theory, the training helps people detect how structural racism and personal bias may have inadvertently seeped into their minds and may be influencing their behavior.
Stereotyping is a good example: if a police officer subconsciously believes that Black people are more likely to commit crime, they may disproportionately pull over Black drivers without ever intending to. Unfortunately, the effectiveness of implicit bias trainings remains under-studied, though the available evidence suggests that longer interventions produce better results.11
The problem with implicit bias training for police is that most courses are less than a day long and many departments don’t require annual participation9; as a result, the trainings likely provide little to no long-term benefit.
[Image courtesy of Built In Chicago]
Given the limited longitudinal research on the usefulness of implicit bias trainings in the first place, it’s unclear if these trainings are offering anything besides placation.
Neuroscience techniques, however, may offer a unique and missing dimension to generating solutions with better efficacy. What if we could evaluate candidates for public service in criminal justice by looking into their brains?
How to Evaluate the Biased Brain
With functional magnetic resonance imaging (fMRI), neuroscientists can monitor brain activity while individuals perform a variety of tasks. One way to explore which brain regions play a role in racial biases is to show participants faces of Black and White individuals12 and then characterize the evoked neural patterns.
Using machine learning algorithms, scientists can predict the race of the faces presented during the task from the participant’s brain activity alone.12,14 A set of neural structures has been described as the “prejudice network.”14 These brain regions show increased activity when a participant is responding in a biased manner, and greater activity in these regions is associated with both implicit and explicit racial bias.
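For readers curious what this kind of decoding analysis looks like in practice, below is a minimal sketch in Python using scikit-learn. The voxel patterns, trial labels, and signal strength are simulated placeholders rather than data or code from the cited studies, which involve extensive fMRI preprocessing before any classification step.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated face-viewing task: 200 trials, 500 voxels from a region of interest
n_trials, n_voxels = 200, 500
stim_race = rng.integers(0, 2, n_trials)   # 0 = White face shown, 1 = Black face shown

# Voxel patterns = noise + a weak category-dependent signal (by construction)
signal = np.outer(stim_race, rng.normal(0, 0.5, n_voxels))
X = rng.normal(0, 1, (n_trials, n_voxels)) + signal

# Linear classifier with 5-fold cross-validation: can brain activity alone
# predict which category of face was on the screen?
decoder = make_pipeline(StandardScaler(), LinearSVC(dual=False))
accuracy = cross_val_score(decoder, X, stim_race, cv=5).mean()
print(f"Cross-validated decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```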
Critically, the neural activation patterns associated with different types of bias are distinct: gender bias engages certain brain regions that racial bias does not, and vice versa.15 Neuroscientists have also identified regions in the frontal lobe, which are critical for higher-level processes like planning and for monitoring other brain regions, as areas that regulate and reduce prejudice.
Neuroimaging (and Then Hiring) the Unbiased Brain
Research on prejudice demonstrates that we can identify bias in the brain. Police officers must already undergo fingerprinting and background checks in addition to completing physical examinations and written tests.
What if we also required police officers to undergo fMRI scanning as one of the ways to determine the extent of their biases?
Police recruits could view faces of different races while their brain activity is recorded.14 Those who exhibit less activation in the “prejudice network” when viewing minority faces could be classified as less biased and offered a position within the department.
At first read, the idea of using brain activity to determine whether someone should be hired for a job probably seems ludicrous, but this proposal has already been explored in other professions, including the medical field.17
In a study with medical students and accredited surgeons, researchers used neuroimaging techniques to evaluate bimanual hand coordination (the ability to use the right and left hands together to accomplish a goal).17 A trained machine learning algorithm analyzing the brain data successfully classified participants by surgical skill level. In fact, the neuroimaging data predicted skill better than an actual hands-on surgery exercise did.17
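As a rough illustration of how such a comparison could be set up, the sketch below classifies simulated participants as trainees or accredited surgeons from brain-derived features and, separately, from a behavioral score. Everything here (the features, labels, scores, and the choice of a random forest classifier) is an assumption for illustration, not the cited study’s actual pipeline; the simulation is deliberately constructed so the brain features carry more signal than the noisy behavioral measure.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# 60 simulated participants, 40 fNIRS-style channel features
n_subjects, n_channels = 60, 40
skill = rng.integers(0, 2, n_subjects)           # 0 = trainee, 1 = accredited surgeon

# Brain features carry a clear skill-related signal (by construction)
X_brain = rng.normal(0, 1, (n_subjects, n_channels)) + skill[:, None] * 0.8

# The simulated hands-on exercise score is a noisier measure of the same skill
behavioral_score = skill + rng.normal(0, 1.5, n_subjects)

brain_acc = cross_val_score(RandomForestClassifier(random_state=0),
                            X_brain, skill, cv=5).mean()
behav_acc = cross_val_score(RandomForestClassifier(random_state=0),
                            behavioral_score.reshape(-1, 1), skill, cv=5).mean()

print(f"Classification from brain features:   {brain_acc:.2f}")
print(f"Classification from behavioral score: {behav_acc:.2f}")
```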
[Image courtesy of Wikimedia Commons]
Beyond surgery, a range of career-specific skills have been assessed by applying machine learning algorithms to brain data collected with functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG). From medical students18 to airplane pilots19-21, these neurotechnologies have been used to monitor training progress and capacity. At present, such applications remain in preliminary phases and have been tested primarily in individuals who are in training or mid-career.18-21
In professions where people’s lives are at stake, such as in the medical or criminal justice fields, perhaps the process of hiring should be data-driven. Considering potential police recruits are already subject to fingerprinting, background checks, and physical examinations, could they also be required to undergo brain scanning?
Are Bias Detection Algorithms… Biased?
The short answer is yes. Machine learning algorithms are incredibly flexible tools that learn to recognize patterns and predict outcomes, but they are not without limitations.
And no one is free of biases. In the words of renowned data scientist Cathy O’Neil: “algorithms are opinions embedded in code.”22 Indeed, a plethora of reports23 and films24 have signaled that even data-driven techniques are pervaded with biases.
Prejudice can be introduced to an algorithm at multiple levels, from scientists’ underlying beliefs to inherent biases in the dataset.21 For example, on average, Black Americans earn less than their White colleagues. If an algorithm were trained to decide how much someone should earn based on the average salaries of Americans and their demographics, the tool would unjustly suggest that Black Americans should earn less.
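A tiny simulation makes the point concrete. The salary figures, the single demographic flag, and the linear model below are invented for illustration only; the mechanism, not the numbers, is what matters.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

n = 1000
is_black = rng.integers(0, 2, n)            # simplified binary demographic flag
experience = rng.uniform(0, 20, n)          # years of experience

# Historical salaries that already encode an unjust racial pay gap
salary = 40_000 + 2_000 * experience - 8_000 * is_black + rng.normal(0, 5_000, n)

# Train a model to suggest salaries from experience + demographics
model = LinearRegression().fit(np.column_stack([experience, is_black]), salary)

# Two candidates with identical experience who differ only in the demographic flag
white_pred, black_pred = model.predict([[10, 0], [10, 1]])
print(f"Suggested salary (White candidate): ${white_pred:,.0f}")
print(f"Suggested salary (Black candidate): ${black_pred:,.0f}")
# The model faithfully reproduces the historical gap, recommending a lower
# salary for the equally qualified Black candidate.
```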
There are ways to identify and correct prejudice in machine learning.25,26,27 Prior to deploying any neurotech to address societal problems, scientists should carefully scrutinize the algorithm and the training data. One way to “debias” new technology is to ensure that it is being developed and tested by diverse teams. Individuals from different educational and demographic backgrounds bring unique and valuable viewpoints to innovating new technology and fixing existing issues.28
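One concrete form that scrutiny can take is a simple audit comparing a model’s error rates across demographic groups before deployment. The sketch below uses simulated group labels and predictions; a real audit would use held-out data and a broader set of fairness metrics.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 500
group = rng.integers(0, 2, n)      # two demographic groups, 0 and 1
y_true = rng.integers(0, 2, n)     # true outcomes

# Simulated model predictions that are systematically worse for group 1
flip = rng.random(n) < np.where(group == 1, 0.30, 0.10)
y_pred = np.where(flip, 1 - y_true, y_true)

for g in (0, 1):
    mask = group == g
    error_rate = np.mean(y_pred[mask] != y_true[mask])
    print(f"Group {g}: error rate = {error_rate:.2f}")
# A large gap in error rates between groups is a red flag that the model
# (or its training data) needs debiasing before it is used on anyone.
```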
“Debiasing” tools also involves accepting that models and data have limitations.28 And any serious proposal to use brain imaging in employment evaluations would require careful exploration of these neurotechnologies’ reliability and reproducibility beyond the laboratory.
For all these reasons, the application of neuroimaging to tackle racist policing must always be accompanied by other approaches.
Combatting Racism in Policing: A Multi-Faceted Approach
Other methods of tackling racism in policing include significant policy changes. Federal and state laws that better regulate the use of force and limit negligent procedures, such as the no-knock warrant that was used during the murder of Breonna Taylor, may also ensure that police behavior is fair and ethical.
[Image courtesy of Chicago Tribune]
A myriad of solutions, especially those that are community-based and community-led, will be required to identify and combat racist systems that remain deeply embedded in our society. Neuroimaging tools may be useful to deploy alongside the grassroots initiatives, policy changes, and educational trainings currently attempting to reduce discrimination in policing.
fMRI scans are expensive, but considering how much funding is dedicated to implicit bias training and how minimal its benefits appear to be, the scans may be well worth the money. In exploring the possibility of using neurotechnologies for this purpose, we may even be able to engineer cheaper, portable, and more widely accessible solutions.
fNIRS and EEG are two promising contenders, as they have already been used to assess career-specific skills in other fields.18-21 However, neuroscientists first need to determine whether data collected with these tools can be used to classify an individual’s racial biases. Preliminary testing must also evaluate whether machine learning algorithms can use these data to discriminate between different types of biases (e.g., gender versus racial bias).
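A minimal sketch of what such preliminary testing might look like is shown below: a cross-validated classifier trained on simulated sensor features, with a permutation test to check whether it separates bias types better than chance. The features, labels, and effect sizes are hypothetical placeholders, not results from any existing study.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import permutation_test_score

rng = np.random.default_rng(4)

# 90 simulated participants, 64 sensor-derived features (e.g., fNIRS/EEG channels)
n_participants, n_features = 90, 64
# 0 = low-bias pattern, 1 = racial-bias pattern, 2 = gender-bias pattern (simulated labels)
bias_type = rng.integers(0, 3, n_participants)

# Features are mostly noise, with a weak class-dependent signal in a few channels
X = rng.normal(0, 1, (n_participants, n_features))
X[:, :5] += bias_type[:, None] * 0.7

# Cross-validated accuracy plus a permutation test against chance (~33%)
score, _, p_value = permutation_test_score(
    LinearSVC(dual=False), X, bias_type, cv=5, n_permutations=200, random_state=0)
print(f"Accuracy = {score:.2f} (chance = 0.33), permutation p = {p_value:.3f}")
```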
The police are public servants called to “protect and serve”; as such, we should ensure they are treating all citizens fairly. Studies, and the lived experiences of Black, Indigenous, and People of Color, have clearly established that there is a problem with American policing. While implicit bias training may help, neuroscience techniques could provide a more definitive, evidence-based method for screening out candidates with racial biases before they are ever hired.
Interested in Evaluating Your Own Implicit Biases?
Take the Implicit Association Tests (Part of Harvard’s Project Implicit): https://implicit.harvard.edu/implicit/takeatest.html
References
- Piliavin, I., & Werthman, C. (1967). Gang members and the police. The Police: Six Sociological Essays. Edited by David J. Bordua. New York: John Wiley & Sons, 56-98.
- Terrill, W., & Reisig, M. D. (2003). Neighborhood context and police use of force. Journal of research in crime and delinquency, 40(3), 291-321.
- Jacobs, D., & O'Brien, R. M. (1998). The determinants of deadly force: A structural analysis of police violence. American journal of sociology, 103(4), 837-862.
- Johnson, K. E. (2004). Police-black community relations in postwar Philadelphia: Race and criminalization in urban social spaces, 1945-1960. The Journal of African American History, 89(2), 118-134.
- Capeci, D. J., Jr. (1977). The Harlem Riot of 1943. Philadelphia, PA, 1-30, 115-133.
- Drake, St. C., & Cayton, H. (1945). Black Metropolis: A Study of Negro Life in a Northern City. New York, 482-484.
- Meyer, M. W. (1980). Police shootings at minorities: The case of Los Angeles. The Annals of the American Academy of Political and Social Science, 452(1), 98-110.
- Edwards, F., Lee, H., & Esposito, M. (2019). Risk of being killed by police use of force in the United States by age, race–ethnicity, and sex. Proceedings of the National Academy of Sciences, 116(34), 16793-16798.
- CBS News Team. 2019, August 17. CBS News. Retrieved from: https://www.cbsnews.com/news/racial-bias-training-de-escalation-training-policing-in-america/
- Amodio, D. M., & Mendoza, S. A. (2010). Implicit intergroup bias: Cognitive, affective, and motivational underpinnings.
- Devine, P. G., Forscher, P. S., Austin, A. J., & Cox, W. T. (2012). Long-term reduction in implicit race bias: A prejudice habit-breaking intervention. Journal of experimental social psychology, 48(6), 1267–1278. https://doi.org/10.1016/j.jesp.2012.06.003
- Amodio, D. M. (2014). The neuroscience of prejudice and stereotyping. Nature Reviews Neuroscience, 15(10), 670-682.
- Worden, R.E., McLean, S.J., Engel, R.S., Cochran, H., Corsaro, N., Reynolds, D., Najdowski, C.J., & Isaza, G. (2020). The impacts of implicit bias awareness training in the NYPD. Retrieved from: https://www1.nyc.gov/assets/nypd/downloads/pdf/analysis_and_planning/impacts-of-implicit-bias-awareness-training-in-%20the-nypd.pdf
- Kaste, M. (2020, September 10). NYPD study: implicit bias training changes minds, not necessarily behavior. Retrieved from: https://www.npr.org/2020/09/10/909380525/nypd-study-implicit-bias-training-changes-minds-not-necessarily-behavior
- Brosch, T., Eyal Bar, D., & Phelps, E.A. (2013). Implicit Race Bias Decreases the Similarity of Neural Representations of Black and White Faces. Psychological Science 24 (2): 160–66.
- Knutson, K. M., Mah, L., Manly, C. F., & Grafman, J. (2007). Neural correlates of automatic beliefs about gender and race. Human brain mapping, 28(10), 915-930.
- Nemani, A., Yücel, M. A., Kruger, U., Gee, D. W., Cooper, C., Schwaitzberg, S. D., ... & Intes, X. (2018). Assessing bimanual motor skills with optical neuroimaging. Science advances, 4(10), eaat3807.
- Gao, Y., et al. Functional brain imaging reliably predicts bimanual motor skill performance in a standardized surgical task. IEEE Transactions on Biomedical Engineering. doi: 10.1109/TBME.2020.3014299.
- Verdière, K. J., Roy, R. N., & Dehais, F. (2018). Detecting Pilot's Engagement Using fNIRS Connectivity Features in an Automated vs. Manual Landing Scenario. Frontiers in human neuroscience, 12, 6. https://doi.org/10.3389/fnhum.2018.00006
- Gateau, T., Durantin, G., Lancelot, F., Scannella, S., & Dehais, F. (2015). Real-time state estimation in a flight simulator using fNIRS. PloS one, 10(3), e0121279.
- Binias, B., Myszor, D., & Cyran, K. A. (2018). A machine learning approach to the detection of pilot's reaction to unexpected events based on EEG signals. Computational Intelligence and Neuroscience, 2018, Article ID 2703513. https://doi.org/10.1155/2018/2703513
- O’Neil, C. (2017). The era of blind faith in big data must end. Ted Talk retrieved from: https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end?language=en
- Buranyi, S. (2017, August). Rise of racist robots – how AI is learning all of our worst impulses. Retrieved from: https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses
- Girish, D., (2020, Nov). ‘Coded Bias’ review: When the bots are racist. Retrieved from: https://www.nytimes.com/2020/11/11/movies/coded-bias-review.html
- Gruver, J. (2019, May 7). Racial wage gap for men. Retrieved from: https://www.payscale.com/data/racial-wage-gap-for-men
- Acharya, S. (2019, March). Tackling bias in machine learning. Retrieved from: https://blog.insightdatascience.com/tackling-discrimination-in-machine-learning-5c95fde95e95.
- Yapo, A., & Weiss, J. (2018, January). Ethical implications of bias in machine learning. In Proceedings of the 51st Hawaii International Conference on System Sciences.
- Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2019). A survey on bias and fairness in machine learning. arXiv preprint arXiv:1908.09635.
______________
Kate Webb is a doctoral student in the neuroscience program at the University of Wisconsin-Milwaukee. As a member of Dr. Christine Larson’s research team and the Milwaukee Trauma Outcomes Project, she leverages neuroimaging techniques to evaluate factors that may confer risk for, or resilience against, developing post-traumatic stress disorder (PTSD). Her current line of work investigates how structural inequities (i.e., social determinants of health) may influence the neurobiological mechanisms underlying PTSD.
Webb, K. (2021). Removing Racist Cops: From Implicit Bias Training to Hiring the “Unbiased Brain”. The Neuroethics Blog. Retrieved from http://www.theneuroethicsblog.com/2021/03/removing-racist-cops-from-implicit-bias.html