Behavioural Public Policy Blog

Reimagining Policing

Lori Foster, Professor of Psychology, North Carolina State University

What if we used science rather than intuition to reimagine policing? Emerging data, analytics, and behavioral science insights make this prospect entirely possible, suggesting a promising path forward. 

In June of 2020, the state of New York passed a historic police transparency bill (Ciccolini, 2020), which makes disciplinary records more readily available – not only to the public at large, but also to behavioral scientists and data scientists with the skills and tools to help reimagine and reform policing at scale. Such data could be supplemented by records of excellence, allowing for clearer insights into what “right” looks like, and what we want community police performance to entail going forward. Information like this enables a deep and actionable understanding of the qualities to prioritize in the recruitment, selection, training, and promotion of the police force of the future.

However, the data alone are not enough. Analytics and behavioral science are needed to help unlock insights and turn them into action. Part of this comes in the form of algorithms. Properly regulated, algorithms can increase the transparency, accountability, and effectiveness of selection and promotion processes in a way that is both scrutable and scalable (Kleinberg, Ludwig, Mullainathan, & Sunstein, 2020). This can lead to more rapid and widespread transformation than solutions that hinge on changing the hearts, minds, and biases of decision makers currently shaping the composition of law enforcement agencies nationwide (Ludwig & Sunstein, 2019). This is good news, considering implicit bias training’s less than stellar track record when it comes to producing meaningful behavioral change (e.g., Chang, Milkman, Gromet, Rebele, Massey, Duckworth, & Grant, 2019).

To understand how data and algorithms can lead to police reform, it’s important to know a bit about algorithms. Think of an algorithm as a recipe of sorts. In cooking, a recipe can be built to produce one of many different possible outcomes – for example a stew, a chili, a soup, or as Rupert Nacoste might say, a gumbo (Nacoste, 2010). Which ingredients you use and how you combine them determines the outcome you get. Sometimes, tweaking the ingredients to improve one criterion, like flavor, can adversely affect another criterion, like thickness. And sometimes you have to sacrifice a tad of the thickness you want in order to achieve the desired flavor. The trick is to experiment until you’ve reached acceptable levels of all the things that matter – flavor, thickness, texture, quantity, color, nutritional value, etc. This requires knowing up front the outcomes or criteria you’re aiming for.

Algorithms work in a similar way. First, we have to tell the algorithm what we’re trying to achieve – what criteria are important. When it comes to police performance, are we looking to limit absenteeism or minimize turnover? Are we looking for people who are good at making arrests or handling stress? Do we want officers who will refrain from misconduct, shootings, and verbal and physical abuse? How about police officers who excel at building rapport with members of the community? If we measure and track such outcomes, algorithms can be built to optimize for any of them, and more (Kleinberg, Ludwig, Mullainathan, & Sunstein, 2018). As we reimagine policing, we should think carefully about what we want to optimize for, how to validly measure those things, and how to make such data available for analysis. Legislation like police transparency bills can be helpful in this regard.

Several years ago, Aaron Chalfin and colleagues ran an experiment demonstrating the power of algorithms in the context of policing (Chalfin et al., 2016). They looked at whether nearly 2,000 officers selected by the Philadelphia Police Department and enrolled in academy classes from 1991-1998 had ever been involved in a shooting or accused of physical or verbal abuse. Then, they used data science to build an algorithm identifying the qualities statistically associated with misconduct. Afterwards, they ran a “what if” experiment, using the data to quantify what would have happened if the police department had used their algorithm to screen candidates. The conclusion? Had the algorithm been used, there would have been far fewer cases of police misconduct.
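The logic of such a retrospective “what if” experiment can be sketched in a few lines of code. The sketch below is purely illustrative: the candidate features, weights, and risk model are invented for the example and are not drawn from the Chalfin et al. study. It generates a synthetic cohort with known outcomes, scores each candidate with a simple risk model, screens out the riskiest 10%, and compares misconduct counts with and without screening.

```python
import random

random.seed(0)

# Hypothetical candidate records: each has screening features and a
# known historical outcome (1 = later involved in a misconduct case).
# Feature names and weights are illustrative assumptions only.
def make_candidate():
    prior_complaints = random.randint(0, 3)
    stress_score = random.random()  # 0 = handles stress poorly, 1 = well
    misconduct_risk = 0.05 + 0.10 * prior_complaints + 0.15 * (1 - stress_score)
    outcome = 1 if random.random() < misconduct_risk else 0
    return {"prior_complaints": prior_complaints,
            "stress_score": stress_score,
            "outcome": outcome}

cohort = [make_candidate() for _ in range(2000)]

# A toy risk model scoring candidates on the screening features.
def risk(candidate):
    return (0.10 * candidate["prior_complaints"]
            + 0.15 * (1 - candidate["stress_score"]))

# The "what if" experiment: screen out the riskiest 10% of candidates,
# then compare misconduct counts in the screened vs. unscreened cohort.
ranked = sorted(cohort, key=risk)
screened = ranked[:int(len(ranked) * 0.9)]

actual = sum(c["outcome"] for c in cohort)
counterfactual = sum(c["outcome"] for c in screened)
print("Misconduct cases without screening:", actual)
print("Misconduct cases with screening:   ", counterfactual)
```

The counterfactual count comes out lower because the screened-out candidates carry the highest predicted risk – the same comparison, run on real historical records rather than synthetic ones, is what allows researchers to quantify the misconduct an algorithmic screen could have prevented.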

It is useful to think about what to optimize for at the individual level of analysis as suggested above – that is, what kind of police officer do we want to recruit, select, retain, and promote? It is also interesting to think about what to optimize for at the collective agency level of analysis – that is, what mix of police officers is needed as we reimagine policing? One important question is whether race and sex should be explicitly considered in efforts to create law enforcement agencies with demographic profiles mirroring those of the communities they are charged with protecting (Berube & Holmes, 2016). More than thirty years ago, Sullivan (1986) offered a number of reasons why public and private employers might wish to adopt affirmative action policies – not as an atonement for sins of the past or even as a matter of demographic balancing for its own sake – but rather as a matter of enlightened self-interest. More recently, Kleinberg et al. (2018) put a finer point on this, stating “Having greater numbers of African-Americans in the police force could be important if it improves the functioning of the force, whose relationship to the relevant community might be better if it is not entirely or mostly white” (p. 123). This follows from Hong’s (2017) research in the United Kingdom showing that racially representative policing makes a positive difference.

Once the individual and collective criteria are clearly laid out, the next step is determining which ingredients or qualities are important for achieving the desired outcomes, and then recruiting and selecting for those qualities in appropriate measure. In theory, predictors of performance could include a wide range of ingredients – knowledge, skills, abilities, and other characteristics, such as social and emotional attributes like empathy, self-control, service orientation, resiliency, self-awareness, and adaptability. Behavioral science and data science can be combined to help identify and measure these attributes in a fair, valid way.

The Society for Industrial and Organizational Psychology argues that law enforcement lags behind other industries when it comes to hiring technology and suggests the need to leverage computerized assessments and methodological advances that reduce faking on self-report measures (SIOP, 2020a). For example, rather than asking people to self-rate how empathic or adaptable they are, we can ask candidates to complete validated, behavior-based computer exercises to assess such attributes objectively. We can analyze how candidates act in response to different behavioral scenarios and compare their responses to those of officers with a positive track record in the community. And instead of asking a recruiter or supervisor to use their judgment to identify top candidates for a job or promotion, we can use behavioral data to build algorithms that score and rank candidates objectively, according to the qualities that matter.

Reducing the role of human judgment at specific points along the way can level the playing field to attract and select a more diverse pool of talent. Humans have a long history of bias, both overt and implicit, when asked to make decisions about themselves and other people. In some cases, these biases and decisions have life-altering and life-threatening implications. For example, Hehman, Flake, and Calanchini (2018) examined police officers’ use of lethal force across the United States. Analyzing well over a million data points, the researchers found disproportionately more lethal force against black people in regions of the United States where white residents demonstrated higher rates of unconscious racial prejudices and stereotypes as measured by an implicit association test administered online.

In addition to influencing life and death decisions regarding the use of force, implicit bias also creeps into other areas, such as who gets hired and promoted. When it comes to policing, this affects us all.

I don’t mean to suggest that algorithms cannot produce biased, lousy results. They can. Just as human judgment can favor one group over another, so too can algorithms – even those designed with prosocial aims in mind. An internet search for “algorithmic bias” will unearth many examples of automation gone awry. For example, a recent application of algorithms in the UK’s education sector resulted in the lowering of many students’ grades, prompting public outrage and predictions of “battles to come regarding the use of technology in public services.” Clearly, algorithmic bias is not to be taken lightly. Proper laws, regulations, ethical principles, validation, and auditing are required. 

When screening candidates for the police force of the future, auditing for bias is not only helpful, it’s essential. This is one area where algorithms offer new possibilities. Relative to a human screener, algorithms can be more easily checked for bias. And if bias occurs in an algorithm, it can be detected and corrected. Because algorithms can be inspected, any inherent bias can be identified, quantified, and fixed prior to using the algorithm for decision making (Mullainathan, 2019). This sharply contrasts with the more opaque thought process of a human decision maker, where prejudices often lurk unknown, undetected, or simply unprovable.
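One way such an audit can work is sketched below. This is a minimal, illustrative example, not a complete audit procedure: the applicant data, group labels, and pass threshold are all invented, and the check applied is one common heuristic from US employment-selection practice (the “four-fifths rule”), under which a screen is flagged if any group’s selection rate falls below 80% of the highest group’s rate.

```python
import random

random.seed(1)

# Hypothetical screening results for two demographic groups.
# Group labels, scores, and the threshold are illustrative assumptions.
applicants = [{"group": random.choice(["A", "B"]),
               "score": random.random()} for _ in range(1000)]

THRESHOLD = 0.6  # assumed pass mark on the algorithm's score

def selection_rate(group):
    members = [a for a in applicants if a["group"] == group]
    selected = [a for a in members if a["score"] >= THRESHOLD]
    return len(selected) / len(members)

# Audit heuristic: compare each group's selection rate to the highest
# rate; a ratio below 0.8 signals potential adverse impact.
rates = {g: selection_rate(g) for g in ("A", "B")}
impact_ratio = min(rates.values()) / max(rates.values())
print("Selection rates:", rates)
print("Impact ratio:", round(impact_ratio, 3))
if impact_ratio < 0.8:
    print("Potential adverse impact: inspect and adjust before deployment")
```

Because the algorithm’s inputs, threshold, and outputs are all recorded, this check can be rerun on every candidate pool – something that is simply not possible with the undocumented judgment of a human screener.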

Algorithms are a good starting point, but not a panacea. Police reform requires a lot more. The Society for Industrial and Organizational Psychology (SIOP) has highlighted a number of scientifically based solutions to challenges facing police departments today, offering evidence-based guidance on aspects of police reform ranging from recruitment, selection, and leadership development to using behavioral science to understand decision-making under stress and organizational change readiness (SIOP, 2020b).

Notice that SIOP’s recommendations go beyond the point of hire. Police reform requires equipping the police force of the future not only with the right personnel, but also with the right environments. Behavioral scientists have repeatedly demonstrated that we are all capable of behavior we’d rather not admit, and that our environment influences how we behave (Ariely, 2012). Even small details in the environment can make a difference in encouraging people to be their best selves, including at work. Getting the environment right means leveraging theory and data to develop and apply behavioral science insights on inclusion, belongingness, teamwork, psychological safety, counterproductive work behavior, organizational citizenship behavior, integrity, and organizational culture, to name a few examples.

Behavioral science can be applied to other parts of the system as well, including the community. Consider, for instance, the “community action deck” created by a group of social and behavioral scientists in response to a 2015 Presidential Task Force on 21st Century Policing (Social and Behavioral Sciences Team, 2016). Meant to inspire people to begin the conversations that lead to concrete actions to address issues facing their communities, each card in the deck helps community members translate ideas into clear action steps to improve things like trust, oversight, and safety. For example, one of the cards offers steps for creating a report comparing the makeup of your local law enforcement to the makeup of your community to identify groups that are not well represented on the force. Far from being “pie in the sky,” each card offers an example of a community that has successfully implemented the action listed (Office of Evaluation Sciences, 2016).

In conclusion, Black Lives Matter has put police reform high on the policy agenda across countries. If we want to get this right, we need to incorporate data science and behavioral insights. They’re not a cure-all, but they have an important role to play. Applied properly, they can help strengthen law enforcement and community engagement in policing, to everyone’s benefit. 

Acknowledgements

I would like to thank Dan Ariely, Sendhil Mullainathan, Cass Sunstein, Frida Polli, Martine Cadet, and Michael Ganci for their input on an earlier version of this article.
