Categories: Culture Wars / New PR in Age of Populism / Opinion research / Trust and reputations

17 December 2017


Give a big fat no to the concept of unconscious bias

Denise Young Smith, Apple’s first vice president of diversity and inclusion, went on the record arguing that there can be 12 white, blue-eyed, blond men in a room and they can still be a diverse group, because they will bring different life experiences and perspectives to the conversation. Declaring that diversity is the human experience, Smith said: ‘I get a little bit frustrated when diversity or the term diversity is tagged to the people of colour, or the women, or the LGBT.’

Smith, who had been with the company for 20 years, paid a high price for committing this thought crime against Apple’s diversity policies. She was forced to quit her job, officially by the end of this year, after less than a year in the role. Earlier in 2017, Google fired James Damore for the crime of criticising the firm’s diversity programmes and for questioning the efficacy of encouraging staff to attend classes on how to combat unconscious bias.

Damore’s criticisms were dismissed as expressions of prejudices rooted in his ‘unconscious mind’, while Smith was accused of expressing contentious ideas about diversity that Apple could not endorse, indulge or tolerate. Both firings shocked me, so I decided to take a close look at the underlying issues.

The implication of the concept of unconscious bias is that resistance to certain ideas is not rational, or a matter of conscience, or worthy of proper debate. Instead, it suggests that the audience’s viewpoint requires a psychological solution to a pathological condition. That makes it a particularly useful concept in situations in which campaigners, corporations and governments find persuasion, even when using their best arguments, ineffective.

Then there are really perverse manifestations of the unconscious bias debate. These reveal how it is being used by management to blame human psychology for the embarrassing results of their considered policy choices. For instance, in 2014 the Metropolitan Police Commissioner, Sir Bernard Hogan-Howe, said he was reviewing whether ‘unconscious bias’ among his officers was blighting rape cases.

Hogan-Howe claimed that his investigators might be unconsciously prone to doubt the accounts of rape victims. This could, he said, explain why victims were reluctant to come forward to make complaints. To remedy this situation the police created a ‘believe the victim’ culture, which was biased in favour of accusers at the expense of defendants. They hoped that this would put an end to the British police’s reputation for providing a sympathetic ear to sex abusers.

Hogan-Howe’s new culture resulted in the police and the Crown Prosecution Service orchestrating intense humiliations of innocent public figures. These included Harvey Proctor, Lord Bramall and the late Lord Brittan, all of whom were accused by unreliable and disreputable witnesses of committing heinous acts of child sexual abuse.

The just-published report by Lord Carlile of Berriew into the Church of England’s handling of claims against George Bell, the former bishop of Chichester, also highlights the potential of ‘unconscious bias’ to provide convenient excuses. The Church of England, like the Metropolitan Police, had already made a decision to cleanse its brand of its reputation for protecting sexual abusers at the expense of complainants.

So when George Bell’s reputation was in the frame, lack of evidence against him was not considered to be of great importance. What mattered was perception management of the response. That is why the Church of England ‘rushed to judgement’ and ‘wrongfully and unnecessarily damaged’ Bell’s previously highly esteemed legacy. Shamefully, the Church named him in a public apology, when it paid compensation to the self-proclaimed victim. Yet according to Carlile, the allegations were so weak that had the bishop been alive, there was a ‘low’ probability of a successful criminal prosecution. (See also: Anglican church ‘rushed to judgment’ in George Bell child abuse case.)

Now, after the spectacular collapse of the Liam Allan rape trial, Angela Rafferty, QC, chairman of the Criminal Bar Association, warns that the failings there were ‘not an isolated incident’ and that police and the CPS may be ‘unconsciously bias[ed]’ in favour of those who report sex offences. (See: Rape case scandal is just ‘tip of the iceberg’. Police and prosecutors may be biased, says QC.)

If we accept Rafferty’s explanation, this lets the criminal justice system completely off the hook. Contrariwise: the real sex abuse scandal was that the Metropolitan Police, Crown Prosecution Service and the Church of England knew exactly what they were doing.

Engineering diversity sows division

Despite the controversies, unconscious bias training has gone mainstream in the cause of promoting awareness of diversity issues. And there are increasing calls in influential circles for human resources and PR professionals to work harder to alter the hidden content of our inner minds.

We are being told that unconscious bias training and diversity initiatives increase creativity, improve the retention of the best talent and boost productivity. Supposedly, modern research reveals that unconscious bias influences hiring decisions and promotion prospects, and determines who gets handed the best projects or a platform to speak from.

According to Sarah Stimson, chief executive of the Taylor Bennett Foundation, in order to make our workplaces fit for diversity, we need to probe and manipulate the mysteries of our neurological make-up:

Research shows that unconscious bias leads to discrimination. Our minds are wired in such a way that we make judgements about people based on stereotypes, even if we don’t mean to. Our biases are influenced by our own experiences, our upbringing and our environment and it’s unlikely that you are fully aware of your own biases. It’s a shortcut to making decisions and it means that groups of people may be unfairly discriminated against. [Diagnose your unconscious bias in a bid to tackle diversity]

Keen advocates of this technique for altering our biologically pre-programmed responses to our social experience include Facebook, Coca-Cola and the CIA. Yet it worries me, as I hope it worries others, that popularising the notion that one’s colleagues, despite their claims to the contrary, are acting instinctively against whole groups of people merely fuels institutionalised paranoia, discord and distrust. It is a recipe for creating a tsunami of irreconcilable conflicts.

According to Christine Medina, a strategic planner at FCB Garfinkel, a New York-based subsidiary of an advertising giant, her biggest takeaway from an unconscious bias training session was just how prevalent bias is: ‘It’s everyone’.

Yet not everyone likes being told that they are innately hostile to their colleagues. So Stimson accepts that unconscious bias training is likely to meet stiff resistance in the workplace. To counter this hostility, she cites an article by Joelle Emerson entitled Don’t Give Up on Unconscious Bias Training — Make It Better. Emerson says we should arm ‘unconscious bias’ trainers with what she considers to be compelling arguments and strategies:

One concern with teaching people about unconscious bias, or talking about diversity efforts more broadly, is that majority group members can become defensive. Training can be designed to reduce defensiveness by explaining that we don’t have unconscious biases because we’re bad people – we have them because we are people.

But in my view, telling colleagues that they are ‘people’ and therefore blameless is most likely to make the training seem either pointless or disingenuous.

‘Science’ behind mind manipulation

We are being asked to accept that our employers have a right to meddle with our unconscious brains. So I decided that it might prove useful to review the research cited in support of this proposition. First I clicked on Stimson’s research link. It led to a paper, Unconscious Bias Theory in Employment Discrimination Litigation, by Audrey J. Lee, Senior Mediator at Boston Law Collaborative. There, Lee outlines Professor Mahzarin Banaji’s Implicit Association Test (IAT) for detecting instances of unconscious bias, explaining how:

Participants’ preferences are measured by their response times in pairing ‘positive’ words, such as ‘peace,’ and negative words, such as ‘war,’ with alternating white and black faces, with quicker association response times indicating an implicit preference for one association (e.g., white face and ‘wonderful’).

The test is premised on the idea that it takes a biased person longer to associate two items (a white or black face with a positive or negative word) that they view as being incompatible with their prejudices; the test’s creators argue that this time differential can be quantified to provide an objective assessment of people’s implicit attitudes.
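
To make that mechanic concrete, here is a minimal sketch in Python of how such a time-differential score might be computed. The response times are invented, and the mean-gap-over-pooled-standard-deviation formula is only a rough approximation of the scoring algorithms the IAT’s creators actually use; it illustrates the idea, not the official procedure.

```python
import statistics

def iat_style_score(compatible_ms, incompatible_ms):
    """Illustrative IAT-style measure: the gap between mean response times in
    the 'compatible' and 'incompatible' pairing blocks, scaled by the pooled
    standard deviation of all responses (a rough stand-in for the real scoring)."""
    gap = statistics.mean(incompatible_ms) - statistics.mean(compatible_ms)
    pooled_sd = statistics.stdev(compatible_ms + incompatible_ms)
    return gap / pooled_sd

# Invented response times (milliseconds) for a single participant.
compatible = [612, 648, 701, 655, 690, 630]    # pairings the test treats as congruent
incompatible = [742, 810, 765, 798, 755, 820]  # pairings the test treats as incongruent

# A positive score means the participant was slower on the 'incongruent' block.
print(round(iat_style_score(compatible, incompatible), 2))
```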

The billion-dollar question, which Lee fails to ask, is what constitutes an acceptable level of reliable scientific evidence? The reliability of a psychological test is expressed as a correlation coefficient, r, on a scale from 0 to 1, where the closer to zero a result is, the less scientific the test can claim to be. To score r = 1, a test has to be perfectly repeatable: it could be administered multiple times to the same group of people and produce exactly the same results, in the same order, every time.
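
As a rough illustration of what that scale means in practice, here is a short Python sketch, using made-up scores, that computes test-retest reliability as the correlation between two administrations of the same test to the same people: a near-identical rank order gives an r close to 1, while noisy rescoring drags r well down the scale.

```python
import statistics

def test_retest_reliability(first_run, second_run):
    """Test-retest reliability as the Pearson correlation between two
    administrations of the same instrument to the same participants.
    (statistics.correlation requires Python 3.10+.)"""
    return statistics.correlation(first_run, second_run)

# Invented scores for eight participants, each tested twice.
run_1        = [0.42, 0.10, 0.65, 0.33, 0.80, 0.21, 0.55, 0.47]
run_2_stable = [0.40, 0.12, 0.70, 0.30, 0.78, 0.25, 0.52, 0.49]  # rank order preserved: r is about .99
run_2_noisy  = [0.50, 0.35, 0.45, 0.60, 0.55, 0.15, 0.30, 0.70]  # rank order scrambled: r is about .4

print(round(test_retest_reliability(run_1, run_2_stable), 2))
print(round(test_retest_reliability(run_1, run_2_noisy), 2))
```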

As Jesse Singal, contributing writer to New York Magazine, points out, where exactly on that scale from 0 to 1 credible results need to fall depends a lot on context. Though, generally speaking, researchers are comfortable if a given instrument hits r = .8 or so. The problem with the IAT, however, is that even its creators admit the results are nowhere near r = .8:

The IAT’s architects have reported that overall, when you lump together the IAT’s many different varieties, from race to disability to gender, it has a test-retest reliability of about r = .55. By the normal standards of psychology, this puts these IATs well below the threshold of being useful in most practical, real-world settings. [Psychology’s Favorite Tool for Measuring Racism Isn’t Up to the Job. Almost two decades after its introduction, the implicit association test has failed to deliver on its lofty promises. By Jesse Singal]

Singal says that many highly qualified critics have accused Banaji and her close associate Anthony Greenwald of over-hyping the IAT’s potential in the media and political worlds. These strongly critical voices include Wharton School professor Philip Tetlock, known for studying why some people are better at making predictions than others. Similar concerns have been raised by Hart Blanton of the University of Connecticut (a methods expert, currently a visiting scholar at the University of Texas at Austin), Gregory Mitchell of the UVA School of Law, Fred Oswald of Rice University, Hal Arkes of Ohio State University, and James Jaccard of NYU.

Some studies, moreover, have produced evidence that contradicts Banaji’s and Greenwald’s assumptions. An analysis of one extensive set of data seemingly revealed that, ‘taken together … the IAT-effect is due to in-group/out-group membership and is not based on racial prejudice’. [See Does the Implicit Association Test (IAT) Really Measure Racial Prejudice? Probably Not.]

I’m left thinking three things. First, the notion of ‘unconscious bias’ as a pathology that infects everyone appeals to employers and PR pros who wish to avoid making anybody in particular accountable for discrimination, inequalities and bad policies. Second, its claim to conquer unconscious thoughts appeals to organisations that lack the confidence or arguments to rationally persuade people of their point of view. Third, ‘unconscious bias’ theories have Stalinist underpinnings: Stalin believed that the Soviet Union was a scientifically advanced society, in which the role of intellectuals was to ‘re-engineer the soul’ of the masses to think and act in the prescribed manner for the public good.
