Whenever we conclude (and expect someone to change a behavior) by asking, “Do you understand?” we are exhibiting an education bias:
The belief that we can change people’s behaviors through more education.
Another symptom of education bias is the belief that more information will make a difference, especially if it’s currently unknown. However, as Patrick Spenner and Karen Freeman, managing directors at Corporate Executive Board, wrote in their article, “To Keep Your Customers, Keep It Simple” (Harvard Business Review, May 2012 edition):
The marketer’s goal is to help customers feel confident about their choice. Just providing more information doesn’t help.
In other words, we need to tap into the feeling side of an interaction, not just the thinking side. While this seems intuitive, we often sacrifice it for beliefs such as “knowledge is power,” which merely reinforces the idea that more information is better (i.e., more knowledge means more power). In reality, it’s how we process and implement knowledge that generates power.
Moreover, as the articles “Too Much Information” (The Economist, July 2, 2011 edition) and “You Choose – The Tyranny of Choice” (The Economist, December 18, 2010 edition) discuss: at some point more information makes us powerless. This is why Tony Hey in his article, “The Big Idea: The Next Scientific Revolution” (Harvard Business Review, November 2010 edition), goes even further to declare the processing of information the next scientific revolution.
We can offset our education bias by remembering the two aspects of interpersonal relationships. It’s not just about “Do you understand?” or more information (thinking aspect [red]), but how people feel about their decisions or changes (feeling aspect [blue]). Change is primarily about tapping into the feeling, emotional and intuitive (blue) aspects of people and not the aspects promoted by an education bias (red).
Nowhere is the revisiting of our unconscious urges taken more seriously than in retailing. The Economist article “Retail Therapy,” appearing in the December 17, 2011 edition, gives a great historical account of the rise and fall . . . and rise again of the application of Freud in business, which Ernest Dichter is noted for introducing. As the article asserts:
Every week seems to yield a new discovery about how bad people are at making decisions. Humans, it turns out, are impressionable, emotional and irrational.
Increasingly, researchers are finding Dichter’s assessment that “most people have no idea why they buy things” to be correct.
Of course, “Sigmund Freud argued that people are governed by irrational, unconscious urges over a century ago.” However, as we saw earlier, it took science almost a hundred years to acknowledge that the subconscious existed. Meanwhile, “businesses were recognizing the limits of quantitative studies . . . which offered little genuine insight into how customers behaved.” Said more directly, you can’t rely on customers to tell you what they might buy.
The failures of online dating showed this truth, as did research into people’s internet surfing habits. The Atlantic’s article, “Learning to Love the (Shallow, Divisive, Unreliable) New Media,” which appeared in its April 2011 issue, demonstrated that it’s “not what [people] say they want, nor what they ‘should’ want, but what they choose when they have a chance.”
If this applies to purchases, it also applies to all decisions. Names can affect decisions about scientific grants, and information that judges know is wrong can affect their decisions. So, if people don’t behave and choose as they said they would, we have no one to blame but ourselves for not looking deeper into the real emotions powering us.
When I’ve written about the illusion of free will, I’ve focused on the advancement of technology and research methodologies to uncover subconscious thought patterns. However, these advancements are also discovering a connection between chemical reactions and some of our emotions.
In the September 24, 2011 issue of The Economist, the article “Rogue Hormones” reports on the research of John Coates, a neuroscientist at Cambridge University. His research on derivatives traders showed that when they “are on a winning streak their testosterone levels surge, sparking such euphoria that they underestimate risk.” This biochemical process produces extremely “powerful emotions,” encouraging traders to “go crazy.”
This helps to explain why we often learn more from our failures than our successes, and why success can deliver us to a state of hubris, an exalted arrogance that can corrupt our decision-making processes. Such biochemical processes also help explain how exuberance can infect many people, leading them to think and act similarly without communicating with each other, while each believes he is acting of his own free will. Thus, events such as financial bubbles and housing bubbles can occur on a broad scale.
A way to mitigate this effect is to diversify your workforce to include many types of personalities in decision-making positions. For instance, the article concluded that hiring women, who generally have about 10% as much testosterone as men, could help offset “irrational exuberance.” Experience can also help, especially if it includes crises brought about by excessive risk-taking. Moreover, even from a strictly gender perspective, not all men will experience the same success-driven increases in testosterone that make them prone to erroneous risk assessments.
Of course, it’s not easy to manage a diverse workforce.
When we approach problems too logically and reasonably, we tend to place too much faith in the dominance of consciousness and to discount subjective influences that vary by person. For example, the Innocence Project, by using DNA evidence, has helped to exonerate 271 people wrongly convicted of crimes, but almost a quarter of these people had confessed or pleaded guilty. Why would people give false confessions?
What research shows is that we can easily extract false confessions from others, especially when using certain interrogation techniques. The article “Silence is Golden,” in the August 13, 2011 issue of The Economist, mentions two such research projects. The journal Law and Human Behavior published one by Saul Kassin and Jennifer Perillo of the John Jay College of Criminal Justice in New York; the other is the work of Robert Horselenberg and colleagues at Maastricht University.
Since we tend to believe in free will and the dominance of consciousness, we consider confessions fairly damning because no one in her “right mind” would give false ones. Therefore, interrogations assume false confessions aren’t possible. Yet, people give them for many reasons including:
- Avoiding unpleasant interrogations
- Accepting that they might have accidentally committed a wrong
- Believing that:
  - The investigative process will show their innocence
  - Authorities and experts know better
  - Objective truth and justice exist and will surface
  - Technologically collected evidence is faultless
Many times our business processes assume people behave with a “right mind.” Yet, as this example shows, by questioning this assumption in our processes (interrogations, in this case), we automatically call into question the outcomes those processes produce (here, confessions).
Thus, our processes need to account for more subjective, subconscious and intuitive factors or risk disconnection from reality and erroneous analyses.
Intuitive approaches often work because we don’t believe they do. Advertising is an excellent example: it influences us because we often believe it doesn’t.
This extends to our complaints about politicians not answering the question. Todd Rogers and Michael I. Norton researched this and were asked to “Defend Your Research” in “People Often Trust Eloquence More Than Honesty” appearing in the November 2010 issue of the Harvard Business Review. They found:
People who dodge questions artfully are liked and trusted more than people who respond to questions truthfully but with less polish.
In fact, when answerers perform the dodge effectively, fewer than half of listeners can remember the question accurately. The key rests in the answer’s first ten words, which disrupt the cognitive link we form between a question and its expected answer. In everyday life, we like to complain about the fast-talking salesperson; at a higher level, however, fast-talking becomes eloquence. It’s here that we increasingly trust and like eloquence more than honesty.
Even though I promote the practical understanding and application of intuition in business on this blog, people can use intuitive approaches for good or ill. For instance, my guest post on 12 Most lists ways to influence people intuitively to build morale; however, people can use these techniques for questionable purposes too.
How do we defend ourselves? There are two broad introductory ways:
- Realize people can influence us intuitively and subconsciously even if we believe they can’t
- Raise our awareness regarding intuitive approaches
In this way, we can begin accounting for these natural biases in our decision-making and actions. However, believing others can influence us without our knowledge is scary for many of us, especially if we believe in the supremacy of the conscious mind and free will.
Long ago I sat in on the reprimand of an employee by a manager. The manager concluded his discussion by asking the employee, “Do you understand what I’m saying?” The employee responded, “Yes.” It suddenly occurred to me how biased we are in thinking that education alone will correct behavior. In other words, we assume that if someone understands our argument and reasons they will adopt our point of view.
In the above situation, there was no follow-up by the manager to explore whether the employee agreed with the manager’s alternative action or whether the employee was moved to act accordingly in future situations. Yes, he was aware of the consequences, but we tend to forget that sometimes people are willing to pay those consequences.
I refer to making this false assumption behind “Do you understand?” as a cognitive bias: we tend to believe that reasons, logic, and rationales are enough to win the day. This bias tends to make us wrongly believe that we’ve done “our best.”
I also experience this in non-disciplinary situations in which anyone is trying to influence another person. This cognitive bias happens frequently with instructors trying to move participants to take action in settings such as business training. They will ask participants, “Do you understand what I’ve shown (said, done, etc.)?”
Therefore, in summary, I find four basic hurdles, represented by the following questions, that we need to negotiate and verify before we can have significant confidence that we’ve persuaded someone:
- Do you hear me?
- Do you understand me?
- Do you agree with me?
- Are you moved to take the recommended action (to act on this idea)?