Selective Perception – A Major Stumbling Block For Science

Several studies have produced unexpected results about how people perceive facts. In 1954, Princeton and Dartmouth researchers asked their students to watch a recording of a football game between the two schools and count infractions. Princeton students reported seeing twice as many Dartmouth violations as Dartmouth students did.

In a 2003 study, Yale researchers asked people to evaluate fictional welfare-reform proposals, with each political party's endorsement clearly stated. Subjects sided with their own party regardless of their personal ideology or the policies' content. A 2011 study by a different group asked people to judge whether certain scientists (highly trained and at well-respected institutions) were credible experts on global warming, nuclear-waste disposal, and gun control. Subjects largely favored the scientists whose conclusions matched their own values; the facts were irrelevant.

This behavior is called “selective perception”: otherwise rational people distort facts by filtering them through a personal lens of social influence, winding up with a world-view that often departs from reality. Selective perception affects all our beliefs, and it’s a major stumbling block for science communication.

What divides us, it turns out, isn’t the issues. It’s the social and political contexts that color how we see the issues. Take nuclear power, for example. In the U.S. it is hotly contested; in France it is a topic of lesser concern, even though nuclear plants supply only about 20 percent of U.S. electricity and 78 percent of France’s. Look at nearly any science issue and you’ll find that nations hold different opinions. While the U.S. fights about gun control, climate change, and HPV vaccination, those controversies don’t hold a candle to Europe’s debates about GMO foods and mad cow disease. Scientific subjects become politically polarized because the public interprets even the most rigorously assembled facts based on the beliefs of their social groups, says Dan Kahan, a Yale professor of law and psychology who ran the 2011 science-expert study.

The problem is, our beliefs influence policy. Public attitudes shape how politicians vote, what products companies make, and how science gets funded.

So what can we do? The science world has taken note. The National Science Foundation, for example, recently emphasized grant-proposal rules that encourage scientists to share their research with the public, and several conferences on science communication have sprung up. It’s not a bad start. As people hear more from scientists, scientists will be absorbed into the public’s social lens, and they may even gain the public’s trust. Having scientists tweet is good, but the most influential public figures are the ones people can relate to (à la Carl Sagan). We need more figures like him, and fast: according to Kahan, synthetic biology is a prime candidate for the next controversy.

People distort facts by filtering them through a personal lens conditioned by their social groups and their beliefs. Thus it is imperative that, before people can understand science, scientists understand people.
