Thursday, 19 February 2015

Why science is so hard to believe

Ripper: “Have you ever heard of a thing called fluoridation? Fluoridation of water?”
Mandrake: “Ah, yes, I have heard of that, Jack. Yes, yes.”
Ripper: “Well, do you know what it is?”
Mandrake: “No. No, I don’t know what it is, no.”
Ripper: “Do you realize that fluoridation is the most monstrously conceived and dangerous communist plot we have ever had to face?”
(from Stanley Kubrick’s “Dr. Strangelove”)


Actually, fluoride is a natural mineral that, in the weak concentrations used in public drinking-water systems, hardens tooth enamel and prevents tooth decay — a cheap and safe way to improve dental health for everyone, rich or poor, conscientious brushers or not. That’s the scientific and medical consensus.



Science doubt has become a pop-culture meme. In the recent movie “Interstellar,” set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked.



In a sense this is not surprising. Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable and rich in rewards — but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze.

We’re asked to accept, for example, that it’s safe to eat food containing genetically modified organisms (GMOs) because, the experts point out, there’s no evidence that it isn’t and no reason to believe that altering genes precisely in a lab is more dangerous than altering them wholesale through traditional breeding. But to some people, the very idea of transferring genes between species conjures up mad scientists running amok — and so, two centuries after Mary Shelley wrote “Frankenstein,” they talk about Frankenfood.



The scientific method leads us to truths that are less than self-evident, often mind-blowing and sometimes hard to swallow. In the early 17th century, when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense — because it sure looks like the sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo was put on trial and forced to recant. Two centuries later, Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales and even deep-sea mollusks is still a big ask for a lot of people.

Even when we intellectually accept these precepts of science, we subconsciously cling to our intuitions — what researchers call our naive beliefs. A study by Andrew Shtulman of Occidental College showed that even students with an advanced science education had a hitch in their mental gait when asked to affirm or deny that humans are descended from sea animals and that the Earth goes around the sun. Both truths are counterintuitive. The students, even those who correctly marked “true,” were slower to answer those questions than questions about whether humans are descended from tree-dwelling creatures (also true but easier to grasp) and whether the moon goes around the Earth (also true but intuitive).



Even for scientists, the scientific method is a hard discipline. They, too, are vulnerable to confirmation bias — the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them. Once the results are published, if they’re important enough, other scientists will try to reproduce them — and, being congenitally skeptical and competitive, will be very happy to announce that they don’t hold up. Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or an absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.

That provisional quality of science is another thing a lot of people have trouble with. To some climate-change skeptics, for example, the fact that a few scientists in the 1970s were worried (quite reasonably, it seemed at the time) about the possibility of a coming ice age is enough to discredit what is now the consensus of the world’s scientists: The planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the past 130 years, and human actions, including the burning of fossil fuels, are extremely likely to have been the dominant cause since the mid-20th century.

It’s clear that organizations funded in part by the fossil-fuel industry have deliberately tried to undermine the public’s understanding of the scientific consensus by promoting a few skeptics.

The news media gives abundant attention to such mavericks, naysayers, professional controversialists and table thumpers. The media would also have you believe that science is full of shocking discoveries made by lone geniuses. Not so. The (boring) truth is that science usually advances incrementally, through the steady accretion of data and insights gathered by many people over many years. So it has with the consensus on climate change. That’s not about to go poof with the next thermometer reading.

But industry PR, however misleading, isn’t enough to explain why so many people reject the scientific consensus on global warming.

The “science communication problem,” as it’s blandly called by the scientists who study it, has yielded abundant new research into how people decide what to believe — and why they so often don’t accept the expert consensus. It’s not that they can’t grasp it, according to Dan Kahan of Yale University. In one study he asked 1,540 Americans, a representative sample, to rate the threat of climate change on a scale of zero to 10. Then he correlated that with the subjects’ science literacy. He found that higher literacy was associated with stronger views — at both ends of the spectrum. Science literacy promoted polarization on climate, not consensus. According to Kahan, that’s because people tend to use scientific knowledge to reinforce their worldviews.

Americans fall into two basic camps, Kahan says. Those with a more “egalitarian” and “communitarian” mind-set are generally suspicious of industry and apt to think it’s up to something dangerous that calls for government regulation; they’re likely to see the risks of climate change. In contrast, people with a “hierarchical” and “individualistic” mind-set respect leaders of industry and don’t like government interfering in their affairs; they’re apt to reject warnings about climate change, because they know what accepting them could lead to — some kind of tax or regulation to limit emissions.
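
To make “polarization, not consensus” concrete, here is a minimal Python sketch with entirely invented numbers. This is not Kahan’s dataset or analysis, just an illustration of the pattern he describes: when two camps move toward opposite poles as literacy rises, literacy barely correlates with the rating itself, yet correlates strongly with how extreme the rating is.

# Hypothetical illustration only: synthetic respondents, made-up numbers.
import random
import statistics

random.seed(42)

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

literacy, ratings = [], []
for _ in range(1540):                      # sample size borrowed from the study; data invented
    lit = random.uniform(0, 1)             # science literacy, 0 = low, 1 = high
    camp = random.choice([+1, -1])         # "egalitarian" vs. "hierarchical" worldview
    rating = 5 + camp * lit * 4 + random.gauss(0, 0.5)   # literacy pulls the 0-10 rating toward the camp's pole
    literacy.append(lit)
    ratings.append(min(10.0, max(0.0, rating)))

extremity = [abs(r - 5) for r in ratings]  # distance from the neutral midpoint

print("literacy vs. rating:    r = %+.2f" % pearson(literacy, ratings))
print("literacy vs. extremity: r = %+.2f" % pearson(literacy, extremity))

Run as written, the first correlation prints out near zero while the second is strongly positive: in this toy world, knowing more science predicts having a more extreme view, not a shared one.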




How to penetrate the bubble? How to convert science skeptics? Throwing more facts at them doesn’t help. Liz Neeley, who helps train scientists to be better communicators at an organization called Compass, says people need to hear from believers they can trust, who share their fundamental values. She has personal experience with this. Her father is a climate-change skeptic and gets most of his information on the issue from conservative media. In exasperation she finally confronted him: “Do you believe them or me?” She told him she believes the scientists who research climate change and knows many of them personally. “If you think I’m wrong,” she said, “then you’re telling me that you don’t trust me.” Her father’s stance on the issue softened. But it wasn’t the facts that did it.

If you’re a rationalist, there’s something a little dispiriting about all this. In Kahan’s descriptions of how we decide what to believe, what we decide sometimes sounds almost incidental. Those of us in the science-communication business are as tribal as anyone else, he told me. We believe in scientific ideas not because we have truly evaluated all the evidence but because we feel an affinity for the scientific community. When I mentioned to Kahan that I fully accept evolution, he said: “Believing in evolution is just a description about you. It’s not an account of how you reason.”


Doubting science also has consequences, as seen in recent weeks with the measles outbreak that began in California. The people who believe that vaccines cause autism — often well educated and affluent, by the way — are undermining “herd immunity” to such diseases as whooping cough and measles. The anti-vaccine movement has been going strong since a prestigious British medical journal, the Lancet, published a study in 1998 linking a common vaccine to autism. The journal later retracted the study, which was thoroughly discredited. But the notion of a vaccine-autism connection has been endorsed by celebrities and reinforced through the usual Internet filters. (Anti-vaccine activist and actress Jenny McCarthy famously said on “The Oprah Winfrey Show,” “The University of Google is where I got my degree from.”)
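
“Herd immunity” has a standard back-of-envelope formula that the article doesn’t spell out: an outbreak stops spreading once the immune fraction of the population exceeds 1 - 1/R0, where R0 is the number of people a single case infects when everyone around them is susceptible. The short sketch below uses commonly cited R0 estimates from the epidemiology literature (the exact values vary by study) rather than anything from the article itself.

# Textbook herd-immunity threshold, 1 - 1/R0.
# R0 values are common literature estimates, not from the article.
for disease, r0 in [("measles", 15.0), ("whooping cough", 14.0), ("seasonal flu", 2.0)]:
    print("%15s: R0 ~ %4.1f -> about %.0f%% of people must be immune" % (disease, r0, 100 * (1 - 1 / r0)))

With an R0 around 15, measles needs immunity above roughly 90 percent, which is why even small pockets of unvaccinated people are enough to sustain an outbreak.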

In the climate debate, the consequences of doubt are likely to be global and enduring. Climate-change skeptics in the United States have achieved their fundamental goal of halting legislative action to combat global warming. They haven’t had to win the debate on the merits; they’ve merely had to fog the room enough to keep laws governing greenhouse gas emissions from being enacted.

Some environmental activists want scientists to emerge from their ivory towers and get more involved in the policy battles. Any scientist going that route needs to do so carefully, says Liz Neeley. “That line between science communication and advocacy is very hard to step back from,” she says. In the debate over climate change, the central allegation of the skeptics is that the science saying it’s real and a serious threat is politically tinged, driven by environmental activism and not hard data. That’s not true, and it slanders honest scientists. But the claim becomes more likely to be seen as plausible if scientists go beyond their professional expertise and begin advocating specific policies.

It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app. It’s the way science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else — but their dogma is always wilting in the hot glare of new research. In science it’s not a sin to change your mind when the evidence demands it. For some people, the tribe is more important than the truth; for the best scientists, the truth is more important than the tribe.



Source Article from http://feedproxy.google.com/~r/AscensionEarth2012/~3/HiL1XA9jPrU/why-science-is-so-hard-to-believe.html


