Does the scientific method apply to psychology?

We’ve blogged about the problem in the field of psychology that so many of its research experiments can’t be replicated.  That means that, according to the scientific method, they are invalid.

The problem continues, and it’s compounded by the fact that the profession doesn’t seem to care!

The number of peer-reviewed articles whose results can’t be replicated keeps growing.  Despite these findings, nothing is changing in the way psychologists do their research, the way journals vet their articles, or the articles that get published.

An article on the subject, quoted and linked after the jump, says that as many as two-thirds of psychology articles “can’t be trusted.”

But let me pose a different way to look at this problem.  Can it be that the same scientific method used for chemistry and biology is unusable in the study of the minds of human beings?  People are active agents, not inanimate objects that follow only natural laws.  So it’s no wonder human beings are unpredictable and inconsistent.  And different subjects and groups react in different ways.

After I quote the article, I quote a commenter, who points out that there may be other ways to design, evaluate, and learn from various kinds of research, in addition to strict application of “the scientific method.”

In fact, the view that the scientific method is the only way to know truth–not logical reasoning (as in philosophy) and certainly not revelation (as in theology)–is surely one of the more reductionist errors of the Enlightenment.

I have no problem jettisoning 2/3 of the published research in experimental psychology–though it would help to know which 2/3–and the lack of response of the professionals in the field is inexcusable.  But maybe what all of this proves, with an abundance of replication, is the protean quality of the human psyche. And that would be an important scientific finding.  It would even be empirical and replicable.

Empiricism, common sense, and flossing

First we were told that we should avoid food that is high in cholesterol; then we were told that the cholesterol we eat has little effect on the cholesterol in our blood, so it doesn’t matter. We were told to avoid eating fat; now we are told that fat can be good for us. Sometimes coffee has been described as harmful and sometimes as helpful.  Drinking alcohol used to be considered unhealthy; now we are told it’s good for the heart.

But there has always been an eternal healthcare verity:  Be sure to floss.

Now that maxim too is under assault:  researchers are saying that there is little to no evidence that flossing actually works.

Read the story excerpted and linked to after the jump, and then consider what I say afterwards about how this may reflect a bigger intellectual issue:  the difference between valid deductive reasoning and empirical proof.

Biomedical studies that aren’t reproducible

A key principle of the scientific method is that experiments must be reproducible.  That is, an experiment has scientific validity only if another person tries the same experiment and gets the same result.

It has been coming out that a large percentage, possibly a majority, of psychological and social science experiments are not reproducible.  That is understandable, since human beings are active agents and so cannot be expected to be as predictable as inanimate objects.

But now scientists are discovering that a large percentage of biomedical studies are also non-reproducible!

More and more experiments aren’t reproducible

Scientists are worried because more and more experiments are not reproducible.  A principle of the scientific method is that for an experiment to be valid, another scientist who performs it must get the same results.  In many cases today, that is not happening.  This is especially true in the social “sciences.”

Psychology experiments often can’t be replicated

There is currently what is being described as a “crisis” in the field of social/personality psychology.  It turns out that many psychological experiments, however heralded in the media and however much is made of their findings, cannot be replicated by other researchers.  Is that due to fraud?  Statistical quirks? Or does it mean that psychology is not a science after all?

Psychologists admit to bogus research

Social science aspires to the status of natural science, never mind that human beings are not as consistent or predictable as inert matter.  But a new study has found that an alarmingly large percentage of experimental psychologists admit to using questionable, if not bogus, research methods:

Questionable research practices, including testing increasing numbers of participants until a result is found, are the “steroids of scientific competition, artificially enhancing performance”. That’s according to Leslie John and her colleagues who’ve found evidence that such practices are worryingly widespread among US psychologists. The results are currently in press at the journal Psychological Science and they arrive at a time when the psychological community is still reeling from the fraud of a leading social psychologist in the Netherlands. Psychology is not alone. Previous studies have raised similar concerns about the integrity of medical research.

John’s team quizzed 6,000 academic psychologists in the USA via an anonymous electronic survey about their use of 10 questionable research practices including: failing to report all dependent measures; collecting more data after checking if the results are significant; selectively reporting studies that “worked”; and falsifying data.

As well as declaring their own use of questionable research practices and their defensibility, the participants were also asked to estimate the proportion of other psychologists engaged in those practices, and the proportion of those psychologists who would likely admit to this in a survey.

For the first time in this context, the survey also incorporated an incentive for truth-telling. Some survey respondents were told, truthfully, that a larger charity donation would be made by the researchers if they answered honestly (based on a comparison of a participant’s self-confessed research practices, the average rate of confession, and averaged estimates of such practices by others). Just over two thousand psychologists completed the survey. Comparing psychologists who received the truth incentive vs. those that didn’t showed that it led to higher admission rates.

Averaging across the psychologists’ reports of their own and others’ behaviour, the alarming results suggest that one in ten psychologists has falsified research data, while the majority has: selectively reported studies that “worked” (67 per cent), not reported all dependent measures (74 per cent), continued collecting data to reach a significant result (71 per cent), reported unexpected findings as expected (54 per cent), and excluded data post-hoc (58 per cent). Participants who admitted to more questionable practices tended to claim that they were more defensible. Thirty-five per cent of respondents said they had doubts about the integrity of their own research. Breaking the results down by sub-discipline, relatively higher rates of questionable practice were found among cognitive, neuroscience and social psychologists, with fewer transgressions among clinical psychologists.

via BPS Research Digest: Questionable research practices are rife in psychology, survey suggests.
HT: Joe Carter
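
One of the practices in that list, “collecting more data after checking if the results are significant,” is what statisticians call optional stopping, and a quick simulation shows why it counts as a questionable practice even without any outright fraud. The sketch below is my own illustration, not from the article: it assumes a simple two-group comparison with no true effect, a t-test after every batch of new participants, and arbitrary sample-size and alpha settings (20 per group to start, batches of 10, a cap of 100 per group, p < .05).

```python
# Illustration (not from the article): optional stopping inflates false positives.
# With no real difference between the groups, we keep adding participants and
# re-running a t-test, stopping as soon as p < .05 or the sample cap is hit.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def optional_stopping_trial(n_start=20, n_max=100, step=10, alpha=0.05):
    """One simulated study with NO true effect. Returns True if the
    peek-and-stop procedure ever declares a 'significant' result."""
    a = list(rng.normal(size=n_start))   # group A scores (pure noise)
    b = list(rng.normal(size=n_start))   # group B scores (pure noise)
    while True:
        if stats.ttest_ind(a, b).pvalue < alpha:
            return True                  # a false positive
        if len(a) >= n_max:
            return False                 # gave up at the sample-size cap
        a.extend(rng.normal(size=step))  # "collect more data" and test again
        b.extend(rng.normal(size=step))

n_sims = 2000
false_positives = sum(optional_stopping_trial() for _ in range(n_sims))
print(f"False-positive rate with optional stopping: "
      f"{false_positives / n_sims:.1%} (nominal alpha = 5%)")
```

In runs of this kind the observed false-positive rate typically comes out at roughly two to three times the nominal 5 per cent, which is one concrete mechanism by which non-replicable “findings” make it into the literature.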