Science as Critical Thinking? (RJS)

June 30, 2011

Harold Kroto, Nobel Laureate and co-discoverer of C60, better known as buckminsterfullerene or buckyballs, was speaking to young researchers at the Lindau Nobel Laureates Meeting. In his lecture he told them that science is more than a collection of facts. Sure, there is a body of knowledge, but science is more than that. It is a way of thinking and searching for truth. (From Scientific American's 60-Second Science.)

Perhaps most important is that it’s the way that we discover new knowledge. But for me the most important, by far, is that it’s the only philosophical construct we have to determine truth with any degree of reliability. Think about that. Because then it becomes a much bigger subject. In fact, for me, perhaps the most important subject there is. And the ethical purpose of education must involve teaching children how they can decide what they’re being told is actually true. And that’s not the case in general. The teaching of a skeptical, evidence-based assessment of all claims–all claims–without exception is fundamentally an intellectual integrity issue. Without evidence, anything goes. Think about it.

In these reflections Dr. Kroto introduces science as a way of thinking about, exploring, and viewing the world around us. This way of thinking has been essential to the growth of science over the last five hundred years or so and is part of the legacy of Renaissance and Enlightenment thinking. Education in science is not meant simply to impart facts, but to train the mind.

How do you learn and discover and evaluate truth?

What role does “scientific” thinking play?

A popular biology textbook, Human Biology, 7th ed., by Daniel D. Chiras, outlines rules for critical thinking (pp. 13-19). These rules help put the idea expressed by Harold Kroto on more concrete ground.

1. Gather complete information, not just from sources that support your viewpoint.

Adopting a position based on authority – the views and beliefs of others – isn’t sufficient. Listen to and read from a variety of sources and positions. We tend to selectively gather confirming information – this isn’t good enough. Make it a point to intentionally read dissenting views.

2. Understand and define all terms.

Understanding terms and making sure that others define them in discussions brings clarity to issues and debates. (p. 13)

3. Question the methods by which data and information were derived.

In science this is particularly important – How were the experiments done? What methods were used? Was the theory appropriate? How were the data analyzed? These kinds of questions must be constantly addressed. I never read a paper or listen to a report without these questions running through my mind.

4. Question the conclusion.

Even when the experiments or calculations are valid, the conclusions may be wrong. Bias or implicit preconceptions can lead to misinterpretation of the data. Two questions should always be asked: Do the facts support the conclusions? Are there alternative explanations?

My thesis adviser insisted that we separate results from discussion in our papers – a common, but not universal, practice in the field. The results, the data, should always stand the test of time. The interpretation in the discussion section may not, in the wake of new information, improved methods, or a new insight or theory.

5. Uncover assumptions and biases.

6. Question the source of information.

Rules 5 and 6 are closely related. Bias and assumptions will not always invalidate a result or an interpretation, but they are important information and should play a role in evaluating claims. If someone is known to be an advocate of a certain position, especially with a professional or monetary stake in the outcome, the conclusions deserve more extensive scrutiny. Chiras highlights the importance of knowing the players. Peer review plays an important role here in the scientific community – when the results are reviewed favorably by other experts without the same bias or stake, this lends credibility to the paper.

7. Understand your own biases and values.

Subject yourself to the same scrutiny to which you subject others. Always ask: What are my biases, assumptions, and areas of ignorance? Do these impact my conclusions? What can I do to minimize the possibility that I am being misled by my biases and assumptions? (See #1.)

These rules, with some modification, are broadly applicable outside the narrow confines of science. I find it essential to employ a very similar approach when addressing pretty much any question that comes up, including questions of faith and religion. Faith isn’t the result of empirical investigation and determination of truth – it is a consequence of relationship with others and with God. But many of the areas of apparent conflict between science and the Christian faith arise from biases and assumptions, often implicit ones.

A common concern in our churches is the role that education can play in the loss of faith. The problem, it seems to me, is not the depth of Christian thinking or the strength of the arguments against the faith. The problem is the lack of resources in the local church and in some traditions to deal with the challenges introduced by critical thinking. There is a reliance on authority rather than dialogue in discipling young Christians to think “Christianly” through hard questions.

What do you think? Are these good rules? Which are the most important?

What rules would you add to the list that help you determine the truth of a claim or teaching?

What runs through your mind when you listen to someone teach or preach – how do you evaluate what is said?

If you wish to contact me directly you may do so at rjs4mail[at]att.net

If interested you can subscribe to a full text feed of my posts at Musings on Science and Theology.
