No serious mathematical understanding, aptitude, or even interest is required to enjoy this post. There’s this cool thing that I think you’ll find interesting, but there is just a bit of mathematical throat clearing we need to get through up front.

Here’s an example of a differential equation (seriously, this isn’t on the test):
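A standard form of the damped-oscillator equation looks like this (the symbols m, c, and k are the conventional mass, damping coefficient, and spring constant; I'm using the textbook form here, so take the exact coefficients as illustrative):

```latex
m\frac{d^2x}{dt^2} + c\frac{dx}{dt} + kx = 0
```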

This differential equation describes a damped oscillator. Imagine a weight on a spring. Drop the weight, and it bounces up and down, but less each time until it comes to a stop. That motion is described by this equation.

Equation 2 is a simplified representation (the dots represent a derivative with respect to time):
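Using the same conventional symbols as above, the dot notation gives:

```latex
m\ddot{x} + c\dot{x} + kx = 0
```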

Move some terms around for one last simplification:
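Solving for the second derivative (again with the conventional m, c, and k):

```latex
\ddot{x} = -\frac{c}{m}\dot{x} - \frac{k}{m}x
```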

**Analog computers**

Okay, now it gets interesting. Decades before the early room-sized electronic digital computers like ENIAC, there were analog computers. They solve differential equations like this one.

Analog computers are made from elements like integrators, multipliers, and adders (I leave as an exercise for the reader why integrators make much more sense than differentiators).

To solve the equation above, we first *assume that we have ẍ.*

Crazy, right? We just proceed blithely along after first assuming that we already have the second derivative of the thing we’re trying to find.

Stick with me and see how this turns out. First, integrate it twice (each integration removes one dot, that is, one time derivative). The signal moves from left to right through two integrators:

Okay, we’re almost there. We use the analog computer to create the right side of equation 3:

**Magic time!**

And here’s the fun part. We’ve now computed the right side of equation 3. But wait a minute—that’s equal to *ẍ*! So that bizarre, unfounded assumption—we just assume that we have what we don’t have—was actually justified. We feed that output back in as *ẍ* and we’re done. Here’s the final layout:
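The feedback trick can be sketched digitally. This is a minimal simulation of the analog computer's loop, assuming the damped-oscillator form ẍ = −(c/m)ẋ − (k/m)x; the constants m, c, and k are made-up illustrative values, and each integrator is approximated by a simple accumulating time step:

```python
# Digital sketch of the analog computer's feedback loop for x'' = -(c/m)x' - (k/m)x.
# m, c, k are illustrative values, not from the original post.
m, c, k = 1.0, 0.5, 4.0   # mass, damping coefficient, spring constant
dt = 0.001                # time step for the numerical integrators
x, v = 1.0, 0.0           # initial position and velocity ("drop the weight")

positions = []
for step in range(20_000):            # simulate 20 seconds
    a = -(c / m) * v - (k / m) * x    # the "assumed" x'', built from the feedback path
    v += a * dt                       # first integrator:  x'' -> x'
    x += v * dt                       # second integrator: x'  -> x
    positions.append(x)

# With damping, the oscillation should shrink toward zero.
print(abs(positions[-1]) < abs(positions[0]))
```

Just as in the analog version, the "output" ẍ is computed from x and ẋ and fed straight back into the integrators, and the assumption pays for itself.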

**Augustine’s contribution to differential equations**

Augustine (354–430 CE) didn’t have much to say on this subject, but see if this sounds like our analog computer project: “Seek not to understand that you may believe, but believe that you may understand.”

Just believe. Don’t worry about it making any sense—understanding will come with time. It’s like solving a differential equation with an analog computer: assume the result, and you will be rewarded.

The problem, of course, is that it actually works with an analog computer. Every time. By contrast, “just believe” is terrible advice when evaluating a claim with poor evidence—Oz, fairies, leprechauns, the Force, and so on.

**Begging the question, Christian style**

We see a similar assumption of the conclusion with many responses to challenges against Christianity. For example, we see in the gospels just what we’d expect to see if the resurrection were true. Therefore, the Christian apologist says, the gospels are important evidence for the resurrection being true.

Or, take the order and beauty we see in the world. The apologist tells us that this is just what we’d expect if there were a god.

Take an analogous argument:

1. If space aliens caused car accidents, we’d see car accidents.

2. We *do* see car accidents.

3. Conclusion: we now have more evidence that space aliens cause car accidents.

Or (for a different kind of rationalization) take the Problem of Evil, the puzzle of why an all-good god allows so much bad in the world. An all-knowing god could have his reasons, couldn’t he? That you skeptics don’t understand is hardly surprising—your finite mind may just be incapable of understanding it from that god’s perspective.

In other words, *assume* the Christian position and rearrange evidence to support it rather than start with the evidence and *then* reach a conclusion.

Assuming the conclusion works great when solving differential equations, since we have evidence that it works. The opposite is true for supernatural claims.

*In a universe of electrons and selfish genes,*

*blind physical forces and genetic replication,*

*some people are going to get hurt,*

*other people are going to get lucky,*

*and you won’t find any rhyme or reason in it, nor any justice.*

— Richard Dawkins

*Photo credit:* Ryan Rahn