Better the God You Know?

For a long time I have intended to write a book about the evolution of the god-concept, from about the time of the founding of Göbekli Tepe to the modern day. Sort of a mash-up of Robert Wright's 'The Evolution of God' (though I strongly disagree with many points raised in that book) and Jared Diamond's 'Guns, Germs, and Steel'. I have been procrastinating on this for the longest time – three years, at least. I finally decided that one of the reasons I kept putting it off was the scope of the project – planning is not my strong suit.

Today’s blog is a quick exploration of the central thesis of the smaller project I decided to move to the front of the queue, ahead of the above. Specifically, I will discuss the way in which the brain processes information (by preferring explanations that rely on the familiar and well understood) and how that can predispose some people to believe in the existence of god(s). I’ll also mention how cultural institutions, even non-religious ones, can reinforce this.

This is fairly stream-of-consciousness in style, so no footnotes. I wrote the first half of this whilst in a Jamaican bar (drinking Red Stripe, as you do), waiting for my partner to finish her strongwoman training a few doors down. (She's competing in her first competition in just over a week – flipping tyres, lifting stones, all that sort of thing.) I wrote the rest in a local brasserie, on a rainy Friday afternoon, as per my usual schedule.


Familiarity breeds contentment with simplistic explanations

People will always attempt to put something new in terms of something familiar: reviews describe a new artist's album in terms of existing, well-known artists and albums; art critics discuss new talent with reference to this movement or that artist. The most familiar thing to each of us is ourselves – for all that other people sometimes know elements of our behaviour better than we do (we know what we intend; they see what we do). Beyond ourselves, the people in our surroundings are the next most familiar thing.

Thinking about it, that latter point is a little strange, don’t you think? The space between us should be the next most familiar thing. It is gravity, various types of radiation, pressure, and friction that cause us to experience “ourselves”. These things are familiar to us insofar as they are the medium by which we know ourselves – a path to knowledge we tend to take for granted. (There’s a reason it took until the 17th century for us to understand gravity. Of course things fall down, what else would they do?) Other people – even when comparing adults to our infant selves – are on a similar scale and thus tractable to infantile understanding (the general environment is too big, and many aspects of it too small, or outright invisible). We learn about “other people” from the instant of birth (if not before), and our minds are tuned to certain aspects of people-ness from the get-go. At what age do most of us encounter the word “gravity”, let alone more-or-less understand it?

[Comic: originally posted on GoComics by WileyInk; re-posted by Banksy on Twitter]

It is the fact of being human that draws us to other humans, from the outset. Sure, it's possible that imprinting plays a role, and Mowgli may have imprinted on wolves before humans, but, like Mowgli, the 'gift' of self-consciousness (part experience, part genetic inevitability) makes us aware of our humanity at some point, and that awareness is unignorable. (I hope it is obvious that I am using Mowgli as a stand-in for actual feral children.) Nevertheless, some child-rearing practices and educational preferences prioritise the human/social understanding of 'self and other' ahead of the holistic/general understanding. Once that prioritisation takes hold, it can be hard to reverse.

There are at least two aspects of learning that reinforce the social (human-first) view. Firstly, teaching is usually done by adults – even where teaching is conducted under a theory such as Vygotsky's Zone of Proximal Development (ZPD), rather than in the way ZPD actually describes (i.e. children learning from slightly older children). Adults interpret the world differently to children… no surprise. This, in itself, gives rise to a couple of features of teaching that are specific to the adult-teaching-child scenario. Adults have their own unique set of experiences, and these have given rise to habits of thought and understanding that are peculiar to them. Such habits are internalised; as such, adults are often unaware of them, and almost as often unable to explain where they came from… but this doesn't stop them from trying, often with reference to the 'human-first' view.

When someone asks why we do or think a certain thing, we try explaining 'ourselves', not the thing that we do or think. By 'try' I mean 'we make plausible-sounding stuff up' (and by 'plausible' I mean 'a mixture of both realistic and aspirational views of self, with some culturally sanctioned stuff thrown in for good measure').

The other aspect of adult-teaching-child that fails is the method of teaching most often employed. Very little of how teaching occurs is experiential and empirical – though, depending upon where you are in the world, and which school you go to, that is improving. The personal experience of a piece of sodium exploding upon contact with water gives rise to a far more compelling lesson than starting by "teaching" children how to balance the chemical equation that explains it. A top-down approach will only work if the atoms of experience are in place. Explaining the world in terms of humanness and sociality works because we have those atoms of experience within us. It is only through ongoing authoritarian teaching that we continue to apply these human/social folk theories to non-human/non-social phenomena after the age of about 11.
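(For the record, the equation in question is simple enough once you've seen the fireworks:

2Na + 2H₂O → 2NaOH + H₂, plus a good deal of heat – enough to ignite the hydrogen, hence the bang.

The point is the order of operations: the bang first, the bookkeeping second.)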

If we provide the (quite literal) atoms of experience (the exploding sodium) and then provide an explanation that relies on that first-hand experience (eventually getting to the balanced chemical equation), it teaches the general-environment-first lesson. Teaching the interaction through the medium of the mathematical equation (itself a human-first view) necessarily leads to a confused understanding of the world. The very fact of teaching implies an ability to explain or describe, which assumes an available narrative – demonstration is both an entirely different skill and an entirely different path to knowledge. Furthermore, the words "I don't know", whilst often highly accurate, are not a satisfying narrative. So an overweening reliance on narrative explanation ahead of experience will lead to an inability to accept "I don't know" as an answer.

The second aspect of learning that reinforces the human-first view is the culturally sanctioned stuff itself. Adults have their narratives (clusters of information and explanation that are habitually considered together), but many of these are not their own; they come from culture. Culture is just a collection of such narratives that stuck. This stickiness is only partially down to a narrative's ability to describe reality; the greater part is down to how "human" it is (in the same way that the atoms of experience we default to are human/social). This is why Freud is still the first person most people think of when you mention psychology, not his eminent and more scientific contemporaries Wilhelm Wundt and William James (more's the pity).

If someone has not experienced something personally, a human-first explanation will often work. If, however, they have experienced it, such explanations are unlikely to be satisfying – except when reliance on human-first explanations has become habitual, or when the individual is tired, stressed, or in a state of emotional arousal sufficient to make them default back to our most infantile understanding of the world: the human-first one.

The humanness of an explanation should only have relevance when the phenomenon being discussed is in fact human. Even then, when an aspect of a human is being discussed at the molecular or microbial level, describing it in "human" terms has limited use. Sure, thinking about the immune system attacking interlopers has a surface-level utility in explaining the basic concept (and gives rise to a great movie or two – narrative, again), but failing to recognise that this is a metaphor would make for many an incompetent doctor (albeit one who could explain things to kids quite well). Indeed, describing a limb, say, in terms of the totality of a person doesn't make a lot of sense (and commits at least two different fallacies).


What’s meta- for?

The human-first view is a powerful source of metaphor, because metaphor relies on using something familiar to explain something unfamiliar. As outlined above, the human-first view is the most familiar thing we have: it is our daily experience, the way our brain was tuned to aid our infant survival, and the basis of all of our subsequent understanding. However, if a metaphor keys into the method by which one already understands things, then the fact that the explanation is metaphorical may well be lost. It's possible to come to rely on this method without ever recognising that metaphor is only descriptive, not explanatory ("familiar" is often mistaken for "correct"). Of course, if we base our explanation of others on a metaphorical extension of the self, and if that works, and if much of what we know relies on the human-first view, then we will use our sense of self to explain all of our surroundings.

Explaining all of our surroundings through the over-extension of the human-first view accounts for the omnipresence of God that only some people are aware of. God, then, is simply a metaphorical extension of the self into the environment, and thus the "go-to" explanation for all phenomena for some. On the one hand, that must be pretty self-empowering, as it provides a sense of connection to everything in existence (hello, "spiritual but not religious", and those that believe in a higher power, but not God, per se). On the other hand, when such explanations fail – as they inevitably will, because not everything can be explained in human-first terms (and that increasingly includes other humans) – one's sense of autonomy must take a terrible knock. Unfortunately, there is no easy fallback available. The fear of failing to understand the environment, and the repeated proof of that failure, gives rise to stress, which reduces the ability to learn and increases the reliance on infantile understanding.

God is a metaphor, based on the self, and applied to all of those things the metaphor seems to fit (the totality of which varies from individual to individual). Failure to recognise this in oneself, and to attempt to remedy it by (usually) one's early 30s, will inevitably lead to disappointment, and quite possibly anger – an anger born of impotence in the face of a bewildering world. The only solution is to surround oneself with similar individuals and have one's beliefs reinforced by someone "in authority" (hello, organised religion). Some people, from this position of socially mediated reinforcement, perhaps recognising that their epistemology is failing them, will venture forth to attack those whose epistemologies are more stable, making a pretence of skeptical argumentation (hello, apologetics). However, the first step in true skepticism is to be skeptical of one's own position, and so these people fall at the first hurdle, which often frustrates them further. Failure to engage with the general environment on its own terms is not a path to truth, and likewise leads to a failure to understand one's fellow humans outside of the narrow social context in which one was educated.


Summary

Individuals tend to explain things with reference to themselves and those around them, especially where those around them are "like" them. This immediately fails to explain natural forces and phenomena, many of which are necessary to our own definition of ourselves. It does, however, enable us to survive as infants in a social environment. The mental tools of infancy are not necessarily the best tools for adulthood. This is especially true of the means by which we learn: learning from other people's recounted experiences, as opposed to experiences that are mutually shared, reinforces the reliance on narrative and human-first explanations.

Human-first explanations are what have generally made up culture, particularly its most persistent aspects, as we can all put a human-centred explanation into the context of our own experience, and doing so can feel "right". The human-centred view is the basis of many metaphors that have powerful descriptive value but questionable explanatory value. They are, nevertheless, sticky.

God is an all-encompassing, self-based metaphor that is, by definition, human-first – and very sticky. If not grown out of by one's early 30s, it can lead to failures to understand the general environment with which adult humans must engage. That can lead to feelings of fear and loss of control, which impede the ability to engage with complex explanations; this, in turn, can lead to confrontation with those who have been able to take on the more complex explanations, and who are thus better able to understand both the general environment and the complexities of modern society.

"I don't think many people have this misanthropic view. But I do think that those ..."
