
Epistemic Responsibility

You presented me with a documentary which is about as scientific as an episode of "Survivor."

You posited I was not observing people. I suggested you view a documentary in which people behave counter to your position. You rejected it because it's not "natural" enough. In either case, that wasn't my scientific argument and you know it.

There is no scientific evidence of how human beings need to be certain.

Well that's objectively wrong and I can show you.

I rarely do this, but I'm going to give you a list of things to read because this is pretty much common knowledge in the fields of neurology and psychology, and there really is no point in me describing all the science to you when other people have done it better already.

A few of these are not scholarly sources (like Psychology Today) but I submit them because they do a good job summarizing the actual scholarly data. If you really are skeptical of their claims, you can follow their citations and read the work of the scientists and their studies.

(You could have found all of this with just a single Google search, by the way.)

A Hunger for Certainty | Psychology Today
The Certainty Bias: A Potentially Dangerous Mental Flaw - Scientific American
Changes in Brain Regions May Explain Why Some Prefer Certainty and Order - Neuroscience News
Brain cell mechanism for decision making also underlies judgment about certainty | UW News
https://www.amazon.com/Sure-Unconscious-Origins-Certainty-Brain-ebook/dp/B008AK8W1Q

If you want to read a few actual study abstracts, here:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5798459/
https://science.sciencemag.org/content/310/5754/1680.abstract
https://elifesciences.org/articles/27483

This is just the tip of the iceberg of what I found in about 10 minutes. Dig a little deeper and you're going to find the same conclusion over and over going back decades across multiple disciplines: humans crave certainty.
When you've actually developed a coherent argument for why their scientific evidence is invalid, I'd love to hear it. You should probably write a paper on it too, because if you're right and there's "no scientific evidence of how human beings need to be certain" then you might actually win a Nobel Prize.

It may or may not be a built-in mechanism for everyday behavior, but that does not mean it is needed to survive as a conscious, big-picture approach to life.

Brains aren't that compartmentalized. The human craving for certainty bleeds into everything, not just our meals and our sexual partners.

When we know where our next meal is coming from and have some social satisfaction and have some things that interest and entertain us we make it day to day. But one crisis is enough to spark doubt. This doubt may throw us off, but it doesn't necessarily create a fatal crisis.

I've never argued anything about a 'fatal crisis'. I don't even know what you mean by that or why it's relevant to our discussion.

We seem to adapt pretty well naturally, just like other animals do.

Yep, and the way we do that is through our certainty mechanisms.

If people arrive at certainty unconsciously there is nothing they can do to change it consciously.

Congratulations, this is the stupidest thing you've said so far. Unconscious biases can be made conscious and analyzed. That's literally the entire discipline of critical thinking.

We are more like what we view as lower animals than we think. Most of life is unconscious.

This is correct. But unlike a donkey, we have a bright spotlight in our prefrontal cortex that is capable of illuminating the darker parts of our unconscious mind. Not all of it, and certainly not all of it at once (we'd go mad otherwise), but unconscious thoughts and beliefs can enter conscious awareness.
 
I'm sure there is a thread elsewhere for a debate on vaccines and autism. Please don't derail this one.

Sorry but if someone tries to dictate the events of my life to me, I will respond every time.

No derailment intended.
 
If we take the view that all beliefs must be classified in binary terms as either justified or not, we will always ultimately run into problems.

Science is an extremely useful way to attack a problem, but it is also an extremely costly way to attack a problem. If I needed to suspend belief until the standards of science are met, I couldn't begin to do the first thing, including science. Yet, there is manifestly a difference in quality between the argument I can make for a specific diagnosis of a computer problem that I am about to use as a basis for solving that problem, and the argument of someone who says he'll strap bombs to his chest because 72 virgins wait for him after his death.

What we seem to do, however, is make assumptions and forgo questioning them until someone disagrees with them, or until we run into the problems that inevitably pop up because our frameworks for making choices can only ever be approximately correct. That seems reasonable to me. You don't have to provide extensive research to support your political views, for example, until someone questions the validity of your assumptions. Then, if you want to respond, you need to dig a little further. Usually, this involves using statistical techniques and data, all of which come with unjustified choices of their own. If someone then questions these, you need to dig into the details of how to choose between methods, or the details of how data is collected or measured, to reply.

Manifestly, no one goes that far into everything all the time. It would also be extremely stupid to do it because time is limited.
 
The pomposity around evidence really rears its ugly head here. It's not about evidence vs. no evidence, it's the standard of evidence in question. For most devout religious people, the evidence is generated from faithful witness and observation. This was the tradition for centuries in Europe, reinforced with scripture. You can't just act like what billions of people are doing is not real by waving your hand and dismissing it as non-evidentiary. You don't get out of it that easy, even if the current hegemonic epistemology is that of rationality in some areas, you don't get to pretend it's above subjectification. To me this demonstrates a lack of understanding of what epistemology is, possibly due to a mono-epistemological world view. Something as simple as learning other languages and cultures shows that there is not one way of seeing reality. People who have a multicultural understanding can switch between epistemologies at will, and each one seems as concrete as any other while you're immersed in it.

I don't see how you can talk about epistemology, which is highly varied, yet simultaneously talk about the limited epistemology of secular agnosticism. Why don't you (or Clifford) just own the fact that you have a certain standard of evidence that, when not met, creates an immoral or unethical standard for you?

The anti-vax piece is a nice little bait tossed in. Vaccines gave my son autism, probably due to sub-clinical meningitis. I have my master's degree in biology, so I am well trained in science. You may feign being part of the school of rationality, but I can tell you're not actually a scientist. If you were, you'd be part of scientific circles like I am, where people argue about everything all the time. The level of disagreement is vast, despite our similar training. That's because the school of rationality is not immune to epistemology. Scientists themselves are steeped in their cultures and individual realities. Logocentrists love to act like they are the supreme objective authority, but actually theirs is just one epistemology among many, and it disturbs them greatly to have this pointed out.

There is no epistemological responsibility. I don't need your permission to see the world a certain way; but even if I did, my way of seeing may not be entirely within my control. What is within my control is to engage in practices that enhance the common trust - social contracts if you will - and it's there that responsibilities play out. If I want to relate across commonly agreed reality with other people in a way that alienates me, deprives me of enjoyment or even inflicts harm, then I will suffer.

Your question is actually a social one, not an epistemological one. For all you know, each individual person has a slightly varied epistemology, despite the outward appearance of coherent culture. Just because we all play the role doesn't mean it holds equal meaning for everyone.

I think this deserves post of the year, or something like that. Well said!
 
Here is what I'm looking for: an epistemology that minimizes the acquisition of personally and socially deleterious beliefs, particularly those of the spiritual variety.

I want to suggest that this is not so much about the ethics of belief acquisition--that is, epistemic responsibility--but rather, just about ethics, tout court. When I'm not doing philosophy, I'm usually doing mystical or ritual practice in some form or other, and I've been doing it long enough to have become certain that there is something like God or the Great Spirit or Brahman or what-have-you (I like to think of it as the Vast Mystery), and that something within human beings interfaces with, communes with, partakes in that Vast Mystery. I have a myriad other beliefs as well, some of them occasioned by fantastic mystical visions I've been granted, some of them seeming to have been nudged into me from the outside. As a philosopher, I'm skeptical of these other beliefs--I'm aware they didn't come to me by a prima facie sound epistemic procedure. At the same time, some of them can be justified "on the back end," and some of them seem genuinely useful or helpful--for example, I believe that I am at the service, or at least bound to help, as many beings as I can before I die.

One question that occurred to me early on was this: how can I make sure I don't end up going the wrong way? I could listen to someone like Jim Jones or (later) Marshall Applewhite and hear some faint echoes of some things I think are correct, but they would of course become twisted into something vile, something obviously wrong. And yet, at some point along the mystical itinerary, the mystic must surrender utterly--and how could I tell that the powers to which I was surrendering wouldn't drive me insane, as had happened to these other individuals? The answer I decided upon was: ethics. If an angel appeared to me, commanding that I murder my neighbor, I would know that it was either a hallucination or some degenerate being--either way, something to be dismissed, for I could not possibly surrender control of myself to any being that demanded or even suggested such an act. This technique, I only learned later in my humanistic study, is very old: it was in use from at least the outset of Christianity, and is present in all three of the Abrahamic traditions.

But the basic idea is simple enough: we are now in possession of at least a few moral truths, of which we can be genuinely certain: it is wrong to murder another human being in cold blood; it is usually wrong to lie; it is wrong to rape; it is better to be humble than otherwise, etc. People may differ as to exactly what belongs on the list, but I think that the vast majority of human beings would agree on a core set of principles--and those are the ones I'm talking about. There are some ethical truths that are questionable (prescriptions of ritual purity, for example), and some that practically all of us would agree we just cannot do without--a kind of ethical "red line" if you will.

Beliefs that present themselves should be met with skepticism and tested against this kind of standard, and rejected if they violate it. I suspect if everyone could follow this test reasonably well, there'd be considerably less nonsense in the world. Of course it's not perfect. We are still learning ethical truths (including truths about belief-formation and acquisition).
 
I want to suggest that this is not so much about the ethics of belief acquisition--that is, epistemic responsibility--but rather, just about ethics, tout court. When I'm not doing philosophy, I'm usually doing mystical or ritual practice in some form or other, and I've been doing it long enough to have become certain that there is something like God or the Great Spirit or Brahman or what-have-you (I like to think of it as the Vast Mystery), and that something within human beings interfaces with, communes with, partakes in that Vast Mystery. I have a myriad other beliefs as well, some of them occasioned by fantastic mystical visions I've been granted, some of them seeming to have been nudged into me from the outside. As a philosopher, I'm skeptical of these other beliefs--I'm aware they didn't come to me by a prima facie sound epistemic procedure. At the same time, some of them can be justified "on the back end," and some of them seem genuinely useful or helpful--for example, I believe that I am at the service, or at least bound to help, as many beings as I can before I die.

One question that occurred to me early on was this: how can I make sure I don't end up going the wrong way? I could listen to someone like Jim Jones or (later) Marshall Applewhite and hear some faint echoes of some things I think are correct, but they would of course become twisted into something vile, something obviously wrong. And yet, at some point along the mystical itinerary, the mystic must surrender utterly--and how could I tell that the powers to which I was surrendering wouldn't drive me insane, as had happened to these other individuals? The answer I decided upon was: ethics. If an angel appeared to me, commanding that I murder my neighbor, I would know that it was either a hallucination or some degenerate being--either way, something to be dismissed, for I could not possibly surrender control of myself to any being that demanded or even suggested such an act. This technique, I only learned later in my humanistic study, is very old: it was in use from at least the outset of Christianity, and is present in all three of the Abrahamic traditions.

But the basic idea is simple enough: we are now in possession of at least a few moral truths, of which we can be genuinely certain: it is wrong to murder another human being in cold blood; it is usually wrong to lie; it is wrong to rape; it is better to be humble than otherwise, etc. People may differ as to exactly what belongs on the list, but I think that the vast majority of human beings would agree on a core set of principles--and those are the ones I'm talking about. There are some ethical truths that are questionable (prescriptions of ritual purity, for example), and some that practically all of us would agree we just cannot do without--a kind of ethical "red line" if you will.

Beliefs that present themselves should be met with skepticism and tested against this kind of standard, and rejected if they violate it. I suspect if everyone could follow this test reasonably well, there'd be considerably less nonsense in the world. Of course it's not perfect. We are still learning ethical truths (including truths about belief-formation and acquisition).

My skepticism tells me that you are not properly skeptical of the views you espouse in this post.
 
The pomposity around evidence really rears its ugly head here. It's not about evidence vs. no evidence, it's the standard of evidence in question. For most devout religious people, the evidence is generated from faithful witness and observation. This was the tradition for centuries in Europe, reinforced with scripture.

Your position poses a profound discriminatory problem, because a sufficiently permissive description will inevitably include things that we do not really share, such as the mental state a religious person might experience when they pray. In other words, the problem is that because I am not inside them, living their lives, I have to just take their word for it. There is no way I can reliably check for myself, even in principle: I cannot know if they lie, if they are mistaken in their recollection of the events, or if they were mistaken about the source or nature of their experience during the events. Even a lie detector will not solve this conundrum. I can also try, and possibly fail, to feel as they do, as millions of people seem to have in recent history. But even this absence would be problematic, because it is not something they can check in their turn.

Another point to be made is that the aforementioned way to cast the problem has the attractive property of putting everyone on the same level. If what comes to bear on a discussion are things we can all in principle check for our own benefit, the source of an idea is irrelevant. The content matters. That's one of the many problems with scripture: I have to assign it a peculiar authority that the same people would have me deny to the story of Peter Pan, for example. The distance between the story of Peter Pan and our experience of life certainly is not the discriminating factor: God is said to have far more power than flying only when the right kind of dust is tossed in the air. It is an argument from authority, and that's problematic because it puts some people above others.
 
Sorry but if someone tries to dictate the events of my life to me, I will respond every time.

For the benefit of others, empirical studies usually focus on a specific moment of the distribution of effects: one or another of the few interesting conditional mean values of that distribution***. While you could in principle look at many conditional quantiles, for example, you can never do it for the whole distribution. Every single person either receives treatment or does not, so you always need people in the opposite situation to span the space of the missing scenario.

So, finding little statistical evidence for something does not mean nothing is happening. One possibility is that the effect is so small that you would need much more data to detect something statistically different from zero. Another possibility is that the average treatment effect is indeed zero, but that doesn't mean it is zero for everyone: it could be positive for some people and negative for others in a way that cancels out on average.

*** To the best of my knowledge, medicine uses randomized trials almost exclusively, so you can estimate the average treatment effect (i.e., the difference in expected outcomes between the treatment and control groups). However, if you use field data as opposed to lab data, you might be looking at the average treatment effect on the treated (think difference-in-differences estimators) or the local average treatment effect (think instrumental variables and GMM estimators). It is rare, even in economics where you will see all of these, that people look at conditional quantiles, but you could wonder what happens to the lower third of the distribution, for example. It's just way more complicated.
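The cancellation point above can be illustrated with a small simulation (my sketch, not from the thread; all numbers and variable names are invented for illustration): a treatment that helps half the population by about +1 and harms the other half by about -1 has an average treatment effect near zero, even though it is zero for almost no individual.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical population: half are "positive responders" (+1 effect),
# half are "negative responders" (-1 effect), so the true ATE is 0.
responder = rng.integers(0, 2, n)                    # 1 = positive responder
individual_effect = np.where(responder == 1, 1.0, -1.0)

treated = rng.integers(0, 2, n)                      # random assignment
outcome = treated * individual_effect + rng.normal(0, 1, n)

# Naive difference in means estimates the ATE: should be close to 0.
ate_hat = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"estimated ATE: {ate_hat:+.3f}")

# Conditioning on responder type reveals large opposite effects.
effects = {}
for r in (1, 0):
    m = responder == r
    effects[r] = (outcome[m][treated[m] == 1].mean()
                  - outcome[m][treated[m] == 0].mean())
    print(f"responder={r}: subgroup effect {effects[r]:+.2f}")
```

A study that only reports the pooled difference in means would conclude "no effect," while both subgroups experience substantial effects of opposite sign.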
 
If we take the view that all beliefs must be classified in binary terms as either justified or not, we will always ultimately run into problems.

I agree actually, and Angel did an excellent job in this post parsing out the non-binary criteria for the Epistemic Responsibility that I'm looking for. I'll repeat them here (using Angel's words):

1. degree of evidence for a belief
2. behavior based on a belief
3. nature of behavior based on belief

You can see all three of those criteria are on a scale, and a subjective one at that. But a scale grounded in self-examination and critical thought-- even an inherently subjective one-- is far better than nothing at all.

Science is an extremely useful way to attack a problem, but it is also an extremely costly way to attack a problem. If I needed to suspend belief until the standards of science are met, I couldn't begin to do the first thing, including science.

I agree completely, and I also don't think what I'm proposing is anywhere near as costly as hard science.

What we seem to do, however, is make assumptions and forgo questioning them until someone disagrees with them, or until we run into the problems that inevitably pop up because our frameworks for making choices can only ever be approximately correct. That seems reasonable to me. You don't have to provide extensive research to support your political views, for example, until someone questions the validity of your assumptions. Then, if you want to respond, you need to dig a little further. Usually, this involves using statistical techniques and data, all of which come with unjustified choices of their own. If someone then questions these, you need to dig into the details of how to choose between methods, or the details of how data is collected or measured, to reply.

Except now we're talking about defending a position, versus deciding whether or not to take that position. Given the intense human propensity towards inertia of belief, challenging existing beliefs that have been resting in certainty for a long time is far, far more difficult than developing a system of holding those beliefs loosely and with care. The process of having a belief challenged is far less troublesome if we've already taken great caution acquiring the belief and are not piling weight on it.

I wouldn't ask someone to be "Epistemically Responsible" and take such extreme caution with every single thought or belief in their life. That, I agree, is too costly to function. But beliefs around spirituality seem particularly fraught and so I think they deserve a special class of caution.
 
Beliefs that present themselves should be met with skepticism and tested against this kind of standard, and rejected if they violate it. I suspect if everyone could follow this test reasonably well, there'd be considerably less nonsense in the world. Of course it's not perfect. We are still learning ethical truths (including truths about belief-formation and acquisition).

I agree with almost everything you said, up until this point. Many, many religious people acquire their ethics from their religion. Frequently this involves some version of divine command theory, where their god's or religion's tenets bump up against 'common sense' ethics, and their religion wins every single time. For most people, the big basics like not murdering or raping are pretty hard for a religion to knock over, but when it comes to the more subtle stuff, like concepts of purity or "God's plan", they are pretty much immune to the process you've described.

I'm not just talking about uneducated or ignorant people who are fed a religious doctrine and cling to it because that is all they know. I'm talking about reasonably educated, reasonably intelligent people who would NEVER permit themselves to use, in the rest of their lives, the epistemology they use for their religion--and this has serious consequences.

The pastor of a church who sends people off to gay conversion therapy to 'fix' them.
The imam who tells a woman she must cover herself in public to protect men.
The man whose empathy is blunted towards refugees because "suffering is part of God's plan" and "all this will soon pass away."
The woman who is deeply suspicious of all science because some science contradicts her holy scripture.

None of those people would likely be convinced by a vision from an angel to murder someone. Nevertheless, beliefs about sexual purity, the nature of suffering, and the primacy of scripture are all part of the ethic they use to measure challenges to their beliefs or the adoption of new beliefs. Simply testing beliefs against an ethical standard doesn't work well if the standard includes loopholes like divine command theory, or if it is bent by existing religious beliefs. Over time, new beliefs supported by old beliefs can turn a person's ethics inside-out.

I think even the best-intentioned thinkers need something a little more than just a gut-check of ethics.
 
I think the answer is better science education.

And by that, I don't just mean force-feeding kids a bunch of science facts and information. I mean getting them to start thinking like a scientist does. It is a particular kind of mindset that is at once very open-minded and yet highly critical. I think if you don't have the mindset, it's impossible to explain it philosophically.

How do you respond to Northern Light's assertion that within the scientific community, scientists of comparable repute often disagree on theories, levels of veracious evidence, etc? (paraphrase very approximate)
 
To be clear, those axioms are not mine--they are the invention of Mike McHargue, who created them as a scaffold for his re-entry into Christianity from atheism. Honestly, he's barely a Christian by the standard Evangelical definition. He's closer to an igtheist mystic in the Christian tradition, with a heavy, heavy lean towards physicalist explanations for the experiential components of his belief. (A very different kind of "Christian".)

I find these axioms an excellent start point for an epistemically responsible theism. Developing such a theism/religion necessitates more than just physicalist redefinition of terms, though. These offer a kind of bedrock for belief with a built-in anti-dogmatic, anti-doctrinal failsafe: uncertainty.

Dogma, doctrine, and orthodoxy are the moral virus in the veins of religion. Igtheistic uncertainty is the vaccine. Are you really going to blow yourself up in a plane if you're uncertain that you'll go to a virgin-laden paradise? Are you really going to torture gay people if you're uncertain that God has forbidden their homosexuality?

Perhaps a less virulent example: are you even going to make any effort to draw others into your beliefs if you make no claim to any exact nature of God beyond "it feels like God might be X", with no supposition that he is anything MORE than Axiom 2?

Could you clarify how igtheistic uncertainty differs from the trinity of William James's functions to which you referred earlier: that of "live/ forced/ momentous?"
 
Hmmm.

I don’t know. Maybe. I guess personally I just don’t have the religious bug, so it just seems like extreme mental gymnastics to try to make something stick that just doesn’t want to stick. I guess if you want to make it work that badly, this is one way to think about it.


But I think it’s a sort of optical illusion. God to me just seems like a sort of personification or apotheosis of the ultimate Platonic ideals (or should I say Plotinus’ The One). It’s a convergence of ideas and ideals taken to their abstract extreme, then personified, deified, and given some hypothetical external existence. I don’t see any use in that.

I would tend to trace God from the image of the "sky-father boss of everything" that we get in tribal societies, and to view God's assimilation of Platonic ideals and such to be secondary in nature.
 
There are a lot of things we THINK we need. But sometimes, all it takes is a little paradigm shift in our thinking, adopting a different perspective, to realize it falls away easily and we don't need it all that much after all. This idea of a "need to believe" and an "ultimate truth" as an inevitable part of human nature may just be a weird byproduct of certain traditional cultural paradigms and ways of thinking we have grown up with. But humanity has reached a stage where that is no longer compatible with the more useful ways of thinking we have learned just in the past 2-3 centuries. So it may be time to revisit the earlier ways of looking at things and see whether they were even the right ways in the first place.

It seems what you are describing here is just the discomfort, the cognitive dissonance, that comes from adopting more modern and useful ways of looking at the world, and finding that they are hard to reconcile with many of the old paradigms and worldviews. You are thinking there MUST be some way to reconcile the two. The inability to do so is making you uncomfortable. You want to salvage the old model in some way or other. You are using the perspectives and vocabulary of the new paradigms to look at the old ones, and they no longer make sense. But you feel like you need them, and wouldn't be able to live without them. You really don't. It's like the smoker who thinks he can't live without his cigarettes. Psychologists often use cognitive behavioral therapy (CBT) to show him that, by thinking about things slightly differently, he doesn't need to find clever new ways to fill that supposedly inevitable "need". He may find he doesn't need it at all.

But does the same paradigm shift work for every individual?

I appreciate the citation of the smoker's dependence, since by luck or design it reminded me of James's mention of a similar topic in VARIETIES OF RELIGIOUS EXPERIENCE.
 
How do you respond to Northern Light's assertion that within the scientific community, scientists of comparable repute often disagree on theories, levels of veracious evidence, etc? (paraphrase very approximate)

First, let's review what a theory is. A theory is a model that explains a set of facts--why we observe what we observe. The first part of thinking like a scientist is 'this is what we observe'. The next step is coming up with an idea about why we see what we see; that 'why' is known as a hypothesis. Then comes 'how do we test our hypothesis?'--through experiments and observations, and by making predictions about what will be found. If the predictions match what is seen, then the hypothesis is not falsified. As more is found and tested, the hypothesis grows stronger, until it becomes a theory. Disagreement is a way of asking 'was the test valid, and did confirmation bias creep in?'. Also, there could be alternate explanations for the same data, and part of the scientific process is eliminating alternate views. That's part of testing the model.
 
First, let's review what a theory is. A theory is a model that explains a set of facts--why we observe what we observe. The first part of thinking like a scientist is 'this is what we observe'. The next step is coming up with an idea about why we see what we see; that 'why' is known as a hypothesis. Then comes 'how do we test our hypothesis?'--through experiments and observations, and by making predictions about what will be found. If the predictions match what is seen, then the hypothesis is not falsified. As more is found and tested, the hypothesis grows stronger, until it becomes a theory. Disagreement is a way of asking 'was the test valid, and did confirmation bias creep in?'. Also, there could be alternate explanations for the same data, and part of the scientific process is eliminating alternate views. That's part of testing the model.

That's the paradigm, but whether it works in real life is my question.

Some hypotheses, for instance, cannot be tested so as to eliminate alternate views. Let's assume that you give credence to a particular theory of evolution that favors one particular mechanism by which evolution proceeds. You may believe that there is enough physical evidence to prove the superiority of your camp's mechanism over any other proposed mechanism. But how do you prove to another reasonably intelligent person, who favors another mechanism, that you are not guilty of confirmation bias, given that evolution cannot be placed in a centrifuge and sorted out?
 
Could you clarify how igtheistic uncertainty differs from the trinity of William James's functions to which you referred earlier: that of "live/ forced/ momentous?"

I realize this is like 6 months late, but this was a good question and deserves a reply.

James's Live / Forced / Momentous criteria are really just there to tell us when it is okay to believe something based on the Will to Believe, but they say virtually nothing about what we should believe. If those three criteria are the only filters, then one can arguably justify believing some truly insane things.

Igtheistic uncertainty (as I am using the term) applies to the degree of confidence that we should carry in regard to the specifics of a belief. It starts with a premise that virtually any concept we have about God is either so woefully insufficient to capture what it is we're trying to describe, or just so completely wrong to start, that we should not be focused on establishing some specific doctrine about who or what God is in order to capture 'truth'. Rather, the goal should be to accept that whatever we decide to believe is probably wrong in some significant way, while still choosing to believe it contains enough truth to justify believing in it, and therefore enjoy the benefits of that belief.

Mike McHargue's axioms are about as close as we can get to a relatively bulletproof conception of the divine and spirituality. Everything above and beyond that should be held with a looser and looser grip.

A practical example: I believe that the primary characteristic of God is love. I have only the slimmest, barest degree of totally subjective evidence to support this belief. I acknowledge and accept that I could be wrong, and I acknowledge that my evidence is not better or stronger than an atheist's belief that there is no God at all. Even saying "God is love" so woefully fails to describe anything truly 'true' about either God or Love that attempting to defend it really isn't useful to anybody. Having said all that, I believe it anyway, because the pragmatic benefits of believing it are so vastly superior to other equally unsupported beliefs about God (such as believing there is no God, or that God is angry and brutal.)

That's igtheistic uncertainty. I believe it, but I'm not totally certain, and I don't have to be.

The practical upshot of all this waffling is that it results in very real religious belief while simultaneously minimizing the downside risk of religious belief. There is no way I would be willing to take my "doctrine" of God being love out on the street and try to convince everyone else that I am right, because I am insufficiently certain-- and it's so fragile a belief that it doesn't have much argumentative power anyway. The most I am willing to do is share it as an idea to be considered by others, which is all any of us should really be doing.

I have seldom run into someone who has put so much thought into this issue. That seems like a very reasonable position.