Knowledge and Belief
A (propositional) knowledge (that) ascription logically entails a belief ascription, right? I mean if I think that Sam knows that Joe Biden is the president of the United States, I don’t need to do further research into Sam’s state of mind or behavioral manifestations to conclude that they also believe that Joe Biden is president of the United States. For any proposition or piece of “knowledge-that,” if I state that an agent X knows that q, I am entitled to conclude by virtue of logic alone that X believes that q.
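Put in the notation of epistemic logic (a gloss added here for compactness, not notation used in the papers discussed below), the entailment thesis amounts to the following schema:

```latex
% The entailment thesis: every knowledge ascription carries a belief ascription with it.
% K_x q reads "agent x knows that q"; B_x q reads "agent x believes that q".
% The symbols K_x and B_x are illustrative shorthand, not drawn from the cited papers.
\[
  \forall x \, \forall q \; \bigl( K_x q \rightarrow B_x q \bigr)
\]
```

On this view, affirming the antecedent while denying the consequent, saying that someone knows that q but does not believe that q, should sound like a contradiction. The experimental results reviewed below suggest that the folk do not hear it that way.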
This, in a nutshell, has been the standard position in analytic epistemology and philosophy of mind. The entailment of belief from knowledge has been considered so obvious that hardly anyone thinks it needs to be argued for or defended (it is treated as falling closer to the “analytic” end of the Quinean continuum). Most of the work on belief by epistemologists has therefore focused on the conditions under which belief can be justified, not on whether an attribution of knowledge necessarily entails an attribution of belief to an agent.
Of course, analytic philosophers are inventive folk, and there have been attempts (starting around the 1960s), pursued via the thought-experiment route, to come up with hypothetical cases in which the attribution of belief from knowledge does not come so easily. But most philosophers protested against these made-up cases, denying that they in fact showed that one could attribute knowledge without attributing belief. Some of the debate, as with many philosophical debates, ultimately turned on philosophical method itself; perhaps the inability of professional philosophers to imagine non-contrived cases in which we can attribute knowledge without belief rests on the rarefied air that philosophers breathe and the correspondingly restricted set of examples they can imagine.
Myers-Schulz & Schwitzgebel (2013), thus follow a recent trend of “experimental philosophy,” in which philosophers burst out of the philosophical bubble and just confront the folk with various examples and ask them whether they think that those examples merit attributions of knowledge without belief. One of these examples (modified from the original ones proposed from the armchair) has us encountering a nervous student who memorizes the answer to tests, but when it comes to actually answer, gets nervous at the last minute, blanks out, and just guesses the answer to the last question in the test, which they also happen to get right. When regular old folks are then asked whether this “unconfident examinee,” knew the answer to this last question, 87% say yes. But if they are instead asked (in a between-subjects set up) whether the unconfident examinee believed the answer to the last question only 37% say yes (Myers-Schulz & Schwitzgebel, 2013, p. 378).
Interestingly, the same folk dissociation between knowledge and belief ascriptions can be observed when people are exposed to scenarios of discordance between explicit and implicit attitudes, or of dissociation between rational beliefs that everyone would hold and irrational, fantastic beliefs induced in the moment by watching a horror movie. In the “prejudiced professor” case, we have a professor who reflectively holds unprejudiced attitudes and is committed to egalitarian values, but who in their everyday micro-behavior systematically treats student-athletes as if they were less capable. In the “freaked out movie watcher” case, we have a person who has just watched a horror movie in which a flood of alien larvae comes out of faucets and who, after watching the movie, freaks out when their friend opens the (real-world) faucet. In both cases, the great majority of the folk attribute knowledge (that student-athletes are as capable as other students, and that only water would come out of the faucet), but only relatively small minorities attribute belief. Other cases have been concocted (e.g., a politician who claims to hold a certain set of values but fails to act on them, for instance by never advocating for policies that would further them), and these cases also generate the dissociation between knowledge and belief ascription among the folk.
Solving the Puzzle
What’s going on here? Some argue that it comes down to a difference between so-called dispositional and occurrent belief. These are terms of art in analytic philosophy, but the difference boils down to that between a belief that you hold but are not currently entertaining (though you could entertain it under the right circumstances) and one that you are currently entertaining. The former is a dispositional belief and the latter is an occurrent belief. When you are asleep, you dispositionally believe everything that you believe when you are professing wide-awake beliefs. So maybe the folk deny that, in all of the cases above, people who know that x also occurrently believe that x, while not denying that they dispositionally do so. Rose & Schaffer (2013) find support for this hypothesis.
Unfortunately for Rose & Schaffer, a subsequent series of experiments (Murray et al., 2013), show that knowledge/belief dissociation among the folk are pervasive, applied more generally than originally thought, in ways that cannot be easily saved by applying the dispositional/occurrent distinction. For instance, when asked whether God knows or believes a proposition that comes closest to the “analytic” end of Quine’s continuum (e.g., 2 + 2 = 4), virtually everyone (93%) is comfortable attribute knowledge to God, but only 66% say God believes the trivial arithmetical proposition. Murray et al., also show that people are much more comfortable attributing knowledge, compared to belief, to dogs trained to answer math questions, and cash registers. Finally, Murray et al. (2013, p. 94) have the folk consider the case of a physics student who gets perfect scores in astronomy tests, but who had been homeschooled by rabid Aristotelian parents who taught them that the earth stood at the center of the universe and who never gave up allegiance to the teachings of his parents They find that, for regular people, the homeschooled heliocentric college freshman who also gets an A+ on their Astronomy 101 test knows the earth revolves around the sun but doesn’t believe it.
So something else must be going on. In a more recent paper, Buckwalter et al. (2015) propose a compelling solution. Their argument is that the folk conception of belief is not unitary, whereas professional epistemologists, by contrast, do hold a unitary conception of belief. More specifically, Buckwalter et al. argue that professional philosophy’s concept of belief is thin:
A thin belief is a bare cognitive pro-attitude. To have a thin belief that P, it suffices that you represent that P is true, regard it as true, or take it to be true. Put another way, thinly believing P involves representing and storing P as information. It requires nothing more. In particular, it doesn’t require you to like it that P is true, to emotionally endorse the truth of P, to explicitly avow or assent to the truth of P, or to actively promote an agenda that makes sense given P (749).
But the folk, in addition to countenancing the idea of thin belief, can also imagine the notion of thick belief (on thin and thick concepts more generally, see Abend, 2019). Thick belief contrasts with thin belief along all of the dimensions just mentioned. Rather than being a purely dispassionate or intellectual holding of a piece of information taken to be true, a thick belief “also involves emotion and conation” (749, italics in the original). In addition to merely representing that P, thick believers in a proposition will also want P to be true, will endorse P as true, will defend the truth of P against skeptics, will try to convince others that P is true, will explicitly avow or assent to P’s truth, and the like. Buckwalter et al. propose that thick and thin beliefs are two separate categories in folk psychology, that thick belief is the default (folk) understanding, and that the various knowledge/belief dissociation observations can therefore be made sense of by cueing this distinction. In a series of experiments, they show that this is precisely the case. Returning to (some of) the cases discussed above, they show that belief ascriptions rise (most of the time to match knowledge ascriptions) when people are given extra information or a prompt cueing the thin sense of belief on the part of the believing agent.
Thin and Thick Belief in the Social Sciences
Interestingly, the distinction between thin and thick belief dovetails with a number of distinctions that have been made by sociologists and anthropologists interested in the link between culture and cognition. These discussions have to do with differences in the way people internalize culture (for more discussion on this, see here). For instance, the sociologist Ann Swidler (2001) distinguishes between two ways people internalize beliefs (knowledge-that), but uses a metaphor of “depth” rather than thickness and thinness (on the idea of cultural depth, see here). For Swidler, people can and often do internalize beliefs and understandings in the form of “faith, commitment, and ideological conviction” (Swidler, 2001, p. 7); that definitely sounds like thick belief. However, people also internalize much culture “superficially,” as familiarity with general beliefs, norms, and cultural practices that do not elicit deeply held personal commitment (although they may elicit public acts of behavioral conformity); those definitely sound like thin beliefs. Because deeply internalizing culture is hard and superficially internalizing culture is easy, the amount of culture that is internalized in the more superficial way likely outweighs the culture that is internalized in the “deep” way. In this respect, “[p]eople vary in the ‘stance’ they take toward culture—how seriously versus lightly they hold it.” Some people are thick (serious) believers, but most people’s stance toward a lot of the culture they have internalized is more likely to range from ritualistic adherence (in the form of repeated expression of platitudes and clichés taken to be “common sense”) to indifference, cynicism, and even insincere affirmation (Swidler, 2001, pp. 43–44).
In cognitive anthropology (see Quinn et al., 2018a, 2018b; Strauss, 2018), an influential model of the way people internalize beliefs, due to Melford Spiro, also proposes a gradation of belief internalization that matches Buckwalter et al.’s distinction between thin and thick belief and Swidler’s distinction between deep and superficial internalization (without necessarily using either metaphor). According to D’Andrade’s summary of Spiro’s model (1995, p. 228ff), people can simply be “acquainted with some part of the cultural system of representations without assenting to its descriptive or normative claims. The individual may be indifferent to, or even reject these claims.” Obviously, this (level 1) internalization does not count as belief, not even of the thin kind (Buckwalter et al., 2015). At internalization level 2, however, we get something closer. Here “cultural representations are acquired as cliches; the individual honors their descriptive or normative claims more in the breach than in the observance.” This comes closest to Buckwalter et al.’s idea of thin belief (and Swidler’s notion of “superficially internalized” culture), though it is likely that some people would not count this as a full-blown belief. We get there at internalization level 3. Here, “individuals hold their beliefs to be true, correct, or right…[beliefs] structure the behavioral environment of actors and guide their actions.” This seems closer to the notion of belief held by professional philosophers, and it is likely the default version of a belief on its way to thickening: not just a piece of information represented by the actor and held as true on occasion (as in level 2), but one that systematically guides action. Finally, Spiro’s level 4 is the prototypical thick belief in Buckwalter et al.’s sense. Here “cultural representations…[are] highly salient,” capable of motivating and instigating action. Level 4 beliefs are invested with emotion, which is a core marker of thick belief (Buckwalter et al., 2015, p. 750ff).
Implications
Interestingly, insofar as some influential theories of the internalization of knowledge-that in cultural anthropology and sociology draw the thick belief/thin belief distinction, a distinction that, as the research reviewed above shows, the folk also respect, holding a unitary (or non-graded) notion of belief may well be an idiosyncrasy of the philosophical profession. Both sociologists and anthropologists have endeavored to produce analytic distinctions in the ways people internalize belief-like representations from the larger cultural environment, distinctions that more closely match folk usage. This would indicate that many “problems” in conceiving of cases of contradictory or in-between beliefs (Gendler, 2008; Schwitzgebel, 2001) may have been as much iatrogenic as conceptual.
As also noted by Buckwalter et al., the thin/thick belief distinction might be relevant for debates raging in contemporary epistemology and psychological science over the most accurate way to conceive of people’s typical belief-formation mechanism. Is it “Cartesian” or “Spinozan”? The Cartesian picture conforms to the usual philosophical model: before believing anything, I reflectively consider it, weigh the evidence for and against, and, if it meets other rational considerations (e.g., consistency with my other beliefs), then I believe it. The Spinozan belief-formation mechanism proposes an initially counterintuitive picture, in which people automatically believe every piece of information they are exposed to, without reflective consideration; only un-believing something requires conscious effort and consideration.
The Descartes/Spinoza debate on belief formation dovetails with a debate in the sociology of culture over whether culture is structured or fragmented (Quinn, 2018). The short version of this debate is that sociologists like Swidler think that (most) culture is internalized in a superficial way and therefore operates as fragmented bits and pieces that are brought into coherence via external mechanisms (Swidler, 2001). Cognitive anthropologists, on the other hand, adduce strong evidence in favor of the idea that people internalize culture in a more structured manner. There is definitely a problem of talking past one another in this debate: it seems like Swidler is talking about beliefs proper, while Quinn is talking about other, non-doxastic forms of knowledge. This last kind can no longer be considered propositional knowledge-that but comes closer to (conceptual) knowledge-what.
Regardless, it is clear that if the Spinozan story is true, then beliefs cannot be internalized as a logically coherent web and therefore cannot exert an effect on action as such. Instead, the mind (and the beliefs therein) is fragmented (Egan, 2008). DiMaggio (1997), in a classic paper in culture and cognition studies, drew precisely that implication from Daniel Gilbert’s research program, which shows that people seem to internalize (some) beliefs via Spinozan mechanisms. For DiMaggio, this supported the sociological version of the fragmentation of culture: if beliefs are internalized as fragmented, disorganized, barely considered bits of information, then whatever coherence they have must come from the outside (e.g., via institutional or other high-level structures), just as Swidler suggests (DiMaggio, 1997, p. 274).
But if Buckwalter et al.’s distinction tracks an interesting distinction in kinds of belief (as suggested by Spiro’s degrees-of-internalization story), then it is likely that the fragmentation argument applies only to thin beliefs. Thick beliefs, on the other hand, the ones that people are most motivated to defend, that are imbued with emotion, that people are least likely to give up, and that are most likely to guide their actions, are unlikely to be internalized as incoherent bits of information that people just “coldly” represent or consider.
References
Abend, G. (2019). Thick Concepts and Sociological Research. Sociological Theory, 37(3), 209–233.
Buckwalter, W., Rose, D., & Turri, J. (2015). Belief through thick and thin. Noûs, 49(4), 748–775.
DiMaggio, P. J. (1997). Culture and Cognition. Annual Review of Sociology, 23, 263–287.
Egan, A. (2008). Seeing and believing: perception, belief formation and the divided mind. Philosophical Studies, 140(1), 47–63.
Gendler, T. S. (2008). Alief and Belief. The Journal of Philosophy, 105(10), 634–663.
Murray, D., Sytsma, J., & Livengood, J. (2013). God knows (but does God believe?). Philosophical Studies, 166(1), 83–107.
Myers-Schulz, B., & Schwitzgebel, E. (2013). Knowing that P without believing that P. Nous , 47(2), 371–384.
Quinn, N. (2018). An anthropologist’s view of American marriage: limitations of the tool kit theory of culture. In Advances in Culture Theory from Psychological Anthropology (pp. 139–184). Springer.
Quinn, N., Sirota, K. G., & Stromberg, P. G. (2018a). Conclusion: Some Advances in Culture Theory. In N. Quinn (Ed.), Advances in Culture Theory from Psychological Anthropology (pp. 285–327). Palgrave Macmillan.
Quinn, N., Sirota, K. G., & Stromberg, P. G. (2018b). Introduction: How This Volume Imagines Itself. In N. Quinn (Ed.), Advances in Culture Theory from Psychological Anthropology (pp. 1–19). Springer International Publishing.
Rose, D., & Schaffer, J. (2013). Knowledge entails dispositional belief. Philosophical Studies, 166(S1), 19–50.
Schwitzgebel, E. (2001). In-between Believing. The Philosophical Quarterly, 51(202), 76–82.
Strauss, C. (2018). The Complexity of Culture in Persons. In N. Quinn (Ed.), Advances in Culture Theory from Psychological Anthropology (pp. 109–138). Springer International Publishing.