There is no way to know everything. In today’s information-rich society, there is no way even to know everything that is known. My instincts (which are sometimes, I grant you, pessimistic) tell me there is no way even to form an intelligent-sounding opinion about (I would estimate) 90% of the things you have to have some opinion about to make everyday choices (is this cosmetic product safe? should I eat organic? should I recycle? what nonhuman animals are conscious? how accurate is this news source? what makes a person a person? where will the future lead us technologically? what is the most effective way to read and remember what I read? how do I decide when to bend the rules for a customer? what makes the economy better?)—or, at least, that doing so would eat your life and you’d have no time to focus on what you actually care about.
OK, there is one way: accepting the authority of experts. Apparently, studies show that humans make better decisions in groups than individually. So: take the advice of doctors, of researchers, of scientists, of thinkers who are generally accepted in their expert communities. There are, though, many problems with this.
The first is that the experts rarely agree, and it’s hard to figure out who the most legitimate experts are, and sometimes the experts don’t have a clue and know it (my economist roommate once told me that only non-economists think they understand the economy at all; I believe this).
Let’s semi-sort-of-solve this by saying “use your brain’s best heuristics and your best research skills to weigh the legitimacy of different expert points of view and figure out what the consensus is on every last one of these darn questions.” I would like to know how this works in the long run—one suspects it would work better than going by “gut,” “common sense,” or such things, even though my heart rebels, and even though it sounds far too exhausting and boring for me to actually try to put into practice.
The second, though, is that the experts sometimes seem (to you) not a fallible but still superior source of information but rather obviously full of it. Often even dangerously full of it. And sometimes (I wish I knew whether, for particular individuals or for people in general, this was true more often than chance would allow) you are right. Freud, whom a generation of intellectuals took quite seriously, and who worked from purely anecdotal evidence, some of which, if I remember correctly, he doctored, and whose theories are no longer taken seriously except in the broadest sense (i.e., “we don’t know everything that happens in our own brains”), comes to mind. (And C.S. Lewis, who anxiously questioned Freud with what looks in retrospect like very good sense.) When can you accept your own objections to a consensus you are exposed to (for an example from my own experience of something I at least perceived as a consensus: “everything, up to and including truth, is a social construct”)? How obvious do experts’ blunders or “blunders” have to seem to you before you reject their positions?
Third is an emotional objection that has nothing, or very little, to do with the quest for truth and a lot to do with the individualism I’ve been steeped in from childhood; I suspect it is illegitimate, but it still matters to me emotionally. I would much, much, much rather read or listen to someone who thinks for themselves than someone who speaks out of received knowledge but does not dare to analyze it. Thinking is fun and pretty and exciting. Accepting what you’re told is dull and infuriating and, for me, sometimes almost impossible. Too little intellectual humility can make you miss nine-tenths of the world, because you’re not willing to question your own instincts and assumptions, but too much can make you (even if it does mean you’re likelier to be right; I so wish I knew whether this was true)…a bore. (*Gasp*, what a terrible fate!)
The way to not be a bore (which is, yes, a very stupid goal) seems to be to stay on the edge of conventional wisdom: not so far beyond it that no one understands or relates to what you’re saying and you look like a crackpot, but not so deep within it that you have nothing new to say.
But it is a stupid goal. A less stupid goal? Charity toward people with different ways of thinking. I’ve read interesting crackpots who have elaborate theories about topics they know nothing about. These theories tend to be, however complex and interesting to me, obviously wrong in some ways and cliché in others, because they arose without a sense of context. People tend to single such thinkers out for mockery, to think they’re stupider than the people who accept what they’re told. I’ve been in environments where any out-of-the-box thinking is met with raised eyebrows and snorts, something I’m guilty of myself. I’m also guilty (as you can see) of getting frustrated with conformists of any doctrine, even though to a significant extent I am one. Best to try to understand others.