This article is also available in Spanish here.

Learning and understanding what other people believe is truly fascinating. Sometimes, even people we think we know well surprise us when expressing an opinion on a certain issue. How we react when confronted with such differences of opinion says a lot about our level of maturity and the kind of person we are. How do we react when confronted with information that contradicts our views on important subjects? Do we tend to react calmly and level-headedly, judging with impartiality the evidence before us, stoically accepting whatever conclusion it points us to? Or are there certain emotions that tend to shield us from undesired realizations? What if those emotions lead us to reject important truths in our lives?

It is a fact that the truth often hurts. But in the end, I think most of us prefer it. Why is it, then, that after so many thousands of years of human history, consensus seems so hard to arrive at in so many different areas? Psychologists have known for years, many decades even, that there are certain built-in blinders in our minds that tend to lead us astray in our search for truth. Surprisingly, even on issues much less emotionally charged than one would expect, we all tend to be a lot less impartial than we would like to believe.

Philosophers like to use the term “epistemology” to refer to the study of knowledge. Yes, for many years, some of the brightest minds in the world have been attempting to define, understand, and explain what knowledge is, how it works, what limitations and capabilities we have in that regard, and so on. I certainly do not claim to have figured it all out, not even close. But I probably know more about this topic than most people (since most people hardly ever bother to even think about or ponder the issue).

Since everything I have written in this blog up to now has been related to the use of computers, I am going to try to use terms familiar to people who work with them. For example, we can compare our brain’s capacity for storing and categorizing information to a computer database, admittedly a vastly more advanced one. So, for example, we can say that one of the most important categories in our brain is the one reserved for “true facts”. In our brain’s “true facts” category we find stored the knowledge that we are alive, that we need water in order to live, and that we are aging. However, it is surprisingly hard to expand that list in a way that everyone agrees on.
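To make the database analogy a little more concrete, here is a minimal sketch in Python (an illustration of my own, not anything a real brain or a real database actually does) of what such a categorized collection of beliefs might look like:

    # A toy "beliefs database": each category holds a set of statements.
    # The category names and statements are purely illustrative.
    beliefs = {
        "true facts": {
            "we are alive",
            "we need water in order to live",
            "we are aging",
        },
        "falsehoods": set(),
        "undecided": set(),
    }

    def reclassify(statement, from_category, to_category):
        """Move a statement from one category to another,
        e.g. when new evidence forces us to change our mind."""
        beliefs[from_category].discard(statement)
        beliefs[to_category].add(statement)

    # Example: what happens to the Santa Claus "file" later in this post.
    beliefs["true facts"].add("Santa Claus exists")
    reclassify("Santa Claus exists", "true facts", "falsehoods")

Of course, a real mind is nothing like a Python dictionary. The point is simply that we all maintain something like a “true facts” category, and its contents differ surprisingly from person to person.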

For example, some people classify global warming as a true fact. Other people are not inclined to do that. Some people believe that privatization is a good thing. Others disagree. Some people believe that the Bible is the infallible “Word of God”. Some don’t. Why is that? How do those issues get into some people’s “truth category” but not into other people’s? Well, let’s examine (in a very simplified way) a couple of the most important ways in which we come to acquire knowledge.

Authoritative knowledge

Essentially, this is knowledge that we accept as true because it comes from trusted sources. Trusting that the people we know are giving us reliable information, especially those in positions of authority around us, is a normal and healthy attitude. There is so much to learn in life that we all need to rely on others for a lot of what we believe. Children rely on the grown-ups around them to form their view of the world. Students rely on their teachers. Teachers rely on historians and scientists. In fact, most of the important knowledge that most people have today has been acquired in this way.

Even historians and scientists have to rely on their peers and on the body of historical and scientific knowledge already available for a lot of what they believe to be true in their field. Does that mean that historians and scientists believe everything they are told by their peers? Well, of course not. Of all people, they are expected to be able to discern when a piece of knowledge handed to them as a truth is in reality false. However, that is not always easy to do. Sometimes, even they have been known to believe falsehoods.

One reason for this is that the more weight we give to the authority giving us a piece of information, the less likely we are to question it. If we believe that our parents always tell the truth, we are more than likely going to believe them when they tell us that Santa Claus exists and that he is going to bring us gifts every winter if we are good children. We tend to give more weight to a book written by someone with certain credentials than to a random Internet web page. Unfortunately, accepting information as true just because it comes from an authoritative source is not necessarily foolproof.

One famous example of this is that of Aristotle, considered by many to be one of the most intelligent people to have ever lived. In his book “A History of Knowledge”, Charles Van Doren states that “Aristotle taught us to reason about the world we see and know: he invented the science of logic… , the idea of the division of science into fields distinguished both by their subject matters and by their methods…” In spite of these and many other achievements, Van Doren also states that “the general refusal by Aristotle and his influential followers to accept the law of inertia stood as an obstacle to the development of physics for two thousand years”.

In time, scientists were able to see past Aristotle’s authority and to base their conclusions about inertia not on the words of authority, but on the evidence they had before them. Likewise, sooner or later, we all come to realize that Santa Claus does not really exist. We do this because we are presented with so much information that disproves it that we are forced to change our belief. We reach into our brain, open up the Santa Claus file, and throw it into the falsehoods bin. And that brings us to the second way of obtaining knowledge that I would like to discuss.

Evidentiary knowledge

We can define evidentiary knowledge as information that is convincingly demonstrated to be correct, even when it may be counter-intuitive. In other words, this is knowledge that we accept, not because it is coming from some authority figure, but because we have been convinced of its truthfulness by certain evidence. The evidence could be as simple as personal experience. For example, we accept as a true fact that “there are a lot of problems in the world” because we may have experienced some of these problems ourselves.

But, for the sake of argument, imagine that you have lived a very sheltered and privileged life, and that you have never had to deal with difficulties. Could you ever agree that there are a lot of problems in the world? Well, of course you could. You would be able to conclude this by observing that a lot of other people in the world do have a lot of problems. In other words, the evidence that there are problems in the world is convincing enough for anyone.

A more formal way of acquiring evidentiary knowledge is the scientific method. The scientific method is intended to be a more reliable way of acquiring true knowledge than simple observation or personal experience. In general, it involves the observation of a phenomenon, the formulation of a hypothesis to explain it, the devising of experiments to confirm or disprove said hypothesis, and the establishment of a conclusion based on the facts.
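Sticking with the computer analogy, the heart of the method can be sketched as a simple loop. This is again an illustration of my own; the function names and the coin-flip example below are purely hypothetical, not a real scientific workflow:

    import random

    def scientific_method(observations, hypothesis, run_experiment, trials=10):
        """Test a hypothesis by comparing its predictions against experiments."""
        for _ in range(trials):
            prediction = hypothesis(observations)
            result = run_experiment()
            if result != prediction:
                # A single disconfirming result is enough to reject the hypothesis.
                return "hypothesis rejected"
            observations.append(result)
        # Surviving every test does not prove the hypothesis once and for all;
        # it only earns it a provisional place in our "true facts" category.
        return "hypothesis supported (so far)"

    # Example: testing the (false) claim that a tossed coin always lands heads.
    verdict = scientific_method(
        observations=[],
        hypothesis=lambda obs: "heads",
        run_experiment=lambda: random.choice(["heads", "tails"]),
    )
    print(verdict)  # almost certainly "hypothesis rejected"

A single disconfirming result sends the idea to the falsehoods bin, while repeated confirmation only keeps it provisionally in the “true facts” category.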

Why is the scientific method needed? Well, because experience has shown that simple observation or personal experience can easily mislead humans into the wrong conclusion. For example, simple observation led men to believe that the Sun and the planets revolved around a stationary Earth. In time, through the use of mathematics and deductive reasoning, several men were able to find problems with that idea. However, a convincing and satisfying explanation eluded them. It wasn’t until the arrival of the telescope, and the abundance of verifiable data that came with it, that the truth became easier to discern. They were now able to use the scientific method to test the predictions that a heliocentric Solar System would produce. And of course, they found it to be true. Now they could prove it!

In theory, the whole body of scientific knowledge should consist of this type of verifiable knowledge. Unfortunately, again, even scientists have fallen prey to incorrect knowledge many times. How can this be? Well, you see, scientists are human too. They can sometimes, for example, fail to examine a certain key piece of evidence. Or it may be the case that some of the evidence they examined was falsified or misinterpreted. It could also be the case that an unconscious bias skews their perception of the evidence before them.

The result of this last effect, the power of our biases to affect our acquisition of knowledge, is what my next blog entry will discuss. However, in the meantime, why don’t we all try to see how good we are at dealing with differing opinions? Let me leave you with a challenge: why don’t you ask a few people around you what their opinions are on a few hot topics, such as morality, politics, and yes, religion? Please be respectful, and really try to understand the reasons they have for holding those viewpoints. And, if you happen to have a surprising experience, or even an unexpected revelation, I would love to hear about it.
