
A word to the wise



El Che
15th September 2002, 06:30
One thing I try to do when thinking, and also when considering what other people have thought, is to have method. Naturally the method I prefer and try to achieve is a rational method: systematic, impartial, all-sided, rational thinking. Not to say that I achieve it, but merely that it is an honest objective. I would hope we all have this objective, but sometimes I wonder about some people.

Too often, it seems to me, people rather than being open and "rational" are dead set in beliefs and a priori conclusions. People incapable of actually putting their beliefs in question no matter what new evidence they are presented with. Honestly, we see it all the time and I know you see it too. Perhaps we will even see it in ourselves if we are honest enough.

With that in mind, I came across this interesting and accessible article on the very subject. Nothing bombastically insightful, but simple and to the point. Read on if you're interested.

------------

Rationalism
by
Michael Albert


Last issue (see "Anti Rationalism" (http://zena.secureforum.com/Znet/zmag/allarticles.cfm)) I rebutted the claims of people I ungraciously called "irrationalists." I used a variety of arguments and examples to show how their opposition to scientific thinking and "Western rationality" was ill-conceived and reactionary. This month, since a rejectionist broom can leave a possible void, I'm going to try to more positively define and defend "being rational," particularly for political analysis. So, what constitutes being rational?

(1) We know it can't mean being correct. Often people are rational and wrong. Often two rational people reach opposite conclusions, both of which can't be right. Being rational therefore doesn't mean always attaining the truth.

(2) We also know there are different degrees of "being rational." We are all easily rational up to a point but must exert considerably more effort to attain higher levels of precision.

(3) We know that being rational goes beyond being logical. We've all dealt with people who we considered irrational even at a moment when they were carefully obeying the rules of logic. Being logical does not alone constitute being rational.

(4) Finally, we know that being rational differs from being dogmatic. It includes recognizing the possibility of being wrong and being willing to test one's claims over and over.

A good definition of being rational has to explain the above points. The dictionary tells us that to be rational is "to draw inferences logically from facts known or assumed." As a first approximation, this is quite good. For with this as our definition, "being rational" would lead to truthful, accurate conclusions only when the "facts known or assumed" are true and sufficiently encompassing, and when we understand them sufficiently so we don't introduce errors while using them to logically draw inferences. As a result, with this definition of rationality the above four conditions are met.

We can certainly get wrong inferences even while being rational. For example: The things that we think are true facts might be false, leading to wrong inferences. Or we might think our facts represent all that is critical where actually additional facts negate the inferences we draw. Or we may draw inferences logically, but on the basis of a wrong understanding of some of the true facts. Likewise, with this definition, increasing degrees of "being rational" correspond to taking greater care in ascertaining relevant facts and carefully drawing logical inferences. Obviously this can increase the likelihood of our inferences being true. Also, being logical isn't sufficient to make one rational because good logic applied to incorrect or incomplete information will only accidentally yield truthful inferences. And finally, since true inferences depend on true facts and logical analyses, rational claims are testable rather than having to be accepted or rejected solely on faith. The originator and evaluator of any claim can and should test it, and, when new evidence comes to the fore, test it again.

Also with this definition, people are rational, at least to some extent, almost all the time. After all, in almost everything we do, we "draw inferences logically from facts known or assumed." Errors are common, despite the fact that people are usually at least somewhat rational, for the reasons noted above. Moreover, biases, in the form of believing misconceived facts or letting desires or fears overwhelm our capacities for logic, are also common. When these biases are extreme, of course, we label the result irrational. If it's self-conscious, we also call it hypocrisy, manipulation, dishonesty, etc.

So why is the dictionary definition only a "first approximation"? Something is missing: hypotheses, which, when complex, are often called theories. When we are rational, contrary to what the dictionary definition implies, we don't always go from facts via logic to inferences. Instead, often we get the inferences ahead of time and only then try to validate them by way of facts and logic. For example, we might have a hunch, an intuition, or a guess at a hypothesis. We might use an analogy to settle on a hypothesis. Or we might be offered a hypothesis by someone else. The point is, often we first have some claim about the world—a hypothesis—and only then see if we can find a set of facts from which we can use logic to infer the truth of the claim. This is an important addition because it highlights the difference between thinking up a hypothesis or theory as the first step, and then deciding whether it is true as the second. For the first step, unlike what the dictionary definition implies, anything goes, including analogies, hunches, extrapolations, intuitions, guesses, poetic flights of fancy, brainstorming, and even random rearranging of concepts or notions. For the second step, however, the dictionary definition comes more into play as the rules of evidence and logic take precedence.

With this clarification, we can see that when we move from daily life to science the only change that occurs in our mental orientation is that our methods for removing biases and pursuing greater precision become more disciplined. Philosophers of science try to write down methods for this, but scientists rarely if ever pay attention to their efforts. Instead, scientists learn the methods of science by emulating their teachers as part of learning their craft. Still, however difficult it may be to precisely enumerate the various ways scientists check their work, we know that at the heart of the matter is the scientist's commitment to obeying "the rules of evidence" and respecting the priority of repeated experiments, ongoing experience, and careful logic. In general, the closer we come to a scientific stance the less likely we are to include false facts, leave out relevant facts, misinterpret true facts, draw illogical inferences, or let biases bend our criteria for accepting facts or making inferences.

And that's it. Unless we want to get into philosophical nit-picking, there is nothing mysterious or complex about being rational or, for that matter, scientific. It is, however, sometimes hard work.

To be rational is simply one among many capacities associated with being human. To exert rationality, at least to a degree, comes naturally. To attain a scientific standard of rationality, on the other hand, requires more discipline. Moreover, the fact that we can employ a gradation of discipline in our rationality is quite lucky since if we had to test our data for biases, cross check it for completeness, and wait for confirmation from others who undertake related experiments just to decide it was okay to cross the street upon seeing no cars coming, we would never get to the other side. On the other hand, if we could only attain the spontaneous level of rationality associated with looking both ways and then doing the right thing, we would never have attained an understanding of physics sufficient to use electricity, a knowledge of biology sufficient to discover antibiotics, a knowledge of chemistry sufficient to make various cleansers, etc.

So how much rationality, from the street-crossing level to the scientific, do we need for making political judgments?



A Case Study in Rationality

Suppose we are offered a controversial hypothesis—for example, "JFK would have ended the Vietnam War had he only lived a little longer. He was killed for this reason. Moreover, his murder has drastically altered the nature of government and life in the U.S. ever since." How do we decide whether to accept or reject it?

A lot depends on the store of relevant background information we have at our disposal. What are our available "facts known or assumed" that we can draw inferences from? If among these we have a repeatedly verified theory about social relations and the JFK hypothesis is sharply contrary to that theory, we can quickly reject it. This is like not crossing the street if we see a car coming. The problem, of course, is that however often this is a valid and efficient way to respond, if our theory is actually false for this context, our inference will be false as well and we will have misjudged the hypothesis. Worse, this dynamic can cause a well-meaning but nonetheless stultifying sectarianism in which we let a previously convincing theory (e.g., marxism or neo-classical economics) dictate our current and future actions without repeatedly assessing new evidence that our theory may be flawed.

The alternative to a reflex rejection of the JFK hypothesis solely because it flies in the face of our preferred theoretical understanding is to re-test our theory by showing in detail a line of argument that leads from more basic "facts known or assumed" to the hypothesis's renunciation. This way we have the possibility of discovering that our theory has a problem or, alternatively, we can explain our opposition to the hypothesis in a testable way that doesn't presume agreement about broader theoretical matters. Then, instead of dogma that supporters of the hypothesis can only ignore or succumb to as "received wisdom," they encounter a careful multi-step argument whose logic and premises they can test. At that point, they can either continue the debate by carefully evaluating the argument, or they can move into sectarian mode, ignoring the argument or dismissing it by fiat, but not by way of additional evidence or logic.

It sounds pretty abstract, but in practice it's quite comprehensible and relevant, as a couple of examples might clarify.

Someone might believe the JFK hypothesis from intuition, a guess, watching Oliver Stone's movie, reading Mark Lane's book, investigating documents, etc. In contrast, Chomsky, for example, has a store of "facts known or assumed," including some relevant knowledge about the historical period as well as a broader theoretical understanding of how the government and society work, that together lead him to quickly reject the JFK hypothesis as contradicting both how the real world works and known facts about the period. For Chomsky, it may even be hard to conceive how a serious long-time student of social relations could believe the JFK claim. So what does Chomsky, or anyone in his position, do?

One option is to just ignore the claim or dismiss it out of hand. This has the virtue of being quick and trouble free and of seeming proportional to the worth of the hypothesis. But what if the hypothesis persists? Suppose many people begin to take it quite seriously. Now what?

The next option is to briefly summarize a theory of society and point out that the hypothesis is inconsistent with that repeatedly verified theory and therefore must be wrong. This is okay at the beginning of an exchange, but at a late stage it would be sectarian. It asks the adherents of the now widely held hypothesis to drop it merely because it contradicts a theory someone else, or even they, previously believed. But why should they do this? Why shouldn't they instead say, "Hold on, our new claim is true; it's your old theory that's refuted. We are open to new insights. You are obstinate and sectarian about your old ways"?

So, the third option is to offer a comprehensive argument, with detailed logic leading from testable facts through to the rejection of the hypothesis itself.

Well, if you take a look at Chomsky's article in this issue, that's what it does. It attempts to refute the JFK hypothesis in a rational way that needn't be taken on faith but can instead be evaluated by each reader. Moreover, it attempts to present a comprehensive enough argument so that either the reader will find that some of the "facts known or assumed" in the argument are actually false, or that some of the logic employed is faulty, or that the argument refutes the JFK hypothesis. In this way, Chomsky takes the debate away from a clash of unsupported dogma and moves it into a more rational context of testable evidence and inference.

Roughly, Chomsky assesses the facts that others offer to support the JFK hypothesis and shows how they are either false or misinterpreted. He then offers counter-facts, detailing where he gets them from and why they can be believed. He also considers various implications of the JFK hypothesis, for example, for what should have followed after LBJ took over, and shows how these too contradict what the JFK hypothesis entails. Likewise, he shows that JFK's overall behavior, not just regarding the war but regarding broader issues of foreign and domestic policy as well, contradicts the JFK hypothesis. And so, when we finish the piece, we are left to wonder, what will advocates of the JFK thesis do next?

To their JFK hypothesis, Chomsky has countered with another:

"Basic policy towards Indochina developed within a framework of North-South/East-West relations that Kennedy did not challenge... [it] remained constant in essentials: disentanglement from an unpopular and costly venture as soon as possible, but after victory was assured.... Tactics were modified with changing circumstances and perceptions. Changes of administration, including the Kennedy assassination, had no large-scale effect on policy, and not even any great effect on tactics, when account is taken of the objective situation and how it was perceived."

Chomsky also offers facts substantiating his new hypothesis and simultaneously refuting the JFK hypothesis. Will the JFK theorists show that Chomsky's facts are false? Will they show that his logic is faulty? Will they simply ignore his case or proclaim it a priori reactionary, repeating their prior claims with no real reply to Chomsky's argument? Or will they admit that Chomsky's case is valid and drop their hypothesis? These are their four options and the four options that generally arise in each new round of a debate. The first two and the last are rational. The third is not.

Now the issue for this article isn't who's right and who's wrong. Rather it's that we can take any other controversial hypothesis and treat it similarly to how Chomsky treats the JFK hypothesis. For example, consider these hypotheses:

-In the search for a better way to organize our economy, we should incorporate markets because this is the most efficient and productive and least detrimental way to achieve orderly allocation on a large scale.

-In the effort to convey radical ideas and visions to a broad audience, we should contour our actions and language to be easily reported through mainstream print and video media.

-Leninist organization is well suited to revolutionizing life in the U.S.

-If our society is going to become less oppressive, it will primarily happen through electoral change.

-Radicals need money to win change and wealthy people have it, so radicals eager for change should center their fund-raising efforts on wealthy constituencies.

In every case different kinds of exchanges can occur. Adherents and doubters of each hypothesis can shout their preferences back and forth, with no recourse to evidence and logical inference, no one ever changing their mind, no one learning anything, etc. Indeed, even if one party to the exchange moves on to a more rational stance open to communication, it takes two to tango. There will be no progress without both sides transcending dogma.

However, just being rational isn't always enough. For example, each side could rationally argue for its position based on a preferred theory, noting that the theory sustains the claim. But now the debate goes back a step. Why should either side believe the other side's theory, be it neoclassical economics, marxism, or whatever? And we may quickly return to the shouting match.

Another possibility is that each side, guided by its own theory, presents a comprehensive argument with clearly delineated facts, assumptions, and logical inferences. Then, a real exchange is possible. This transcends dogma and a clash of rationality too succinct to evaluate. It reaches a plane of real debate over testable claims and inferences. Of course, to get this far requires hard work that is only worth undertaking when the hypothesis in question matters a lot, but at least we know which procedures augur the possibility of progress and which don't. Agreeing on that, progress ought to be possible, whether deciding our view of JFK, or markets, or working with mainstream media, or Leninist organization, or electoral politics, or fund-raising, etc.