
Belief and reluctance

April 12th 2010 01:06
There are many times when you have to gather up all the available information and take your best shot, make your best guess. There might be constraints of time or resources, or there might be someone with a gun to your head. But for whatever reason, you've got to choose, and you've got to choose right now.

Well, this pressure to decide marks an important difference between the ways that law and philosophy are (often) practised.

Law is driven to decide -- the court proceedings have to come to a resolution -- and the longer they drag on, the more they cost, the greater the potential injustice, the bigger the backlog of other matters...

Whereas in philosophy, fortunately and unfortunately, you're much freer to indulge in Hamlet-style vacillation and inaction.

***

There's a story at the start of one of the chapters in Stephen Toulmin's The Uses of Argument about a boy who was reluctant to express any firm view. For instance, when asked whether his mother was upstairs, even if he'd seen her a minute ago, he would reply, "Very likely she is".

(See the philosophy joke about black sheep.)

Perhaps he didn't want to be pinned down by his answer, didn't want to be held accountable for being wrong.

***

I think there are all sorts of reasons that I find myself in the same position as that boy.

I too use a lot of hedge words to express a lack of complete confidence -- "probably", "presumably", "perhaps", "it may be the case"...

The reasons include:

1. Multiple personality disorder. I think everyone has it to some extent. Everyone has many characters, many voices, within themselves, and everyone has an enormous capacity for contradiction. If you have good reasons for believing both A and not-A, sometimes you wind up believing both.

The human brain is biological matter; it receives stimuli and responds; it has a weblike structure; it builds from multiple directions simultaneously; it doesn't behave like a computer; it's usefully paralleled with a group organism, like a bee colony or an ant colony. If one were to examine all of anyone's behaviour, one would find plenty of inconsistencies from the point of view of the ideal rational agent.

2. An awareness of how often I have been wrong in the past -- once bitten, twice shy.

3. The experience of having a firm view, then meeting other viewpoints and discovering that they are as reasonable as mine -- an awareness that there are usually a number of equally reasonable opinions that can be held on any topic.

4. A desire, before committing myself to a view, to explore the topic with a degree of care; an unwillingness to speak prematurely. (Black sheep and mother upstairs stuff.)

5. An impulse to suspend judgment about things for which I know the evidence isn't conclusive. In ancient Greek scepticism this went under the name of "epochē" -- the suspension of judgment that follows from "aporia", the state of being at an impasse.

Robert Nozick once said that at the time he wrote Anarchy, State, and Utopia he felt it was important to have a position on everything, from baseball to politics to art -- but later in life he changed his mind and became less opinionated.

If the evidence isn't decisive, and you're under no pressure to decide, then perhaps the most honest and truth-respecting path is not to.

6. An impulse to speak only what's meaningful and truthful. To paraphrase Wittgenstein: "If you can't say anything worth saying, then don't open your mouth."

You sometimes see the same thing amongst judges when you engage them in conversation. If they want to speak the whole truth about a matter but they're limited by context, they might respond to a question not with erudite arguments but with a simple "yes" or "no", and they won't go into further discussion unless pressed.

The simple answer is sometimes the best contextual way to do justice to the question, and to not muddy the waters or confuse people.

7. An awareness that there are often a variety of grounds for anyone believing anything, so I'm wary of dismissing beliefs out of hand.

Someone's belief in a God, or their belief that there isn't a God, usually isn't the consequence of just one reason, and therefore can't be knocked down by just one argument -- it likely follows from a whole set of reasons that interact in complex ways.



Comments
4 Comments

Comment by wtp

April 20th 2010 23:51
I've become quite interested in the subject of philosophy lately and have been looking for discussions of real-world philosophical issues. In my casual pursuit, I have been quite frustrated with many of the discussions that I come across, for the very reasons you state. It seems to me (and perhaps it's just my narrow experience with the subject matter) that this problem is widespread amongst those who consider themselves "philosophers". It is refreshing to see someone who is at least willing to acknowledge the problem and struggle with it. The reasons that you list here are thought-provoking. I would think more "philosophers" would be interested in them, since they have a significant impact on how a philosophical subject is approached.

Comment by Nonymous

April 21st 2010 00:59
Hey wtp,

I'm intrigued by your comment, and there's a lot that I wish I could say in reply, but I'm going to restrict myself to making two points that are more about the real-world application of philosophy than about philosophers' reluctance to assert opinions. (To be honest, I don't know what percentage of philosophers are indecisive; most philosophers share a willingness to engage in discussion and to play the game of giving reasons, but some philosophers are fiery and intolerant, whilst others are airy-fairy and flaky; it just depends.)

***

Firstly, philosophy has historically spawned many children. Empirical science, economics, psychology, and sociology were all originally considered part of philosophy.

Basically, these children are what you get when a group of people says to itself: "Okay, we've got this set of concepts, conceptual relations, and methods -- and they're productive. They solve problems, they generate satisfactory explanations, they make good predictions, etc. And we've also got this well-defined subject matter. So we're not going to worry about questioning the basic assumptions from here on in -- we're just going to apply what we've got."

In light of this, though there are 101 different definitions of philosophy, one that often appeals to me comes from John Searle. In lectures, he's fond of saying something like: "When we don't have a clear method of deciding a question, that's philosophy. When we do have a clear method, it breaks off and becomes science."

So, what comes under the title of "philosophy", and especially philosophy as practised in universities, is arguably by nature speculative and indecisive. It's pre-science.

***

Secondly... Law wasn't always learnt at university, you know. From my vague recollection, though law was taught at universities as far back as the Middle Ages, the law degree as a prerequisite for practising law only really got underway around the start of the 20th century -- and its teaching was controversial. Lawyers complained that law is a craft, that it should be learned through apprenticeship, and that it could not adequately be taught in the classroom.

Well, I think there's a lot of truth in this. If you study law at university, you deal with fairly abstract things -- the rule of law, the relation between law and society, how laws should be changed, etc. You study the decisions of the highest courts and the development of branches of law. And then you graduate and go on to practise it, and it's a different kettle of fish. You spend all your time on the mechanical processing of paperwork; you seldom deal with constitutional challenges or High Court decisions; the know-how that serves you best is things like the procedures of local courts, how to speak to clients and other lawyers, how to fill out templates as fast as possible...

The general point is that a lot of what you learn at university, and a lot of what professors do there, in many departments, lacks direct real-world application. This is obviously true of the humanities -- scholars will spend years arguing over translations of ancient texts, or reinterpreting Romantic poets, etc. But it is also true of the sciences -- a lifetime might be devoted to investigating obscure properties of numbers, or cataloguing insect species, or formulating alternative models of quantum physics -- all very abstract stuff that might never really be used.

The arguments that could be produced to justify all this might include: (1) that advancing knowledge is in itself a worthwhile pursuit, regardless of obvious application; (2) that all knowledge has the potential to be useful somewhere down the track; and (3) that all knowledge has ripple effects -- topic X might be abstract and obscure, but your views on X will influence your views on Y, which will ultimately cause you to do Z.

You've expressed concern that philosophy lacks real-world application. I think this is a justified concern. The most obviously practical parts of philosophy are ethics and political philosophy. But it's also worth saying that this concern applies to universities in general. Studying philosophy at university, and reading academic philosophy, is probably as far as you can get from taking a TAFE course in metal-working.

Comment by wtp

April 21st 2010 16:26
Thanks for your reply. I seem to be running into a higher percentage of the intolerant, airy-fairy, and flaky kind. Fiery doesn't bother me so much, as at least it's an indication that the person cares deeply about the subject.

While I understand the reference regarding Searle's point, I'd say I only agree about 60%. And while I appreciate the search for knowledge for knowledge's sake, at the end of the day a philosophy must be put into practice, even if just theoretically. For example, let's look at the subject of one of your very interesting and thought-provoking posts that I perused last night, concerning torture. As a society we can (we must, actually) weigh the costs and benefits (moral, practical, political, and otherwise) of what certain policies will mean for the kind of society we wish to live in. And one might also argue about whether that society will continue to exist if we act improperly, either way. Yet at some point, some person or persons are put in such a decision-making position with many conflicting philosophical questions. They will be forced to choose one set of principles over the other(s). That person or persons will need to be acting on the basis of some kind of philosophy. The problem as presented to them will likely never present itself in that context again, so science (as in "scientific method" science) is kind of irrelevant in this real-world scenario. Now this is of course an extreme, but I see this conflict in more mundane situations every day. As you surely do in law -- otherwise the Supremes could form themselves into a singing group or something practical... I jest, of course...

And I can relate to the mundane aspects of the real-world application of law relative to the university perspective. It's also true in engineering and probably most other disciplines. It's been over a quarter century since I studied differential equations, matrix algebra, and such, without needing to use any of it in a real-world sense. Of course, it underlies the technology we are both using right now. And then just this week I've had a need to understand Fourier series and transforms. Go figure...

Comment by Jonathan Speke Laudly

May 4th 2010 21:05
Hi, Jonathan Speke Laudly here,
Science, from my point of view, is simply a more focused inquiry into causes and results -- a faculty that humans have always had. Humans have apparently always made chance discoveries and improved and honed them; penicillin and the vulcanization of rubber are examples. It is likely that many of the basics of civilization -- agriculture, plant and animal breeding, metallurgy, the bow and arrow, medicines, as well as gunpowder, the compass, leather tanning, the firing of clay, and countless others -- were brought forth by astute humans who observed a phenomenon and thought of an application for it, improving methods and procedures over time, often over thousands of years, with additional discoveries deliberately sought.

The rules for breeding new wheat strains, formed, honed, and passed on over hundreds and thousands of years, are just a simpler version of the rules we find in today's biology. The science of plant breeding is thousands of years old.

Today's science is the result of new discoveries and understanding about inquiry and the methods of inquiry itself -- a new focus upon inquiry itself. I ascribe this new attitude to the sceptical faculty, polished by hundreds of years of scholastic philosophy, applied to a new interest in the operation of the natural world following the Renaissance-induced decline of the medieval Catholic Church's power over thought, the revival of all things classical (especially Pythagorean thought and the example of Greek secular natural theory), and the subsequent drive for new and original inquiry free from the dead dross of past assumptions.

Da Vinci's original researches into anatomy, Kepler's laws (based upon Tycho Brahe's observations), the Copernican revelation, and Galileo's discoveries are early triumphs of this new spirit. These were observation and rule-making as in the past, but focused and brought to new heights.

But also introduced at that time was a focus upon inquiry itself in general. The rules, methods, and attitudes concerning physical inquiry that make for the most productive results, as propounded by Bacon and others at that time, were entirely new, in my view, in the history of man, and mark a new maturity of thought, a new pinnacle of rule-making. These include the discarding of long-received but unexamined preconceptions, the testing of hypotheses, truly sceptical questioning, and the reproduction of experimental results.

It is the new vision of the universe as a vast cipher, whose rules and particulars can be revealed by focused methods and principles common to all effective inquiry -- whatever the subject -- that is the real revolution wrought by modern science, and the real difference between the rule-making of the past and the present.

The pre-European California Indians chewed willow bark to stop headaches. They did not know that the bark contained what we now call aspirin, but the bark still stopped their headaches. If they had had the resources and attitudes of modern science, they could have isolated and purified the aspirin, and perhaps looked for compounds with similar effects.

In the manipulation of nature, modern man has no advantage over the ancients but a more focused and swifter method, and a scheme that relates all things physical within a few broad frameworks. The basic curiosity, insight, and invention of humans is nothing new.
