Sunday, January 18, 2009

Philosophy vs. Science

Okay so I guess I'll get things started here on the Williamson book. First, let me say to Joshua: nice pick. The first chapter has sufficiently whetted my philosophical appetite. Second, let me say that I haven't ever thought very hard about issues of philosophical methodology, so you should certainly take all of my posts here in a spirit of open inquiry. With that, I'll give a bit of autobiography that was inspired by chapter one.

When I teach intro to philosophy, I usually start with a thought-experiment. (Typically, I have them imagine that Harry and Hermione switch brains in an attempt to foil Lord Voldemort's plan to kill Harry. I then ask them to imagine that they are Voldemort's closest advisor and they have to tell him which brain-body pair to kill, given that he wants to kill (and only kill) Harry.) I then usually give a spiel about thought-experiments in general and their value for philosophy. My explanation of their value is an analogy usually along the following lines.

If you want to know how a microwave succeeds in heating up whatever you put in it, you would come up with an hypothesis and then put it to the test. Hypothesis: It works by agitating water molecules. Test: Put oil in the microwave and see whether it gets hot. Depending on the results, your hypothesis is either confirmed or disconfirmed by the data. Something similar happens in philosophy. If you want to know why humans are more morally valuable than ants, you would come up with an hypothesis and then put it to the test. Hypothesis: They are more morally valuable because they have human DNA. Test: Imagine that we encountered a race of non-human aliens that were otherwise exactly like us. Would they be less morally valuable just because they didn't share our DNA? Again, the hypothesis is confirmed or disconfirmed by the data.

Drawing this parallel seems to steer the students away from thinking of philosophy merely as the analysis of our concepts and toward thinking of it as a way of finding out things about the world itself. At the same time, however, the data of thought-experiments seem to be some sort of mental stuff -- intuitions, considered judgments, or something like that. Moreover, I sometimes describe the point of the Harry Potter thought-experiment in terms of the pressure that it puts on our ordinary concept of personhood. So am I really still stuck inside the conceptual turn that Williamson argues we have moved beyond?

In light of the above autobiography, I'd be curious to hear how you all describe the goal of philosophy to your intro students, or whether you even think it is useful to do so. And, more generally, how far off is my above description, even if it is pedagogically useful?

8 Comments:

Blogger Christian said...

Hi Neal,

Which book is this, by the way? I wouldn't mind reading along.

In the philosophical intuition case it's not clear to me that intuitive data confirms the hypothesis. It can disconfirm it. But, supposing philosophical truths are necessary truths, and supposing necessary truths have probability = 1, then they cannot be confirmed.
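
To put the worry in explicitly Bayesian terms (assuming the standard account of confirmation, on which evidence E confirms a hypothesis H just in case P(H | E) > P(H)): if H is necessary and so P(H) = 1, then since no probability exceeds 1,

P(H | E) <= 1 = P(H),

so no evidence, intuitive or otherwise, can raise H's probability.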

I'm sure a story can be told about how such claims, philosophical claims, could receive a probability of less than 1. But, well, the proof is in the pudding.

I don't think relying on intuition and thought-experiments entails that one, when doing so, would count as "stuck inside the conceptual turn". The intuitions reveal the nature of properties, not concepts. The same goes for thought-experiments. In conversation Williamson was keen on this point.

I don't describe "the" goal of philosophy. I say it has many goals that include: learning how to think clearly, trying to provide reasonable answers to important questions, learning how to develop a reasonable world view, taking concern in questions "bigger than oneself" and things like that.

Your description seems to be close to this. In terms of a basic methodology I teach, I suggest stating principles and forming inconsistent sets of them, where these principles seem reasonable before being brought together. One project involves coming to grips with rejecting what would otherwise be a reasonable principle, learning how to give it up and retain the intuition that led to it. Another project involves seeing the clash between particular cases and principles, and doing the same. Another project involves deriving principles as the best explanation of intuitions about cases, and then testing those principles in the way you suggest.

5:06 PM  
Blogger Joshua said...

Neal,

When you say that "drawing this parallel [between the microwave example and the ant example] seems to steer the students away from thinking of philosophy merely as the analysis of our concepts and toward thinking of it as a way of finding out things about the world itself" you seem to be setting up a kind of false dichotomy. The suggestion is that we are either learning about the world or we are merely analyzing our concepts. But, what seems to me to be going on is that we are learning about the world by analyzing our concepts. It is easiest to see this if we focus on linguistic analysis rather than conceptual analysis. We might learn, by linguistic analysis, that the sentence “Every person is F” expresses a truth in English. But, since we know that “every person is F” expresses a truth in English iff every person is F, we can learn something about the world by learning something about how our words work. I think that this is the kind of conceptual/linguistic view that Williamson plans to address later in the book.

It definitely seems that at least one thing that we might do when we do philosophy is figure out how our words or concepts work by conceptual analysis and thereby discover something about the world. I often make this point explicitly to my students by telling them that one way that we can learn about the world is by learning something about our concepts and then applying what we have learned. I take it, though, that one might wonder whether the only appropriate, perspicuous way to do philosophy is by using conceptual analysis to discover something about the world. Probably, it is this view that Williamson hopes to address in the future.

I don’t know whether you are trapped in the conceptual turn or not. I don’t know partly because I am uncertain about what the conceptual turn is. But also, I think it might matter whether you think that the only appropriate way to do philosophy is by first analyzing concepts and then applying those concepts. If you think this is correct, then I think you might be caught in the conceptual turn. But, again, I am just not sure.

7:28 AM  
Blogger Joshua said...

Christian,

we are reading The Philosophy of Philosophy.

Also, it seems to me that the distinction between subjective and objective probability might be helpful here. Although necessary truths might have an objective probability = 1, they do not have a subjective probability = 1. So, our thought experiments can confirm necessary truths if we are thinking about subjective probability.

7:35 AM  
Blogger Christian said...

Although necessary truths might have an objective probability = 1, they do not have a subjective probability = 1.

Joshua, my worry for this is that we should assign subjective probabilities consistent with objective probabilities. Truth is the aim of belief, and this applies to beliefs about probabilities as it does to beliefs that involve no reference to probabilities. So, for truths that, if true, are necessary, we should believe their probabilities are either 1 or 0. And so I don't see how, if this is correct, we could get confirming evidence for such truths. And if philosophical claims are necessary, if true, I don't see how intuitions can confirm philosophical claims. They could disconfirm claims by showing their probability is 0, but that's different.

12:04 AM  
Blogger Joshua said...

Hi Christian,

Although I think that truth might be one aim of belief, I do not believe that we should always assign our subjective probabilities to match the objective probabilities. Here are two arguments against that position.

First, suppose I assign a subjective probability of 1 to a proposition P. It follows that I am certain of P. But, there are some necessary truths that I believe but am not certain of. For example, I believe that Goldbach's conjecture is true. But, I am not certain that it is true. However, if it is true, then it is necessarily true. So, if it is true, then the objective probability = 1 but my subjective probability (given my lack of certainty) should be less than 1.

Second, suppose a mathematician gives me two inconsistent mathematical statements and says that one of them is true but won't tell me which. I don't know enough math to say which is true and which is false. It seems like a principle of indifference should make me assign a subjective probability of 1/2 to each proposition. But, whichever proposition is true is necessarily true and whichever is false is necessarily false. So, whichever proposition is true has an objective probability = 1 and whichever is false has an objective probability = 0. Again, objective and subjective probabilities come apart.
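
To write the second argument out a little more explicitly (just one way of formalizing it, with Cr for my subjective probability and Ch for the objective probability): let M1 and M2 be the two inconsistent statements, exactly one of which is true. Since I have no evidence favoring either, indifference suggests

Cr(M1) = Cr(M2) = 1/2.

But either Ch(M1) = 1 and Ch(M2) = 0, or Ch(M1) = 0 and Ch(M2) = 1. Either way, my subjective probabilities cannot both match the objective ones.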

9:02 AM  
Blogger Christian said...

Joshua,

Those are both interesting arguments and, I think, somewhat compelling. But suppose you are right. If you are, then the following should be a reasonable thing to assert:

"I know that the probability of P is not 1/2 since I know that, if P is true, it is necessarily true (and if false, necessarily so) and so its objective probability is 1 or 0. Nonetheless, I should assign P probability 1/2, that is, even though I know it's probability independent of my evidence is not 1/2.

That just does not sound like a reasonable thing to assert to me. Consider another case. For example, "I know the chance of the coin landing heads is 1/2, that is, its objective probability of landing heads is 1/2. So, given that I know this, I should believe it to be 1/2 likely that it will land heads." Knowing the chance of P being true seems to be sufficient for assigning a subjective probability to P, namely, whatever that chance is.
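
Put a bit more formally (this is, I take it, roughly the chance-credence link Lewis calls the Principal Principle, though nothing here hangs on the label): writing Cr for my subjective probability and Ch for objective chance, for any proposition P and value x,

Cr(P | Ch(P) = x) = x.

The coin case is just the instance where x = 1/2.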

So, are your arguments stronger than this? I don't know the answer. One could suggest that here, assigning intervals is appropriate. We should assign probability [0,1] to Goldbach's Conjecture, not 1/2. The same goes for the mathematical claim in your second example.

I do want to resist the claim that assigning subjective probability 1 to P entails that one is certain of P. Subjective probabilities do not measure the strength of a state (a belief) as certainty does; they are features of the contents of beliefs. So, for example, if I believe it 1/2 likely that P on my evidence, then my evidence entails that it is 1/2 likely that P. This could be so even if, for whatever reason, I am certain that P. Another way to make the point: Suppose there is a large lottery and I purchase a ticket. I could believe it 1/1000000000 likely that I will win, without 1/1000000000 measuring the strength of my belief. Strengths of belief might not be that fine-grained in the first place.

But what are intervals? And, supposing there are such things, how can one update on them? I don't know. I think there is a real puzzle here.

12:45 PM  
Blogger Joshua said...

Christian,

I have been thinking about these arguments. I guess I am inclined to think that we can alleviate the seeming strangeness of the following comment by clearly separating objective and subjective probability:

"I know that the probability of P is not 1/2 since I know that, if P is true, it is necessarily true (and if false, necessarily so) and so its objective probability is 1 or 0. Nonetheless, I should assign P probability 1/2, that is, even though I know it's probability independent of my evidence is not 1/2.

I think this does not sound strange if we rephrase the statement like this:

"I know that the objective probability of P is not 1/2 since I know that, if P is true, it is necessarily true (and if false, necessarily so) and so its objective probability is 1 or 0. Nonetheless, I should assign P probability 1/2, because I have no idea whether it is true or not."

I understand your worry about the word 'certain' but I think I can make my point without it.

Suppose I assign a subjective probability of 1 to a proposition P. It follows that I cannot be in a better epistemic position with respect to believing P. But, there are some necessary truths that I believe but I could be in a better epistemic position with respect to my belief. For example, I believe that Goldbach's conjecture is true. But, I could be in a better epistemic position with respect to this belief. I could, for example, derive Goldbach's Conjecture from a set of mathematical axioms that I have very strong reason to believe. However, if Goldbach's Conjecture is true, then it is necessarily true. So, if it is true, then the objective probability = 1 but my subjective probability (given that I could be in a better epistemic position with respect to my belief) should be less than 1.

As I see it there are a few options available to us. We could reject the claim that necessary truths have an objective probability = 1. I don't think I like that option. We could say that our subjective probability need not match the objective probability for a person to remain rational. I think I favor this position. Perhaps what we should do is assign a subjective probability that is as close as possible to what we think the objective probability might be, given our evidence.

11:51 AM  
Blogger Christian said...

Hi Joshua,

So I'm very sympathetic with what you're saying (and I'm unclear about what I actually think). Moreover, your rephrasing of my quote does sound like a reasonable thing to say. But, nonetheless, I want to distinguish between saying:

1. For some necessary proposition P, I should assign 1/2 to P since I have no idea whether P is true.

from

2. For some necessary proposition P, I should assign 1/2 to P since I have no idea what the probability of P is.

With respect to necessary propositions, and in the kind of case you have presented, the first is right but the second is not. With respect to a necessary proposition P, we do have an idea what the probability of P is, i.e., it is either 1 or 0.

What I'm saying is that a claim involving 2 still sounds wrong to me, though your claim, reinterpreted as 1, sounds fine to me. I just don't think (I'm not convinced anyway) that 1 is an accurate reinterpretation of 2. It ignores information that we have when asking what probability we should assign to a necessary proposition. We don't know whether it is true, I grant that, but we do know that its probability is either 1 or 0.

In your Goldbach's case I agree, we should not assign a probability of 1 to it. We could be in a better epistemic situation with respect to it, and this implies that one's subjective probability should not be 1. But does it imply that it should be strictly less than 1? I'm not so sure. Maybe it simply implies that our subjective credence should be distributed equally over two propositions, namely, that its probability is 1 and that its probability is 0.

The idea, then, is that if we do this we can go on to deny that it should receive a probability of 1/2. We can do so while capturing your idea, which seems reasonable to me, that we could be in a better epistemic situation with respect to it. It would be better if we could distribute our credence over only one proposition, namely, that the probability that P is true is 1.

Let me make a different kind of argument. Subjective probabilities should match our beliefs about the objective chances of contingent events. Take a coin. I believe the chance that it will land heads when tossed is 1/2, and I think that my belief about its objective chance of so landing makes this reasonable. If this is so (maybe it's not), shouldn't we want the same reasoning to apply to all chances, rather than treating necessities as a special case for which this reasoning fails? The resulting theory would be more general than the kind of theory that would treat such chances differently. I find that to be a strong reason to avoid the kind of view you are suggesting; do you?

12:29 AM  
