- Susan does not know whether it will rain during her walk.
- Susan's rational credence that it will rain during her walk is 0.4.
- The nuisance of carrying the umbrella on her walk will cost Susan 10 utils.
- The nuisance of being rained upon without an umbrella is -30 utils. (It is no nuisance at all to be rained on if she has her umbrella.)
It's pretty reasonable to suppose that, in this case, Susan ought to take the umbrella; the expected value calculation is straightforward. The umbrella costs her 10 utils and gives her a 0.4 chance of avoiding a 30-util loss: (30 * 0.4) - 10 = +2. If Susan takes the umbrella in a way sensitive to the positive expected value of doing so, there's a pretty strong intuition to the effect that she's done everything right.
As you know, cases like this one are sometimes thought to be problematic for the knowledge norm of practical reasoning. Alex Jackson's careful paper does a nice job separating different knowledge-norm-like commitments, but he identifies this kind of case as at least a prima facie challenge to the following determination thesis:
What one knows determines what it is rational for one to do (possibly in concert with one’s desires).
I like this determination thesis. (In fact, I like something stronger -- I think we should be after something more like metaphysical grounding, not just determination.) So I need a story about the case of Susan. The Hawthorne & Stanley story is that Susan, if she is acting appropriately, is acting on knowledge about epistemic probabilities. She says to herself: there is a 0.4 chance that it will rain. This is what she treats as her reason, and it is something she knows.
I'm pretty uneasy about this line, although I've never been able to put my finger on exactly what I don't like about it. There is something odd, it seems to me, about probabilistic contents playing these kinds of roles. I know that's not an objection; I'm just recording my uneasiness. Here, anyway, is an objection: suppose she doesn't know the relevant probabilistic claim. Suppose that, for all she knows, the chance that it will rain is 0.3.
Remember, this is evidential probability we're talking about; the difference between the chance's being 0.3 and its being 0.4 can't be made by meteorological facts wholly outside Susan's ken. Still, it's not at all implausible that Susan might not know, with that level of precision, whether her evidence probabilifies rain to degree 0.3 or 0.4. Indeed, my own evidence right now, it seems to me, puts the chances of its raining on me as I walk to work tomorrow right in that ballpark; but I have no idea whether it is closer to 0.3 or to 0.4. I hope you agree this is not a very implausible possible situation.
Notice also that if the probability really is only 0.3, then, given the stipulations above, Susan's expected value for taking the umbrella is negative. ((30 * 0.3) - 10 = -1) So under the current stipulations, for all Susan knows, taking the umbrella might have negative expected value. She doesn't know that she should take it. She does, we may allow, know that there is some chance of rain, but this doesn't look like a good enough reason to perform this action.
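The two computations above can be put in a few lines of Python, as a sketch. The function name and utility constants are my labels, not the post's; the stipulations (umbrella costs 10 utils, being rained on without it costs 30) come from the case as set up above.

```python
# Susan's umbrella case: expected value of taking the umbrella,
# relative to leaving it at home.
UMBRELLA_COST = 10   # nuisance of carrying the umbrella, in utils
RAIN_PENALTY = 30    # nuisance of being rained on without it, in utils

def ev_take_umbrella(p_rain: float) -> float:
    """Expected utility gain from taking the umbrella,
    given credence p_rain that it will rain."""
    return p_rain * RAIN_PENALTY - UMBRELLA_COST

print(ev_take_umbrella(0.4))  # 2.0: positive, so take it
print(ev_take_umbrella(0.3))  # -1.0: negative, so leave it

# Break-even credence: p * 30 - 10 = 0, i.e. p = 1/3,
# which is why 0.3 and 0.4 fall on opposite sides.
print(UMBRELLA_COST / RAIN_PENALTY)
```

Note that the break-even probability, 1/3, sits squarely between 0.3 and 0.4, which is what makes Susan's introspective imprecision matter here.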
You might think about trying to gild the bitter pill at this stage, suggesting that if she doesn't know whether it's better to take the umbrella, then she really does violate the action norm in taking it, though she does so in an excusable way. This seems to be Hawthorne & Stanley's line. But I don't think we should take it. For it's consistent with our case here that Susan is exceptionally well-attuned to the evidence. That is, if the chance really were only 0.3, then she wouldn't take the umbrella. This is of course totally consistent with her failure of introspective discrimination.
I think Hawthorne & Stanley were right to look to knowledge with contents other than that it will rain, but wrong to focus on probabilistic ones, in part for the reason just offered: there's no particular reason to expect them to be known. (And also in part because of that feeling I haven't yet managed to articulate, that these things aren't the right kinds of things to be invoking in one's reasoning.) While there's no reason, it seems to me, to think that Susan must know the probabilistic content, there is good reason to think she must have some other relevant knowledge around. If, for example, you think that E=K, then the evidential probability must be probability conditional on some knowledge. Let that knowledge be the reason for action. What is the relevant evidence? I don't know; it depends on how the case is filled out. Maybe something a forecaster said? Maybe the look of the clouds? Whatever it is, I say we understand Susan as acting on the basis of that evidence.
Can you get a case like this involving no such evidence? Alex Jackson tries to give us one. But this blog post is getting long and I'm getting hungry, so maybe I'll leave discussion of that for another day.