(This post essentially agrees with Probability Is In The Mind. It differs in emphasis and style.)

[Epistemic status: clarifying language in the hope of helping people who were confused like me, not stating anything about what is or isn't the case in the world.]

Question (from a college-level math class): An urn contains 3 red balls and 6 blue balls. Two balls are drawn without replacement and the second is found to be red. What is the probability that the first ball was also red?

Question (in politics): What is the probability that Cory Booker is the 2020 Democratic nominee?

I think that the phrase "what is the probability" is misleading in a subtle way. It can lead you to believe that you are asking a question about the world, when you are really asking a question about some model of the world. Sometimes it's okay to conflate those things, but other times it's important not to.

Let's start with "What is the probability that Cory Booker is the 2020 Democratic nominee?" You might ask this question in the context of a sober, objective discussion. You are trying to look hard at the world and act like a scientist. It might lead you to investigate factors such as the projected state of the economy, projected black voter turnout, etc. The investigative process feels like the scientific process by which we discover facts about the world. By analogy, it feels like the question "What is the probability that Cory Booker is the 2020 Democratic nominee?" has an answer of the same epistemic status as, say, "What is the age of planet Earth?"

However, it doesn't. Probabilities are part of the map, not the territory. In other words, they are properties of a model of the world, not properties of the world itself. They are not discovered, they are created. There is some fact of the matter as to how old the planet Earth is*, but there is no fact of the matter about the probability that Cory Booker will become the 2020 Democratic nominee. It isn't floating out there in the world, waiting to be found by an intrepid scientist.

Visualize yourself after the nominee has been declared. Suppose it is Cory Booker. What have you learned about the probability? You've learned that, in the world's past, matter was arranged in such a way as to lead to Cory Booker becoming the nominee, but that is not a fact about the probability. (Do you think your model should have said "100%"?)

"The probability of event X" resides in a model, not in the world. There is no "true probability" about any event or state of the world; the world just has some state. We create models that give us probabilities when queried, and we continuously revise the models as we learn more about the world.

Although the math question above also uses the language "What is the probability?", we're in a rather different situation. We're given a map, a model (the implicit model of how drawing idealized balls from idealized urns works), and are asked a question about it. The question is actually about how to use the map, not about the territory at all. You'd be considered annoying if you asked a territory-motivated question like "But what if a ball falls through a hole in the bottom of the urn?" even though this might be a real concern if we were modeling a physical urn. (I guess urn bottoms are pretty sturdy, but you get the picture.)
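Within that shared map, the question has exactly one correct answer, and we can even compute it mechanically by enumerating the model's ordered draws. A minimal sketch in Python (the variable names are mine, not part of the problem as stated):

```python
from fractions import Fraction
from itertools import permutations

# The shared model: 3 red balls and 6 blue balls, two drawn
# without replacement, every ordered pair of balls equally likely.
balls = ["R"] * 3 + ["B"] * 6

# All ordered draws of two distinct balls (by index).
draws = [(balls[i], balls[j]) for i, j in permutations(range(9), 2)]

# Condition on the observation: the second ball is red.
second_red = [d for d in draws if d[1] == "R"]
both_red = [d for d in second_red if d[0] == "R"]

# P(first red | second red)
p = Fraction(len(both_red), len(second_red))
print(p)  # 1/4
```

Anyone working from the same model must arrive at the same number; there is nothing left to disagree about once the map is fixed.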

So if you say $\frac{1}{4}$ and I say $\frac{1}{3}$, one of us is wrong about this map we're supposed to be sharing, and we should be able to agree. When discussing the probability of Cory Booker becoming the nominee, there is no map we are assumed to share; we are allowed to build our own, and hence we might have differences. At best, if our maps apply to similar situations, we can get data over time about whose map is better or worse. At worst, we might not be able to agree about what "similar situations" are, or we might not have enough shots to gather data.
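One standard way to "get data over time about whose map is better" is a proper scoring rule such as the Brier score: square the gap between each forecast and the eventual 0/1 outcome, and average. A sketch with made-up forecasts and outcomes (the numbers here are purely illustrative, not real election data):

```python
# Hypothetical yes/no outcomes (1 = event happened) and two models'
# forecast probabilities for those events. All numbers are invented.
events = [1, 0, 1, 1, 0]
model_a = [0.9, 0.2, 0.7, 0.8, 0.1]
model_b = [0.6, 0.5, 0.5, 0.6, 0.4]

def brier(forecasts, outcomes):
    # Mean squared error between forecast probability and outcome;
    # lower scores indicate a better-calibrated map.
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)

print(round(brier(model_a, events), 3))  # 0.038
print(round(brier(model_b, events), 3))  # 0.196
```

Note that this only compares maps in aggregate, over many forecasts; a single resolved event (like one nomination) tells you almost nothing about whose map was better.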

I used to be very confused by this, because I went from thinking about probabilities only in math classes to thinking about probabilities in the real world. This led to issues like me not knowing what it meant when the weather forecast said "80% chance of precipitation tomorrow", and suspecting no one else knew either. Now I think that there is a consistent way to think of such things, and that way is to interpret the weather forecast as saying "My (rather successful) model assigns an 80% probability to there being precipitation tomorrow."

Prescriptions:

• When in a context where there might be multiple models in question, or in a context where it pays to be careful, say "What probability does X assign to this event?" instead of "What is the probability of this event?", where X is some model (e.g. you, me, Nate Silver). Probabilities are always relative to some model. When you're pretty sure everyone has the same model in mind (e.g. some authoritative weather forecaster) and you're speaking casually, it's okay to say "What is the probability that it will rain tomorrow?".

• Math classes teaching probability should discuss this distinction more loudly.