It's long been fashionable for people in the Western world to claim they are "open-minded," and so this is claimed even by people who are not that way at all. (No one says with pride, "Yeah, dude, I'm closed-minded!") Yet closed-mindedness seems to be as common as it ever was, with people refusing to entertain any number of ideas they don't agree with; and it seems to be as much a problem today as ever. But what does it mean to be "open-minded," anyway? The website of Princeton University defines the word open-minded as "ready to entertain new ideas," and this seems to me appropriate. How does one know that an idea is false, if one has not heard it? (Or in the words of the Princeton definition, "entertain[ed]" it?) How does one know that one will like a food, if one hasn't tried it? And how does one know that an idea is wrong, if one hasn't heard it out?
Conversely, what does it mean to be "close-minded"? The same Princeton website defines "close-minded" or "closed-minded" as "not ready to receive new ideas," but this seems to me an inadequate definition, because its opposite (being "open-minded") is defined as readiness to "entertain" new ideas, not necessarily to believe them. In the words of Aristotle, "It is the mark of an educated mind to be able to entertain a thought without accepting it." (Source for quote: University of Kentucky website) Thus, Princeton's word "receive" would seem to require a distinction between entertaining and believing, and the lack of that distinction (if it is indeed lacking here) makes the definition too vague to be properly usable. In conjunction with their opposite definition of "open-minded" (which specifies entertaining new ideas), it would seem best to assume that close-mindedness relates to entertaining new ideas as well, and not necessarily to accepting them - you can reject particular ideas without being closed-minded. (And if you don't believe me, then ask yourself - is your rejection of my definition also the result of "closed-mindedness," or can one reject an idea - my definition, for example - without being closed-minded? For the record, I'm not offended if you disagree with my definition; but it is ironic that everyone who defines close-mindedness in this way is rejecting all definitions that disagree with it, and is thus being close-minded by their own definition. You can do this if you want, but I recommend against it.)
The word that would better seem to capture what they have in mind is "skeptical," which is defined by the same Princeton website as "marked by or given to doubt," or "denying or questioning the tenets of especially a religion." It's not popular being a doubter or a skeptic - believe me, I know - but it is, I believe, the price one pays for being rational. How rational would I be, for example, if I believed every claim that anyone ever gave me - anything from aliens controlling our government through mind-control technology, to the idea that no politician has ever lied to anyone? I'm proud to say that I reject these ideas, and I don't mind being called a "doubter" or a "skeptic" by those who disagree - that is the price one pays for contact with reality, both in the realm of the obviously unlikely, and with more ordinary misstatements. Thus, I'm actually rather proud to be called a "skeptic" - skepticism is my default reaction, in fact, until people come up with evidence for their point of view. "Extraordinary claims require extraordinary evidence" (in the words of Carl Sagan), and they shouldn't be accepted without it.
[Image: Carl Sagan, astronomer]
No, the opposite of open-mindedness is not skepticism, but close-mindedness - and the opposite of skepticism is not open-mindedness, but gullibility. The same website of Princeton University defines "gullible" as "naive and easily deceived or tricked," or "easily tricked because of being too trusting," and it's rightly viewed as a negative description. People will claim to be open-minded until the cows come home (sometimes correctly, sometimes incorrectly); but no one proudly says that they are "gullible." ("Yeah, dude, I'm gullible! I accept anything that anyone ever says, regardless of whether it makes any sense! By the way, aliens landed in Washington DC yesterday, and they control the Congress through their mind-control ray" - blah, blah, blah - you know the rest.) As with close-mindedness, few will ever admit to being gullible; but as with close-mindedness, gullibility is all too common; and it may even be unknowingly held up as something of an ideal, by some who confuse it with the more positive trait of open-mindedness. Clearly, it would seem, they are somewhat different; and the avoidance of gullibility may be as much an imperative for being rational as the avoidance of close-mindedness itself.
So to recap, being open-minded is being willing to entertain new ideas (and being closed-minded is not being willing to entertain those ideas); while being skeptical - by default, at least - is refusing to believe various ideas in the absence of proper evidence, and being gullible is the willingness to believe virtually anything, regardless of whether it makes sense or is supported by evidence. The one spectrum relates to entertaining new ideas, and the other to believing them; and true rationality requires both open-mindedness and default skepticism - the willingness to entertain new ideas with an open mind (and hear them out before judging their merits), and to require solid evidence before believing them - disbelieving the many ideas out there that lack such evidence. This allows us to distinguish between a solid and scientific claim, for example, and a claim of the aliens controlling Washington with their mind-control ray.
So with these distinctions made, when is it appropriate to be skeptical? If we were to say one should be skeptical of anything and everything without exception, then we'd have to be skeptical of this claim itself - or in other words, to be skeptical of the claim that we should always be skeptical of everything, thus contradicting ourselves. Conversely, if we were to say one should believe everything, then we'd have to believe the opposite of this claim as well - or in other words, to believe that we should not believe everything (thus again contradicting ourselves). Some way of distinguishing these things is necessary, because either extreme on this spectrum involves self-contradiction.
The answer to this question is actually somewhat complicated, and fields from mathematics and statistics to philosophy and the sciences all try to answer it. Thus, I will not try to answer it in detail here. Rather, I will focus on one of the more common misconceptions about it, which is that we should trust anything coming from someone with a Ph.D. (or other respected status). I will focus here on the blind acceptance of Ph.D.'s, because Ph.D.'s seem to be the source with the most intellectual respectability in the public mind; although I should acknowledge that people place much blind faith in actors, singers, wealthy people, and any number of other celebrities as well. Taking Ph.D.'s as the representative example, there are any number of problems with this blind acceptance; one of them being that even people with real Ph.D.'s have preconceptions, ideologies, and agendas; and they have been known (at times) to be wrong. It was once fashionable, for example, for scientists to say that Caucasians had intelligence far greater than that of African-Americans, and to say that there was evidence that supported this. (I link to this informal page at Wikipedia documenting many cases of racist pseudoscience from previous eras.) The claims, it is true, came from people with real Ph.D.'s at real universities; but that didn't stop them from being dead wrong, not to mention politically motivated (and racist to boot); and they have been discredited by later generations of scientists. A Ph.D. - or other respected status - only goes so far, it seems, in making someone's conclusions worth relying on and trusting in.
Even in cases where there is no pseudoscience or political motivation, scientific theories have been proven wrong and replaced with better theories when new evidence came along, or when new ways of thinking about the old evidence came along, as has happened time and time again in the history of science. It continues to happen all the time; no one would get Nobel Prizes or Ph.D.'s to begin with if they were just echoing the same things that had been said by previous generations. Nobel Prizes and fancy Ph.D. degrees are given for adding to the sum total of human knowledge, which often means overthrowing existing ways of seeing the world and bringing new ways of thinking in their place. Thus, virtually no knowledge can be considered entirely certain, even when it comes from the smartest of people, with the most reputable of Ph.D.'s. (Or acting skills, or singing skills, or loads of money.) And that's without mentioning all the fake Ph.D.'s out there - many Ph.D.'s are just worthless pieces of paper obtained from diploma-mill websites, given for nothing more than a few of the customer's dollars, under (usually false) pretenses.
Moreover, many who claim to have Ph.D.'s don't even have that piece of paper; anyone can say they have a Ph.D. (or that they're quoting people who do) without having any evidence or authority of the kind they claim. I could claim right now that I have a Ph.D., for example, and that everything I say must be right because of it; but that wouldn't actually make me right - I'm just a guy with a bachelor's degree, I must admit, and I don't have a fancy Ph.D. to back me up. (I still think you should believe me, it is true; but not because I have a fancy title, or degree, or other form of authority - just because what I say makes sense, on its own merits. You can have logic and evidence without the support of any of these things, and you can lack logic and evidence even with the support of all of them. Thus, arguments should be judged on their own merits, and not by what celebrity or popular authority might endorse them.) This is the best basis for sound decisions.