Welcome back to "Doggerel," where I ramble on about words and phrases that are misused, abused, or just plain meaningless.
In everyday circumstances, I know what intuition is. I probably couldn't put it in an elegant, Webster's-style definition, but I understand the concept. If I pick up a platformer videogame and a controller I've never seen before, chances are I'll quickly figure out which buttons 'attack' and 'jump' are without even glancing at the controller. That's because most videogame and system designers follow a number of fairly consistent conventions. They want their customers to understand their product, so they map basic actions the same way as the popular products before them. That's intuitive controls. Anything similarly predictable is subject to our intuition: our pattern recognition ability.
The problem is that in science, where we tend to spend a lot of time messing with scales, situations, and cases that we don't deal with every day (or at least not knowingly), intuition often fails. Quantum mechanics, for example, is weird to the layman who doesn't routinely deal with things that can't seem to settle on being waves or particles. That's why our largely Newtonian intuition fails to grasp it. We have to run lots and lots of tests to figure out what patterns and behaviors really emerge there, rather than intuit those patterns from our macro-scale, much-slower-than-light everyday understanding. In short, we do science to understand it, instead of presuming we understand it.
To the woo, however, 'intuition' is a magic way of knowing. They never explain where they get the idea that it's accurate, since efforts to measure the things they believe in via intuition tend to come up short. If that particular question comes up, they will often try to wall the area off from science with meaningless labels and projection.
Normal intuition involves some observations to perform pattern recognition on, and even then it's quite unreliable a lot of the time. Woo intuition, however, doesn't need any basis beyond things like 'this idea makes me feel better' or 'this idea is simple and immediately understandable', even when the resulting hypothesis doesn't predict anything. In short, woo intuition is pretty much unchecked wishful thinking.
What's worse is that, unlike scientists, who know that weird circumstances can throw our predictions out the window, woos take it for granted that their intuitions about complex, ethereal, exotic things will hold true before anyone has even demonstrated that those things exist.