Tuesday, February 12, 2008

Doggerel #134: "Consciousness"

Welcome back to "Doggerel," where I ramble on about words and phrases that are misused, abused, or just plain meaningless.

Woos love to muse about consciousness. Given that we've all got it without a blueprint (yet), it's kind of hard for any of us not to on occasion. I suppose one of the easiest ways to define consciousness is the ability to think and to think about your thinking. It takes quite a bit of sophistication for a critter to be able to do that.

Woos, however, seem to want to keep the complexity down. Rather than face the awesome task of cataloging countless neural connections and the proportions of chemicals juggled among those connections, they'd prefer to make consciousness its own ineffable thing instead of a complicated process. The problem that arises from this is that it involves special pleading: they have to introduce a new entity without any real sort of justification. Unless they can demonstrate something with consciousness that lacks sophisticated information-handling 'hardware', I don't see any reason to think consciousness is separate from the brain.

One of the other reasons woos like to posit a separate consciousness is that it gives them comfort to think of something of themselves that can continue to exist. Of course, the universe isn't in the business of pleasing humans. Good luck finding the soul, but you'll have to do all the scientific footwork. Epistemological hedonism isn't a shortcut I'll accept.

Another reason they like to posit a separate consciousness: If you invent something that circumvents the natural world, it can give you the excuse to believe in other stuff that's considered "weird" like psychic powers and such. Of course, most of the latter ends up being much more mundane when examined. Consciousness has the advantage of not being as well understood as sleight of hand, cold reading, and the placebo effect. Yet.


Neil Cicierega said...

Best way to describe consciousness is as whatever mechanism deems this particular brain as the one "you" see out of. It implies an arbitrary choice among the billions of sentient creatures in the universe, and arbitrary isn't scientific. Being as there's no physical difference between "you" and "everyone else", but from your own personal perspective there is clearly a difference, at this point it seems like something best left to the philosophers to ponder, because science is designed to ignore the personal perspective.

Tom Foss said...

Being as there's no physical difference between "you" and "everyone else",

Except all the different arrangements of neural connections in people's brains, the different ways they're forged and pruned according to experience and usage, the different hormonal and neurochemical concentrations, and whatnot. Other than that, absolutely no relevant differences.

Anonymous said...

Ummm... Even if people were identical in the sense Neil is trying to express, they still wouldn't necessarily be the same person. In exactly the same way that two identical objects are not the same object. (The term "identical" is awkward here. In software engineering, we distinguish between identity - meaning "the same thing" - and equality - usually meaning "having the same properties".)

Even if you could exactly duplicate a person right through development, you'd end up with two identical people, not one person with two bodies.

There's a great thought experiment in one of Iain M Banks' Culture novels (Look To Windward, I think) to demonstrate that consciousness arises from a physical substrate and that the precise nature of that substrate is irrelevant, which involves exactly that - creating an electronic version of a brain, with "artificial" neurons which exactly duplicate the behaviour of "real" neurons. If you can exactly duplicate the developmental process, and feed your "artificial" brain the same sensory input as the "real" brain it's modelled on, then you will end up with something that behaves exactly like its "real" counterpart. The question is then posed as to how you could possibly conclude that one is "conscious" and the other isn't.

Anonymous said...

The biggest issue I have with the idea of saying "the nature of the brain can't account for it, so there has to be SOME soul thing" is my question of how adding an extra "thing" explains it. If you say that brain connections and the running process of the brain can't account for consciousness, then how does this extra thing account for it any better? It's just a step removed. Let's assume we live in some alternate world where there ARE extracorporeal "soul" entities. What's to stop someone there from just saying "there's no way the interactions of spirit particles and reatsu can account for the soul" and adding a third layer, and so on and so forth forever?