Cheiromancer
A lot of the time the word "consciousness" is used to mean "awareness." To be conscious of something is to be aware of it. In this sense consciousness is distinguished from being deeply asleep or in a coma. But some people who suffer damage to their visual cortex can be aware of stimuli without being conscious of them: they can "guess" the orientation (vertical vs. horizontal) of a line with great accuracy while claiming they can't see it at all. This is called blindsight (different from the DnD term!) and is usually cited as evidence that this kind of awareness is not what is meant by consciousness.
Philosophers who talk about consciousness are usually referring to "phenomenal consciousness." This consists of having qualitative states; it is "like something" to have those states. Something's having a qualitative state does not necessarily mean that it is easy, or even possible, for us to know what it is like to have that state. For example, it has been argued that the question "what is it like to be a bat?" cannot be answered. An entity can also have consciousness without being able to verbally report the content of those states; no one thinks, for example, that someone with aphasia lacks consciousness.
The same two examples of qualitative states are used over and over again in the literature: the experience of color (specifically the color red) and the experience of pain. If an entity can experience color and/or pain, then it has phenomenal consciousness.
As for free will (which I also discuss in my thesis), there does not seem to be any relevant physical difference between artificial computers and human brains. On the neural level, the brain is as deterministic as a computer chip. The main difference lies in the complexity of the wiring.
Now, it may be that there are indeterminacies in the operation of neurons: perhaps quantum fluctuations are amplified (in a Geiger-counter-like manner) to make real differences in the output of the neurons. But this kind of indeterminism does not seem to capture the nature of genuine freedom. And even if it did, I don't see why a mechanical equivalent (with a real Geiger counter, say) couldn't function in a similar manner.
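To put that last point a bit more concretely, here is a toy sketch (in Python, purely illustrative; the names and numbers are made up) of a threshold unit whose output can be nudged by an external random source. The point is just that the computation has the same structure whether the nudge comes from amplified quantum noise inside a neuron or from a literal Geiger counter wired to a circuit; randomness of that sort can be bolted onto either kind of machine.

import random

def noisy_unit(inputs, weights, threshold, noise_source=None):
    # Toy threshold unit: fires iff the weighted sum of its inputs
    # crosses the threshold, optionally nudged by an external noise source.
    total = sum(w * x for w, x in zip(weights, inputs))
    if noise_source is not None:
        total += noise_source()  # stand-in for amplified quantum noise or a Geiger counter
    return 1 if total >= threshold else 0

# Same computation, two "settings": the unit doesn't care where the
# random nudge comes from, only that one is (or isn't) supplied.
geiger_like = lambda: random.choice([-0.5, 0.5])
print(noisy_unit([1, 0, 1], [0.4, 0.9, 0.3], 1.0))               # always 0
print(noisy_unit([1, 0, 1], [0.4, 0.9, 0.3], 1.0, geiger_like))  # sometimes 0, sometimes 1

Whether that kind of injected randomness would amount to freedom is, of course, exactly what I'm doubting; the sketch only shows that a machine could have it too.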