
Objections to a non-physical conclusion for consciousness.

In a paper recently published by Mark Muhlestein in the journal Cognitive Computation, the following conclusion is drawn:

"In this computational framework, the distinction between a computation and the recording of a computation can be blurred arbitrarily, yet the physical implementation of the computation itself is unchanged. From this, we conclude that a purely computational account of consciousness is unsatisfactory."

I agree with this conclusion, but for a few reasons not directly addressed in the paper, which I elaborate on below.

Of the objections the paper considers, the ones that weigh most heavily against this conclusion (roughly, that either computation is not all there is, or consciousness is not what we think it is) are objections 3 and 4.

The clarion call ringing in my head as I read was the condition that a random number generator was being used. Conscious states are intimately dynamic systems that do not respond deterministically, as the thought experiment asserts. Unlike the Woody emulation or the projection, the real Woody is subject to the vagaries of internal machinations that emerge neither from random desires nor from external ones. These, I assert, are the autonomic drivers that internally modulate the cognitive dynamism of the neocortex from the deeper layers of cognition, in subtle and non-repeatable ways.

Thus the failing of this paper, a devastating one, lies in the assumption that determinism in emerged experiences is even possible. Repeated stimuli may produce "similar" reported conscious states, but they would never yield identical experience states, because real cognitive states are ultimately driven by physical sensory modulators that run continuously and are distributed in more places than just the computational core of the mind. How, for example, would the Woody emulation 'simulate' the time-dependent process of slowly becoming hungry, and how that varies cognitive state? Or of slowly feeling hot? Or of itching at the fibers of the wool sweater one is wearing while trying to smell that rose? In a simulation, such inputs are assumed to be in a fixed (and undetermined) input state, but I posit that this is precisely why any subsequent question regarding derived experience cannot be said to be deterministic, and cannot be said to be identical given "identical inputs". Not only is there feedback between the processing elements in the emulation and other processing elements (which, naively, in this thought experiment do the "computation"), but there is also feedback between *internal sensory states* modulated by external sensory states that are continuously varying.

The *illusion*, or non-physical appearance, of consciousness comes from this fundamental disconnect: in this thought experiment, as in Bishop's and Maudlin's conceptions, consciousness is characterized as purely a process of computation once inputs are gathered.

I assert that inputs are continuously gathered and modulate the "computation"; if you don't simulate that in your simulation, what you emerge will indeed be only a shadow of consciousness.
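The point above can be made concrete with a toy sketch. Everything here, the class name, the "hunger" and "warmth" variables, and their drift rates, is an illustrative assumption of mine, not a model from the paper: an agent whose response to a stimulus is modulated by internal states that drift continuously on their own, so that presenting the *same* external input twice does not produce the same output.

```python
import random

class EmbodiedAgent:
    """Toy illustration: cognition modulated by continuously drifting
    internal states. All names and dynamics are hypothetical assumptions,
    chosen only to make the continuous-modulation argument concrete."""

    def __init__(self, seed=None):
        self.rng = random.Random(seed)
        # Internal "autonomic" variables that drift on their own
        # timescale, independent of any external input.
        self.hunger = 0.0
        self.warmth = 0.0

    def _drift(self):
        # The internal modulators advance whether or not a stimulus
        # arrives; their exact increments are not repeatable.
        self.hunger += 0.05 + 0.01 * self.rng.random()
        self.warmth += 0.02 + 0.01 * self.rng.random()

    def respond(self, stimulus):
        # The "computation" on the external stimulus is shaped by the
        # current internal state, so identical stimuli need not produce
        # identical responses.
        self._drift()
        return stimulus + self.hunger - self.warmth

agent = EmbodiedAgent(seed=42)
first = agent.respond(1.0)
second = agent.respond(1.0)  # same external input, drifted internal state
print(first != second)       # → True
```

A simulation that freezes `hunger` and `warmth` at fixed values would indeed be deterministic, which is exactly the simplification I am arguing the thought experiment makes.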

Moreover, near the end, Muhlestein states:

"Evolution has not prepared us for interactions with an entity which exhibits conscious behavior but which is not in fact conscious; we would find it tempting, perhaps overwhelmingly so, to grant full rights and responsibilities to an entity which can pass every imaginable test of sentience. But if Woody says, “Come on in! The water’s fine!” and suggests that you submit to a destructive brain scan in order to upload to a computational substrate, you would be wise to do so only if the nature of consciousness is clearly understood, and if you have included as part of your upload package any hardware (or wetware) necessary for conscious experience."

I agree that caution is warranted here, for more fundamental reasons. I assert that copying the minutiae of one's computational substrate is insufficient to copy one's consciousness substrate, which includes elements distributed throughout the physical body. Again, the cognitive system emerges from intimate sensory and physical contact, and from modulation by both external and internal states, which are dynamic. Any computations are continuously converged to, and are necessarily unique in approach, even if the input signals are identical. The convergence to an indicated state of experience will always be slightly different; consciousness can still be intimately tied to those internal sensory, external sensory, and cognitive processing interactions, and still be physical. Those who are all gung ho about being "uploaded" need to realize that what will strictly be done (if it is possible at all) is a copy of a cognitive brain state (absent the inputs of physical modulation!), and that copy will have a hell of an *awakening* once it boots in the artificial substrate. Furthermore, what boots up won't be the person copied at all, but a simulacrum: the original person's consciousness would end with the destructive copy process, and the now-pathological copy would go on in terror in the copied substrate. Even if the entire physical feedback loops of the original individual could also be copied, it would still be a copy and NOT the original agent.

I can imagine only one scenario in which the original consciousness can slide to a new non-biological substrate: the original body is replaced, piece by piece, by non-biological elements across its entire physical construction, such that a new consciousness is not created but rather the existing one is moved, in place, to a new substrate. This replacement must include not only the cognitive elements but the proprioceptive and somatosensory ones as well.

Now, all that said, do I agree with the initial conclusion? Yes and no. Yes, I agree that cognitive computational processes (with processing units resident only in the brain, fed specific inputs) are not sufficient to replicate consciousness. But no, I do not accept the all-too-ready fallback that because emerged consciousness is not entirely resident in the brain (but rather in the brain-body physical unit), it must therefore come from somewhere external to the physical body entirely, as some would like to conclude, or that it involves spooky quantum interactions among the cognitive elements that cannot be simulated in an emulation.

