
Illusion of continuity: consciousness vs quantum electrodynamics

You know, one thing I've been thinking about in the last few days....

It is the solution that Feynman came up with for describing quantum electrodynamics, resolving a whole host of problems which up to that time were intractable.

The integration of renormalization into the theory, and the first-class representation of histories of evolution for particle dynamics that spanned the present, the future, and the past in the wave function description.
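To make the "sum over histories" idea concrete as a pure calculational tool, here is a minimal Python sketch. All the constants, the lattice size, and the sampling range are illustrative choices, not physical values: it estimates a free-particle amplitude between two fixed endpoints by averaging the phase exp(iS/ħ) over randomly sampled intermediate positions, which is the discretized spirit of the path integral.

```python
import cmath
import random

# Illustrative (non-physical) constants and lattice choices.
HBAR = 1.0
MASS = 1.0
STEPS = 4          # number of time slices between the fixed endpoints
DT = 0.1
N_PATHS = 2000

def action(path):
    """Discretized free-particle action: S = sum of (m/2) * v^2 * dt."""
    s = 0.0
    for a, b in zip(path, path[1:]):
        v = (b - a) / DT
        s += 0.5 * MASS * v * v * DT
    return s

def amplitude(x0, x1, rng):
    """Monte Carlo estimate of the (unnormalized) propagator:
    the average of exp(i*S/hbar) over randomly sampled paths."""
    total = 0 + 0j
    for _ in range(N_PATHS):
        middle = [rng.uniform(-2.0, 2.0) for _ in range(STEPS - 1)]
        path = [x0] + middle + [x1]
        total += cmath.exp(1j * action(path) / HBAR)
    return total / N_PATHS

rng = random.Random(42)
amp = amplitude(0.0, 0.5, rng)
print(abs(amp))
```

The point of the sketch is that every sampled "history" contributes a phase, and the phases interfere; no individual path needs to be "real" for the sum to be a useful description.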

I asserted a few days ago that, beyond enabling the mathematical resolution, there is no proof that any real particles can travel backward in time... some mathematical tools are just that, and despite being useful should never be expected to be "realized".

A good example of this from mathematics and engineering is the complex plane and the astonishing landscape of possibility it opened up. What is "i" ??
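A quick sketch of what "i" buys you, using nothing beyond Python's built-in complex type: multiplying by i is just a 90-degree rotation in the plane, and Euler's formula ties the exponential to that rotation. The tool is indispensable even though no measured quantity is ever "i" of anything.

```python
import cmath

# Multiplying by i rotates a point 90 degrees in the complex plane:
# (1, 0) becomes (0, 1).
z = 1 + 0j
rotated = z * 1j
print(rotated)        # 1j

# Euler's formula links exponentials to rotation: e^(i*pi) = -1.
euler = cmath.exp(1j * cmath.pi)
print(euler.real)     # very close to -1
```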

In the same way that a particle's history can be seen as a present position that emerged from an infinite set of possible past histories... consciousness seems to emerge from an infinite set of probable past states of the constitutive cells of the brain... the neurons and glial cells that store memory in some way.

So I see a parallel here... consciousness seems like a real thing, but it is really an emergent concept that we have misused as a tool for describing how state changes between memory configurations evolve over time... just as quantum electrodynamics is a tool that describes how particle states change over "time", where looking at it as a continuum helps make problems tractable, but in reality it is not a continuum at all.

Looking at consciousness as a continuum (we sort of can't help it) was the default state with consciousness, but it was something that had to be actively pushed into the mathematics (by Feynman) when it came to particle histories!

As for time... it's just a ratio of state changes between matter-bearing particles, mediated by energy exchange (this is captured in the second Heisenberg uncertainty relation, the energy-time relation). I suspect that some important phenomena are misunderstood because tools are being mistaken for aspects of the phenomena under consideration, and the ability to describe those phenomena accurately is thus being lost in the process.
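For reference, the relation being gestured at here is the energy-time uncertainty relation, which ties how sharply an energy exchange is defined to the interval over which the associated state change occurs:

```latex
\Delta E \, \Delta t \;\ge\; \frac{\hbar}{2}
```

Note that, unlike the position-momentum relation, Δt here is not an operator observable; it is usually read as a characteristic timescale of state change, which fits the "time as a ratio of state changes" framing above.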

To me consciousness was always obviously emergent and NOT continuous... the last 5 years of my research have pretty much convinced me that it is emergent (hence my certainty that it will be reproduced on non-biological strata fairly soon), especially given the results coming out of neuroscience on how the brain is self-connected... I find the similarity to quantum electrodynamics, with its parallel confusion over what a wave function is and whether or not particle histories can truly continuously move from "the future" to "the past", very intriguing in that context.


