
Turtles all the way sideways...Gödel and Turing point the way.

A response to a friend on Facebook sparked an interesting discussion of recent results showing that the decidability of particle transmission paths at the quantum scale is not assured, similar to Turing's halting result in computer science and Gödel's incompleteness theorems in logic. This article was the original source of the discussion. Below I respond to comments made by my friend.
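For readers unfamiliar with Turing's result: no general procedure can decide whether an arbitrary program halts. The best one can do is *semi-decide* it by running the program for a bounded number of steps. A minimal sketch of that limitation, with hypothetical helper names of my own invention:

```python
# Toy illustration of semi-decidability (all names here are hypothetical):
# we can CONFIRM that a program halts by running it, but a "no" answer
# within a step budget proves nothing -- it may halt later, or never.

def halts_within(f, arg, max_steps):
    """Run f(arg) as a step-bounded process; f must be written as a
    generator that yields once per 'step' of its computation."""
    gen = f(arg)
    for _ in range(max_steps):
        try:
            next(gen)
        except StopIteration:
            return True   # observed halting: a definitive "yes"
    return False          # inconclusive, NOT a definitive "no"

def countdown(n):
    # Halts after n steps.
    while n > 0:
        n -= 1
        yield

def loop_forever(_):
    # Never halts.
    while True:
        yield
```

`halts_within(countdown, 3, 100)` returns `True`, while `halts_within(loop_forever, None, 100)` returns `False` — and no finite budget can upgrade that `False` into a proof of non-halting, which is the asymmetry Turing's theorem makes precise.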

"Making harder the conundrum is that Godel's Incompleteness and Turing's halting only apply to countable sets, not continua."

-- Precisely, but the riddle is that a countably infinite set is included in that definition...and it's possible to have infinite subsets of such sets! So discretization (of anything) seems to be fundamental to continuity...this truth of mathematics, which stands apart from our present understanding of reality, may point us in the direction of what reality is really about.

If the pattern continues to apply, it may provide a way to test the validity of multiverse theories. As you may know, there are several interpretations of the results of quantum mechanics and general relativity that seem to indicate the flowering of either event histories or the multiplicity of entire Universes...but the question is: are these histories or Universes real? Which is the truth? Mathematically, the application of the sum over histories in Feynman's formulation of QED enabled the solving of previously intractable problems. The solution involved the idea of wave function collapse (the sum over) of an *infinite* set of possible histories for a given event...each with a variable probability of occurrence.
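The flavor of the sum over histories can be shown in a drastically simplified toy: each path contributes a complex amplitude exp(i·S), and the event's probability is the squared magnitude of the *sum* of amplitudes, not the sum of individual probabilities. The real path integral involves a continuum of paths and a normalization; the action values below are made up purely for illustration.

```python
import cmath

# Toy "sum over histories": each discretized path contributes a complex
# amplitude exp(i * S), where S is that path's action (made-up values
# here). Probability = |sum of amplitudes|^2 -- interference falls out
# of the addition of complex numbers, which is the heart of the idea.

def amplitude(action):
    return cmath.exp(1j * action)

def probability(actions):
    total = sum(amplitude(S) for S in actions)
    return abs(total) ** 2

# Two paths with equal phase interfere constructively:
p_constructive = probability([0.0, 0.0])        # |1 + 1|^2 = 4
# Two paths half a cycle apart cancel (destructive interference):
p_destructive = probability([0.0, cmath.pi])    # |1 + (-1)|^2 = 0
```

Note that classically two equally likely paths would just give twice the probability of one; the complex sum is what allows histories to reinforce or annihilate each other.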

Some interpreted wave function collapse as meaning the spontaneous creation of new realities to evolve the state of the system; others see the multiverse as a coeventful set of Universes (I can't say "temporal," since the measurement of time may differ between Universes, so I use "event" to encapsulate the relative difference in time measurement that would attend different Universes...according to GR, having a time dimension at all is not even necessary) that may or may not exist *now* but did at some time in the past or will at some time in the future.

I attended a Physics symposium in 2010, where I was fortunate to meet and hear speak David Gross, Michio Kaku and Alan Guth. During a spirited discussion, Gross and Guth battled over whether the idea of a multiverse made sense. Guth imagined there could be a multiverse of Universes and posited possible travel between them...Gross called it nonsense, saying that the mathematical existence of multiple Universes does not guarantee coeventful transit between them (which I agree with).

I am reminded of how we now understand energy in the modern sense, which differs from how it was described classically, when one studied 19th-century energy based on Newtonian physics. Then, we were told that "energy can't be created or destroyed" in the famous first law of thermodynamics; it turns out quantum mechanics allows it to be created and destroyed so long as, over time, those events even out. *There is a sum over the histories* of not just particle evolutions but particle existence...if that's the case, it means a) all of this truly can be the result of the spontaneous eruption of something from nothing, and b) in order for balance to be restored, this "something" will and must go back to a "nothing" state. This, though, sounds a lot like what was supposed earlier about a coeventful multiverse. What if it is enough that the (infinite) ensemble of Universes goes through all possibilities for each contained particle event *eventually*, rather than all of them occurring coeventfully? It would solve the multiverse problem by applying Feynman's idea of the sum over histories to the entire multiverse itself. It's actually beautiful...fractal at its core...and universally absolute...very binary, like the divisions between the integers upon which both Gödel's incompleteness and Turing's halting problem are based...coincidence?? I don't think so.

""What if" recursion (iow fractal nature) is so Total, it's wholeclothe with dimensionality?"

I considered this, but it would seem from what we know thus far that reality is not "turtles all the way down": as we look deeper into the fabric we see discretization, not continuity. We go down to the atom and see it begin in the energy transitions of electrons in their "orbits"...allowed and forbidden, with no middle ground. We see the emergence of macroscopically absent dimensions of discretized "spin" (intrinsic angular momentum)...we go down further still and see discretization in the family of quarks that make up the particles of the atom. Deeper still, and we reach the limit of the Planck length...below which space itself is a roiling soup of undulations that create and eliminate particles. For the recursion idea to hold *within our universe* there would have to be a continuous ability to probe ever deeper, yet all our physics...theories falsified or unfalsifiable (string theories)...points to a bottom. A possible solution, though, lies in whether additional universes *pinch off* dimensionally from parent Universes such that all their dimensions are orthogonal to all of ours...this would make them independent and maintain the fractal symmetry, but instead of going "down" it goes sideways, as it were. This idea is one I am partial to (and one that received media attention last year when a physicist proposed it)...and given that GR predicts objects that are forever hidden from the Universe that created them (black holes...namely the singularity within), it just may be in the "sideways" set of dimensions that new Universes (and their physics and ability to create black holes) exist...and thus maintain "turtles all the way sideways." The continuity is preserved through the fractal recursion...the discretization is preserved through the independence of the new Universes birthed...riddle solved.


