
Autism, Astrocytes and Attention...a possible driving hypothesis.

"I take over a thousand pictures of a person's face when I look at them. That's why we have a hard time looking at people" ~ Carly Fleischmann

I actually like that self-report because it suggests a very important hypothesis about how the brain is connected in an autistic person differently from a "normal" person.

Namely, I'd propose that the reason there is sensory overload is that the normal mechanism for encapsulating, and possibly shifting, attention among the different cognitive aspects under consideration is short-circuited in some way...such that there is a bias toward external stimuli.

Our cognitive dynamics involve a balance, an equilibrium, between external sensation and internal sensation. I assert (in many of my blog posts from the last few years) that consciousness is nothing more than the time-variant dance of this interplay between what the world is presenting to us across those "extrasensory" inputs and what our "insensory" inputs are demanding.

Extrasensory inputs are your standard senses: visual, auditory, gustatory, somatosensory, olfactory. Your insensory inputs are the autonomic drivers that demand your attention...they boil down basically to food and sex.

I hypothesize that everything you do, everything you think, is in some way guided by the continuous attempt to balance the extrasensory drives against the insensory ones.

That said, there needs to be a filtering mechanism in place to ensure that there is no overstimulation (in either direction), which would delay convergence to "action" once sensation and processing have occurred. In fact, switching attention is precisely the function necessary to keep the dynamic cognitive drive moving forward.

Last year, research for the first time identified attention switching as a key aspect of the function of astrocytes (glial cells)...the "other" major cell type in the brain besides neurons (and in fact the most numerous). Up until this study their full role was not understood; it now appears their purpose is to serve as attentional switches: they store a short-term representation of a currently sensed signal in a given dimension of experience and resonate it back at a given rate to enable it to persist LONG ENOUGH to be processed...but this raises the question of what happens if they persist the signal TOO LONG?

It would seem they'd be re-echoing cues for continued attention when there is no need (beyond the processing time frame). This could very well be the source of the sensory overload that Carly reports in her experience, and that seems to be a hallmark of autism generally.

In electronic and computer design, the master beat of all computational functions is set by a device called a clock, which is stepped in various ways so that different subsystems can be invoked precisely as needed, providing the synchrony critical to the function of the overall system. It is reasonable to wonder if the brain requires a similar set of stepped timing signals, applied to different aspects of cognitive processing, so as to effect a smooth and convergent dynamism as the possessing agent interacts with the world. Doing so efficiently would be key to being able to focus on events in the world for just the right period of time necessary to optimize the achievement of all goals.
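The over-persistence idea can be sketched as a toy simulation: a sensory trace is refreshed while a stimulus is present, then decays exponentially once it ends, and attention is "held" as long as the trace stays above a threshold. Everything here is an illustrative assumption, not a measured model — the time constants, threshold, and the exponential-decay form simply stand in for how quickly an astrocyte-held echo might fade.

```python
import math

def attention_trace(tau, stim_end=10, t_max=100, dt=1.0, threshold=0.2):
    """Toy model of a held sensory signal (all parameters hypothetical).

    The trace is pinned at 1.0 while the stimulus is on (t < stim_end),
    then decays as exp(-dt/tau) per step after stimulus offset.
    Returns how many steps after offset the trace still exceeds the
    attention threshold, i.e. how long attention stays "captured".
    """
    trace = 0.0
    steps_held_after_offset = 0
    t = 0.0
    while t < t_max:
        if t < stim_end:
            trace = 1.0                     # stimulus actively refreshes the echo
        else:
            trace *= math.exp(-dt / tau)    # echo fades after the stimulus ends
            if trace > threshold:
                steps_held_after_offset += 1
        t += dt
    return steps_held_after_offset

# Hypothetical "typical" vs over-persistent decay constants:
typical = attention_trace(tau=3.0)
prolonged = attention_trace(tau=30.0)
print(typical, prolonged)  # prints "4 48"
```

With a tenfold-longer decay constant, attention remains captured by a stimulus roughly an order of magnitude longer than processing requires — the kind of lingering echo that, per the hypothesis above, would manifest as sensory overload.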

Also, this could be at work in a differential way across the brain, with some sensations being more sensitive to attentional persistence than others (say vision over sound, or smell over touch). There is no reason to presume the pathology is homogeneous across astrocyte tissue, though that is also possible.


