
Complexity in the Universe, conservation of energy and object orientation...

The following article was originally a post written at RD.net, but I thought it should be copied here for non-members to read. Enjoy!

Original link here

I seem to have fanned the flames of a controversy with one of the posters in that thread, rainbow. I just realized that Steve Zara and others took up the standard and continued to present information in support of the presented ideas. (I got really busy after posting the original and neglected to come back to respond directly to rainbow.)

Out of curiosity I just went over rainbow's other contributions to the site, and it turns out that his entire set of posts is confined to that one thread. It seems rainbow is a pseudonym for a hit-and-run poster, someone who created an account simply to comment on that one topic. Now there is nothing wrong with this in itself; I've started new accounts to provide an alternate view myself, but since that thread rainbow has been MIA from the site entirely. Combined with the points he made, this gives me the idea that rainbow was a shill. He demonstrated a wide knowledge of some of the issues but lacked the critical reasoning to refrain from drawing certain conclusions from false assumptions (for example, he assumed that the early biomolecules had no method of motility, which is patently false). Interesting, but no matter; my post follows in full below, and you can click on the link provided above to read both rainbow's posts before it and the responses of others to his objections after it for a good bit of entertainment. ;)



Regarding this discussion on the origins of life: I think rainbow is assuming that the chemistry of the early Earth was as inhomogeneous as it is today. This is a faulty assumption; according to the fossil record, the planet's chemistry was amazingly homogeneous for a long, long time. We know that life in the form of stromatolites was thriving along the shores of the forming continents (which were still being accreted through the interaction of volcanic eruptions of the Earth's crust with the early seas) as early as 3.2 billion years ago. That is an astonishingly long time ago, and as wild as that is, the very Earth itself had formed barely a billion and a half years earlier.



I think the more likely process that led to abiogenesis was that the much more homogeneous chemistry allowed for a massive experimental space in which the early molecules could combine and recombine to form the first protein chains and replicators. Once the first cannibalizing replicators emerged, they would have rapidly spread throughout the then much more homogeneous environment, consuming useful submolecules. You have to look at the early Earth as almost a single global ecosystem. Today the Earth's biodiversity owes much of its existence to the great variety of biomes under which natural selection can proceed. On the early Earth the diversity was not in phenotypes selected through natural selection but in molecular types, governed by nothing more than standard chemical affinities. Life today is highly segregated compared to the progenitors of life, which, being no different from complex chemical molecules, could combine and recombine freely under the laws of chemistry. From this perspective there was a much higher likelihood of chance molecular interactions producing new, more complicated molecules. If we see these early molecules as "life", the natural selection was induced by the laws of chemistry, providing a gigantic potential variability in the results compared to present-day biodiversity, which may "look" far more diverse but in reality is much less so. (If you think about it, we all, from bacteria to bull, are nothing more than genetic iterations of one another, highly conservative in energy due to our common use of structures that emerged in the billions of years that preceded the first visible signs of biodiversity during the Cambrian, 542 mya.)
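As a loose illustration of the point, here is a toy sketch (purely illustrative, not real chemistry; all names are my own inventions) in which "molecules" drawn from a well-mixed pool randomly join, and we count how many distinct compound species appear. Even with a handful of primitives, the space of combinations explored grows very quickly:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Random;
import java.util.Set;

// A toy model of a well-mixed "soup": any molecule can meet and join
// with any other, so the number of distinct compounds explored grows
// rapidly. This illustrates only the size of the combinatorial search
// space; it is not a model of actual prebiotic chemistry.
public class SoupDemo {
    public static void main(String[] args) {
        Random rng = new Random(42);
        List<String> soup = new ArrayList<>(List.of("A", "B", "C", "D"));
        Set<String> speciesSeen = new HashSet<>(soup);

        for (int step = 1; step <= 10_000; step++) {
            // pick two random molecules and combine them into a new one
            String x = soup.get(rng.nextInt(soup.size()));
            String y = soup.get(rng.nextInt(soup.size()));
            String compound = x + y;
            if (compound.length() <= 8) { // cap size to keep the toy bounded
                soup.add(compound);
                speciesSeen.add(compound);
            }
            if (step % 2000 == 0) {
                System.out.println("after " + step + " encounters: "
                        + speciesSeen.size() + " distinct species");
            }
        }
    }
}
```

The only "selection" here is the chance of an encounter, which is exactly the situation a homogeneous global soup would provide.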



If you look at a cell, you see that the internal systems are more or less autonomous; they only require a particular environment and raw materials under which to carry out their chemical actions. Golgi bodies, mitochondria, ribosomes: these simple engines perform specific tasks, and do so without aim or direction so long as they have the necessary raw materials. The usefulness of their actions only makes sense in the context of the internals of the cell. Ironically, ID proponents claim that subcomponents have no use other than that for which they were designed. I believe that for some relationships, such as that between a cell and its mitochondria, evolution itself predicts what the ID proponents conclude, but for other reasons. Mitochondria are internally autonomous because the cytoplasm in which they thrive is the only remnant of the early environments in which their progenitors formed. Those environments exist today only in cells, so we should not be surprised that we find mitochondria nowhere else. Note, though, that organisms amazingly similar to mitochondria do exist outside cells as bacteria, replete with similar DNA; that similarity is all we need to affirm the hypothesis of evolution. Just as we see symbiotic relationships between more advanced organisms today, it is very likely that the cell arose from just such beneficial symbiotic relationships between the various types of machinery that arose in the early biomes.



Different replicators, in the form of the progenitors of the organelles, came together either to enhance survivability by pooling resources or after being consumed by a parent organelle (the ancestor of the cell membrane or wall).



Tangentially, in object oriented programming we perform this task as a matter of conservation of code: composition of class instances inside parent class objects allows us to use the attributes of the consumed objects without paying the penalty of having to recode the associated methods. When I look at a cell I see a superclass with composed child objects of other class types; by composing the objects, the parent object avoids the energy expenditure of designing the consumed objects' sub-mechanisms. It simply needs to provide the environment and raw materials, or in the programming case the data, required by the sub-objects in order to have them "catalyze" or process that data into a desired output or function. A good OO programmer composes well designed sub-objects into parent objects to reduce the total code that must be written. The smaller the code, the faster the desired object performs its functions, the more such objects can be loaded into available memory, and the faster the overall application runs. In this analogy the code maps to the data in the DNA, but the possibilities for created functionality go up exponentially when the composed objects themselves have complex code behind their formation. In other words, a cell might have taken tens of billions of years to engineer a ribosome through natural selection, as the ribosome is a relatively complex object, but in the early soup the chemistry must have made such molecules exceedingly likely to form, even if we have yet to form them ourselves. We can't assume that because we see a small set of internal organelles in living things today that this is all there was. As time went forward and the molecules became more complex, energy would be conserved by interaction. Just as conservation of energy leads a ball down a slope along a particular path depending on the curvature of that slope and the energy of the ball, so too did the early molecules "seek" to conserve energy by interacting. Computer scientists can and have created simulations of the emergence of complex relationships such as these using interacting programs called cellular automata. Though the relationships that arise are far simpler than, say, that between a ribosome and a cell, they are emergent and non-deterministic: the computer scientist has no idea how the initial conditions will seed the emergence of interesting "behavior" in the automata.
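To make the composition analogy concrete, here is a minimal Java sketch (the class and method names are my own illustrative inventions, not from any real codebase): a Cell parent object composes Mitochondrion and Ribosome sub-objects and reuses their behavior by supplying only inputs, never reimplementing their internals.

```java
// A minimal, hypothetical sketch of composition: the Cell does not
// re-implement energy production or protein synthesis; it delegates
// to the sub-objects it is composed of.
class Mitochondrion {
    // Converts raw material (glucose units) into usable energy (ATP units).
    int respire(int glucose) {
        return glucose * 30; // rough ATP yield per glucose, for illustration
    }
}

class Ribosome {
    // "Catalyzes" input data (an mRNA-like string) into an output product.
    String translate(String mRNA) {
        return "protein(" + mRNA + ")";
    }
}

class Cell {
    // Composition: the parent holds instances of the child classes.
    private final Mitochondrion mitochondrion = new Mitochondrion();
    private final Ribosome ribosome = new Ribosome();

    // The Cell only supplies environment and raw materials; the composed
    // objects do the specialized work.
    String metabolizeAndBuild(int glucose, String mRNA) {
        int atp = mitochondrion.respire(glucose);
        String product = ribosome.translate(mRNA);
        return product + " using " + atp + " ATP";
    }
}

public class CompositionDemo {
    public static void main(String[] args) {
        Cell cell = new Cell();
        System.out.println(cell.metabolizeAndBuild(2, "AUGGCC"));
        // prints: protein(AUGGCC) using 60 ATP
    }
}
```

The parallel to endosymbiosis is the point: the parent gains the composed objects' capabilities for no more than the cost of providing their inputs.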



Now someone looking for design would claim the early Earth chemistry got "help" from a designer, but there is no reason to make this assumption; the chemistry, the homogeneity of the early forming conditions, the law of conservation of energy, and time are all that are required. Just as a computer scientist looking at an emergent simulated environment of interacting cellular automata does not invoke God to explain the complexity, there is no reason to suppose God for the biological analog simply because we have yet to replicate the conditions that gave rise to our life.
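For readers who have never seen a cellular automaton, here is a minimal, self-contained sketch of the classic elementary automaton known as rule 110. The update rule is a fixed table over three neighboring cells, yet the pattern it produces is famously complex (rule 110 has even been shown to be Turing complete):

```java
// A minimal elementary cellular automaton (rule 110): each cell's next
// state depends only on itself and its two neighbors, yet the global
// pattern that emerges is not predictable from inspecting the rule alone.
public class Rule110Demo {
    public static void main(String[] args) {
        int width = 64, steps = 32;
        boolean[] cells = new boolean[width];
        cells[width - 1] = true; // seed: a single live cell on the right

        for (int t = 0; t < steps; t++) {
            StringBuilder row = new StringBuilder();
            for (boolean c : cells) row.append(c ? '#' : '.');
            System.out.println(row);

            boolean[] next = new boolean[width];
            for (int i = 0; i < width; i++) {
                // neighborhood value 0-7 from left, center, right (wrapping)
                int n = (cells[(i + width - 1) % width] ? 4 : 0)
                      + (cells[i] ? 2 : 0)
                      + (cells[(i + 1) % width] ? 1 : 0);
                next[i] = ((110 >> n) & 1) == 1; // look up bit n of rule 110
            }
            cells = next;
        }
    }
}
```

The rule fits in a single byte, yet nothing about that byte tells you in advance what structures will appear; that is the sense in which the behavior is emergent.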



For those unfamiliar with OO, here are some links on object oriented programming:



http://en.wikipedia.org/wiki/Object_orientation



http://java.sun.com/docs/books/tutorial/java/concepts/

Complexity is brilliantly explained in Nobel laureate Murray Gell-Mann's book "The Quark and the Jaguar". I highly recommend it!

Get the book here
