
Rube Goldberg rules software development


I've often touched on this article's thesis in my blog; here is a recent post, for example:

http://sent2null.blogspot.com/2012/03/engineers-versus-programmers.html

I've often said that engineering is an art while programming is more of a science...unfortunately for those who are really good at programming, building big, complex software is more art than science.

The differences are ones of scale, and the biological analog serves to illustrate. A living being is an amazingly complex organism, with trillions of cells acting seemingly in harmony over the course of about 70 years for most people...these cells replicate, divide and grow without issue until the whole clockwork comes grinding to a halt under the effects of progressive degeneration. Yet the darn thing runs for 70 years; it is amazing that it works at all.

Now that we are getting into the code that defines how cells are organized, replicate and grow...we see how. It is an amazingly *over-engineered* system: instead of being built with end-to-end efficiency in mind, it was shaped another way...to be maximally efficient from one step to the next (the only way evolution could do it without a watchmaker at the works). And so the happy accidents of efficiency...many due to proximity of chemical or molecular affinity...piled up over the last few BILLION years to create the super complex systems of living beings today, some of which, like us, can run to 70 years before falling apart!

Software is much like biology in this way, save that the clockwork is in our classes, methods, objects, and threads...however, just like in biology, the big-picture view of how those many components will interact over application run time, BEFORE the application is written, is a huge mystery to most coders. They are great at defining the optimal algorithm to sort a bunch of digits or find edges between various shapes, but bad at seeing down the road a bit to understand how those low-level interactions bubble up through the entire system. We've known for decades about the various systems in the cell, but only in the last decade are we figuring out how they are instantiated (stem cells), how those cells differentiate into specific types (the polymorphism of cells), and how those differentiations are regulated across tissue types to make sure (for the most part) the wrong cell types don't appear in the wrong places (cancer). Knowing how a cell works doesn't answer any of those questions...which are locked up in the regulatory and developmental code of the DNA (much of it non-coding and formerly called "junk"!).
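To make the polymorphism analogy concrete, here is a minimal Java sketch (the class names are illustrative, not taken from AgilEntity): each subclass is simple and correct in isolation, yet nothing in any single class tells you how a whole population of them will behave at run time.

// Illustrative sketch of the "polymorphism of cells" analogy.
// Each subclass specializes the shared base behavior, much as
// differentiated cell types specialize a shared genome.
abstract class Cell {
    abstract void performFunction();
}

class NeuronCell extends Cell {
    @Override
    void performFunction() {
        System.out.println("propagating a signal");
    }
}

class SkinCell extends Cell {
    @Override
    void performFunction() {
        System.out.println("forming a barrier");
    }
}

class Tissue {
    public static void main(String[] args) {
        // Each cell is easy to verify on its own; the system-level
        // behavior only emerges from their interaction at run time,
        // the part the article argues most coders fail to plan for.
        Cell[] cells = { new NeuronCell(), new SkinCell() };
        for (Cell c : cells) {
            c.performFunction();
        }
    }
}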

The problem with software development today is that the planning of the engineering side is lacking, while the planning of the programming side is well developed. So rather than taking the time to really understand the problem scope and shape a generalized solution to it...coders just get to work patching leaks as defined by a rough understanding of what the client wants. Putting grout and brick in place until the leaks stop...the end result, of course, is not always pretty or elegant...and, like a biological system, over time and under stress it reveals itself to be an inefficient, Rube Goldberg-like process. I detailed a list of considerations to keep in mind to perform good design in the following post:

http://sent2null.blogspot.com/2008/02/considerations-during-design.html

Now, UML is something that has been pushed for some time; the only time I've used it on a project was to extract my class hierarchy from the source code into pretty diagrams I could print. That was it...the overall design of the application had already been diagrammed dozens of times with paper and pencil. The slogan of Apriority LLC, the company I founded to write AgilEntity (the project in question), is "taking the time to properly design". I saw over and over again the problems that happen when a long design stage doesn't precede writing code...you end up building things that work, but just like Rube Goldberg machines: incredibly fragile to changes that push the system outside the regime it was precisely designed for. This is a major reason why so many enterprise software projects eventually crumble under their own inefficient weight once load hits them and need to be re-architected. I thank the many horrible engineers out there writing code for all the jobs I've had where I was paid well to help fix such systems!

That said, specifications are different. A spec can be as simple as a well-detailed header comment above a class or method...it can reference the related classes it will be used with and why, and perhaps note how it could be changed in the future to enable some as-yet-undesired functionality. The key is enough supplemental information about what the code is doing to allow it to be a) understood and b) changed in relation to all other code that may call on it, without causing unseen pathology once the changes are deployed.
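For illustration, such a spec-style header comment in Java might look like the sketch below; the class, method, and referenced callers are hypothetical, not taken from AgilEntity.

/**
 * Hypothetical example of a spec-style header comment: enough detail
 * for the code to be understood and safely changed in relation to the
 * other code that calls on it.
 */
public final class TagUtil {

    private TagUtil() {
        // static utility class; not instantiable
    }

    /**
     * Normalizes a raw user-entered tag before storage.
     *
     * Used by: TagManager.addTag(...) and SearchIndexer.indexEntity(...)
     * (hypothetical callers), both of which assume the returned value
     * is trimmed and lower-case.
     *
     * Future change note: if tags ever need locale-aware folding, pass
     * an explicit Locale to toLowerCase(); callers need no changes,
     * since they depend only on the trimmed/lower-case contract above.
     */
    public static String normalizeTag(String raw) {
        if (raw == null) {
            throw new IllegalArgumentException("tag must not be null");
        }
        return raw.trim().toLowerCase();
    }
}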

I have chronicled a few examples of my development of AgilEntity that show how good *engineering* can enable massive refactoring tasks, in a short time, that in less well designed systems would simply not be attempted. Here's one:

http://sent2null.blogspot.com/2009/03/late-hour-super-class-surgery.html

When the time to refactor exceeds the time to write from scratch, it isn't worth it to refactor.

