
Action Oriented Workflow: Refined local guessing and prediction by design.

The zeitgeist of big data trends has recently come alive with articles pointing to the statistical pitfalls that big data analysis is fraught with when particular attention is not paid to the means by which statisticians prevent bias from creeping into the results of their analysis. Since many of the people anointed "data scientists" at companies are simply computer programmers with a title, not necessarily holders of statistics degrees, this is not surprising. I find it interesting because of a unique distinction between those methods and how the Action Oriented Workflow paradigm built into the AgilEntity framework I designed inherently behaves.

When I started work on the Action Oriented Workflow algorithm, the idea I was trying to mine was that sufficient history of work actions on given business objects could be used to determine predictive ways to route new work through an organization such that traditional boundaries of stagnation are overridden. These boundaries include the offices, teams, buildings, organizational units, divisions, and regions that all corporations accumulate as they grow to international scale.

It therefore was never so much about "big data" gathered relentlessly and interrogated for trends; instead, the approach isolates the best trends within small to medium groups. It is more about "right data" than "big data". The statistical pitfalls that attend cross-boundary mixing of any kind of data are naturally modeled in AOW because workflows always *start* by encapsulating the action patterns between known groups of working agents, which are always small at first. This is key: it means true signals of the action landscape are established first, before being diluted into a different optimal regime when new agents from other workflows are crossed into interaction.

It is only when these groups couple their work via the natural action of cross-team collaboration (and under social oversight) that the DNA of adjacent workflows mixes, such that a new optimal regime is explored involving the two (or more) connected groups of agents possessing those workflows. Statistical shaping to outlier interactions within these small groups thus starts at its *highest* resolution of efficiency and is diluted only as cross-workflow interactions are engaged, while the algorithm continues to learn the optimum for the new regime.

So there is no global guess performed over all possible actions of disparate sampled agents; there is instead a refined but growing set of local guesses.
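The contrast between a single global guess and a growing set of local guesses can be sketched roughly as follows. This is a minimal illustration under my own assumptions, not AgilEntity code; the `LocalRouter` class, its methods, and the `couple` function are hypothetical names invented for the sketch.

```python
from collections import Counter, defaultdict


class LocalRouter:
    """Hypothetical sketch of a per-group local guess: track which agent
    in one small workflow group has historically completed each action
    type most often, and route new actions of that type to that agent."""

    def __init__(self, group_id):
        self.group_id = group_id
        # action_type -> Counter mapping agent -> completed-action count
        self.history = defaultdict(Counter)

    def record(self, action_type, agent):
        """Learn from one completed action within this group only."""
        self.history[action_type][agent] += 1

    def route(self, action_type):
        """Local guess: the best-known agent for this action type,
        or None if this group has no signal for it yet."""
        counts = self.history.get(action_type)
        return counts.most_common(1)[0][0] if counts else None


def couple(a, b):
    """When two groups collaborate, mix their action histories into a
    router for the new, larger regime. The original local routers are
    left intact, so high-resolution local signals are never destroyed."""
    merged = LocalRouter(f"{a.group_id}+{b.group_id}")
    for src in (a, b):
        for action_type, counts in src.history.items():
            merged.history[action_type].update(counts)
    return merged


# Usage: two small groups build local signal independently...
sales = LocalRouter("sales")
sales.record("approve_quote", "alice")
sales.record("approve_quote", "alice")
legal = LocalRouter("legal")
legal.record("review_contract", "bob")

# ...and only coupling creates a router over the combined regime.
joint = couple(sales, legal)
```

The design point is that `joint` only exists once the groups actually interact; there is never a single model trained over all agents' actions at once, so one group's patterns cannot bias another's routing before collaboration occurs.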

I posit that this is the optimal way to perform predictive routing over agents' action histories, and that it is far less susceptible to the gross errors of global-guess algorithms like the one used in Google Flu Trends, where simple but critical assumptions about the sample space led to wildly inaccurate "predictions".
