
When the crazy and the stupid can build monsters in their basements...

Never leave out insanity or stupidity... both are amazing motivating factors that drive people to do incredibly horrendous things. I am saddened on a *daily basis* by the ignorance that I come across on any given outing into NYC. If I speak to enough people on enough subjects, I will encounter multiple people who believe things that can't even be termed wrong... they are simply in the wrong universe.

With that element out there in the human population, I am all the more pessimistic about our chances as more and more powerful ways to change the world drastically are democratized (like building your own living organisms, as synthetic biology will enable us to do).

When I was a 13 year old writing BASIC programs, I couldn't write a program that could reach my neighbor's PC and erase his hard drive because a) I didn't know enough BASIC and networking code then, and b) neither of us had a modem, so even if I had known how to write the code, there was no network or viable vector for me to convey it to his computer clandestinely. Sure, I might have been able to sneak into his place with a floppy and load the code directly, but that would have carried a greater risk of being discovered.

That code, computer code moreover, could at most erase the data on his drive... which would have been painful for him, but his life would be intact.

Fast forward to a summer day in Pakistan in the year 2037. Synthetic biology at this point has long since reached the stage where we have programming languages for writing living things (something I predicted in 2008, but see this article if you don't believe they are already being planned), and such programming languages, along with cheap genetic creation kits, will allow... zealots... of all stripes to create not computer code but real organisms. Super pathogens... imagine what happens when the code the kid creates, instead of infecting a hard drive, is targeted to infect *the neighbor* and, say... I don't know... change his hair color gene from black to red... ha ha, so funny.

Except it's not so funny, because it could be done. We pretty much now know we can change anything if we have the right vector and we know the sequences that need to be modified.

If you aren't afraid yet, you simply don't understand the gravity of the situation.

We are in deep shit... if people fueled with zealotry (of the religious kind... of the political kind... of the CRAZY kind!) get their hands on such technology. And that technology is coming; in fact, it's really here... just kind of expensive, but it won't stay that way.

Either we change the collective tendency toward zealotry in all people (there should be about 11 billion of us by then) or we are going to face all types of crazy biological and genetic attacks that will make the current era of computer viruses and worms seem like schoolyard taunts.

The sad reality is that even if we succeed, by some miracle, in completely eliminating the desire of those with differing views to take existential measures against their opposition using some bio/genetic weapon, there will still be the crazy and the stupid. They are a silent sample of all those born... and, fueled by their *interpretation* of whatever beliefs are out there, a fraction of them will be inspired to take action using the then readily available and powerful techniques for creating living things. That doesn't give us a high probability of avoiding serious situations on a continuous basis once those technologies are widespread. It's just a probabilistic reality, and that is depressing.


