
old object models to new object models....

I've spent the last couple of days researching the hot trend in software service enablement made possible by the idea of web services. I remember back in 2000, when I was still working as a network and PC support technician, reading a book on the emerging tool of XML. At the time, the growing need of web sites to store and provide content, thanks to the exploding number of individuals coming online to buy, trade and interact, necessitated a way to streamline content creation, transformation and delivery to different locations in different formats. XML was the magic bullet that could provide all these things when used cleverly. I soon found myself, during my free time at work, reading the chapters on how XML worked and how to perform XSL transformations. I became convinced that XML-related technologies were the future, and with that conviction in mind I began a secret hunt for a job while I was still working at the bank. I was sufficiently enthusiastic and competent to land a job with TSC within a matter of weeks.

While at TSC I got to experience first hand the revolution of XML and the power it wielded over the problem of controlling web content. In the system I worked on, nostalgically called "rosebud" by its designers, XML was the rails upon which the content of the site rolled. A few simple relationships between XML content blocks and the pages to be rendered in the StoryServer development environment, along with a distribution method to multiple outbound locations by file or FTP, allowed the site to publish content in house and then distribute it in various forms to outside partners (for a fee) as well as to the main web sites of the company. The experience of building many of the XML feeds that brought revenue to the company convinced me further that XML was the future.
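The rosebud pipeline itself ran on StoryServer, but its core idea, one XML content block transformed into multiple outbound formats, can be sketched in a few lines of Python. The story markup and the two partner formats here are invented for illustration, not taken from the actual system:

```python
import xml.etree.ElementTree as ET

# A hypothetical content block, loosely modeled on the kind of
# story markup a system like "rosebud" might have carried.
story_xml = """
<story id="1042">
  <headline>Markets Rally on Earnings</headline>
  <byline>Staff Writer</byline>
  <body>Stocks climbed today as quarterly results beat estimates.</body>
</story>
"""

def to_partner_html(story: ET.Element) -> str:
    """Render a story as an HTML fragment for one outbound partner."""
    return (f"<h1>{story.findtext('headline')}</h1>\n"
            f"<p class='byline'>{story.findtext('byline')}</p>\n"
            f"<p>{story.findtext('body')}</p>")

def to_headline_feed(story: ET.Element) -> str:
    """Render the same story as a pipe-delimited headline feed line
    for a different partner."""
    return f"{story.get('id')}|{story.findtext('headline')}"

story = ET.fromstring(story_xml)
print(to_partner_html(story))
print(to_headline_feed(story))   # 1042|Markets Rally on Earnings
```

The point is that the content is authored once; each outbound partner gets its own transformation of the same source block.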
I imagined a system that could serve as an XML nexus: taking in content as XML, transforming it for the needs of the system, and then transforming it again into the desired output formats for delivery to external partners. Though we did do this to an extent at TSC, it was not the gist of the company's business, which was to provide financial news and commentary; content distribution was only a side effect of the business's goals. I saw XML as more than something useful for financial news content management. I saw it as something that could potentially manage any outbound or inbound data interaction between disparate systems. As I continued to work at TSC, proposals to use XML as a lingua franca of data transmission between different platforms were born. Wrapped into the moniker of "web services", these XML grammars would allow systems to speak a common language that remote systems could use to authenticate, request method or object access to various entities on remote systems, and receive the requested data in synchronous or asynchronous fashion. Since then, web services have flowered with various protocols for ensuring that certain tasks are performed in efficient and safe ways for use in the enterprise, security being chief among them, as covered by the WS-Security protocol.
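The common language these grammars converged on is the SOAP envelope. A minimal sketch of building a SOAP 1.1 request and parsing a response follows; the quote service namespace and `GetQuote` operation are hypothetical, and the canned response stands in for what a real remote system would return:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace (real); the service namespace below
# is invented for illustration.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/quotes"

def build_request(symbol: str) -> bytes:
    """Build a minimal SOAP 1.1 envelope asking a remote system for a quote."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    req = ET.SubElement(body, f"{{{SVC_NS}}}GetQuote")
    ET.SubElement(req, f"{{{SVC_NS}}}symbol").text = symbol
    return ET.tostring(env)

def parse_response(payload: bytes) -> str:
    """Pull the result element out of a SOAP response body."""
    root = ET.fromstring(payload)
    return root.find(f".//{{{SVC_NS}}}price").text

request = build_request("TSC")

# A canned response, standing in for the remote system's reply.
response = (b'<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
            b'<soap:Body><GetQuoteResponse xmlns="http://example.com/quotes">'
            b'<price>12.34</price></GetQuoteResponse></soap:Body></soap:Envelope>')
print(parse_response(response))  # 12.34
```

Neither side needs to know what platform the other runs on; both only agree on the envelope and the service's XML grammar.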

Today, web services form the backbone of another raging buzzword of the last few years in information technology: SOA. Web services are what make SOAs possible, providing the underlying fluid interface for mediating data and object requests between applications internally and between enterprises externally. Unfortunately, with all this new use of XML to do things it was not originally designed to do comes one major issue: complexity. The web services protocols are not very intuitive, but once it is realized that all they do is allow remote applications to talk to one another, old dinosaurs familiar with older acronyms like CORBA, DCOM or COM will know just how to deal with the new parts once they are mapped to their object model equivalents. The key difference between web services and the previous object models, however, is how agnostic they are of the platforms they run on. The industry has actually succeeded in providing a deep level of platform neutrality that has enabled enterprises to reach a new level of efficiency as far as remote data mediation is concerned. There are still many issues to work out in the area of security, and many things that are done inefficiently by the inherent nature of using an XML grammar to do them (like sending files), but these are finding solutions. The complexity of these new solutions must be absorbed and abstracted just as it was for the old component object models, through late nights and lots of coffee, but the promise of infinitely interoperable systems free from lock-in is actually palpable. Is this current round of object model technology going to be the last one? Or in a few years, will web services be deemed old news and replaced by another newfangled technology that does the same old job in a new way? Only time will tell...
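The file-sending inefficiency mentioned above is easy to quantify: binary data cannot travel inside an XML document as-is, so it is typically base64-encoded, which inflates it by roughly a third before any envelope overhead is even counted.

```python
import base64

# 256 KiB of arbitrary binary data, standing in for a file payload.
payload = bytes(range(256)) * 1024
encoded = base64.b64encode(payload)

print(len(payload))                # 262144 bytes on the wire as raw binary
print(len(encoded))                # 349528 bytes once base64-encoded for XML
print(len(encoded) / len(payload)) # ~1.333, the 4/3 base64 expansion factor
```

This is the kind of overhead that later attachment schemes for web services set out to avoid by carrying the binary outside the XML body.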

