
the rebuild is going....

I anticipated before starting the new development box build that I would run into driver issues with the current installation of Windows 2003. The radical change of motherboard, video card, memory and processor underneath the existing system was sure to make the OS go nuts on the first boot. I have done hundreds of PC builds, and the process proceeded with clockwork regularity: I got everything looking nice and tidy and ready to go, set the PC, open boxed, on top of one of its twins, plugged it in, pushed the power button, and it hummed quietly to life with the single beep that is the motherboard's call of "OK". Within seconds, though, the screen flashed a BSOD indicating a stop error, the worst kind that Windows throws about...not so surprising...but this is when the fun starts...


SATA and IDE drives: the newer motherboards are being shipped with the Serial ATA (SATA) port specification taking over from the older PATA IDE spec. The smaller connectors let the motherboard makers save significant board space for other things, or just make the board smaller (and cheaper); they are also reducing the number of PATA ports on the motherboards. I spent almost 30 minutes on newegg.com trying to find a motherboard that offered the classic two PATA IDE ports. I couldn't find any, and settled on using a PATA to SATA converter board on my DVD drive so it could work on a SATA port while the two hard drives continue to use the now lone PATA IDE port on the motherboard. This just may be the reason for the issues I have run into...

I booted up the machine again after coming up with a plan to upgrade the Win2k3 installation, but I feared that the PATA to SATA converted DVD drive would give me some problem. Sure enough, attempting to boot off the bootable CD loaded into the drive failed. The drive itself gets power and loads the disk, and it is listed as a SATA device in the motherboard BIOS screen listing attached devices; it simply won't recognize the disk on boot up.

So I came up with another idea. As a PC and network consultant at Chase Bank in 1998, I helped co-design a method for installing software over the network by allowing a "brainless" (no active HDD) or "mouthless" (no active optical drive) machine to get the required data across the network. It involves configuring a network boot disk with the necessary files to enable any NIC onboard or attached to the motherboard. I've been using this technique quite successfully to load OSes on machines with clean hard drives that, for one reason or another, fail to boot the OS disk from the optical drive. I also use it to perform Ghost restores to such machines if they have issues running any restore disks locally. I decided to configure a network boot disk for the new machine; this required updating the NIC drivers in the DOS files to reflect the newer RealTek driver used by the onboard 10/100/1000 NIC on the motherboard. After getting this done, the network boot was successful! I loaded the Windows disk into a working machine on the network and shared the drive, then attached to the share from the DOS prompt on the build machine using the trusty DOS net use command. Changing directory to the shared disk on the remote computer got me to the i386 folder, and I was ready to run the install using the winnt.exe command at the prompt. Of course, doing this only revealed more "fun"...in an ominous message from the Windows installation routine.
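For reference, the sequence at the DOS prompt looked roughly like this (the machine name, share name and drive letter are illustrative, not my actual setup); running the last command is what produced the message below:

    REM map the shared CD-ROM drive on the working machine to a local drive letter
    net use Z: \\WORKSTATION\WIN2K3CD

    REM change to the i386 folder on the shared Windows disk and start the installer,
    REM telling it where the source files live
    Z:
    cd \i386
    winnt /s:Z:\i386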

"An Internal Error Has occured."
"Could not find a place for a swap file."
"

At first I assumed this error occurs when the partition that I am attempting to install to runs out of space, and that the initial DOS partition created on the hard drive was simply too small.

A little investigation revealed that this error can occur when the "conventional" memory space of the DOS mini OS loaded by the boot disk runs out. Conventional memory is restricted to the small memory sizes that were normal during the days of DOS. Obviously, today's programs use significantly more memory to run, but in order to maintain compatibility the additional memory is addressed as "extended" memory blocks mapped onto the oceans of standard RAM that come in even the most basic of PCs today. I have 2 gigabytes of RAM on this machine, an amount equivalent to the entire hard drive capacity I had on my first PC, purchased in 1997. In any event, I was confused at first as to why it was running out of memory. A quick read-up on the issue revealed a possible solution: load only the programs needed, to save on conventional memory use. After hours of trying various combinations, I finally got the installation to run across the network. I proceeded to install the OS; by this time another day had turned to night and it was now early morning. I was completely drained and, since the OS was installed, decided to give up the game until the next day.
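For the record, the kind of trimming that finally freed enough conventional memory looks roughly like this on the boot disk; the exact driver and TSR file names depend on the particular boot disk and NIC package, so treat these as placeholders rather than my literal files:

    REM CONFIG.SYS -- load DOS and drivers into high/upper memory to spare conventional memory
    DEVICE=HIMEM.SYS
    DEVICE=EMM386.EXE NOEMS
    DOS=HIGH,UMB
    DEVICEHIGH=IFSHLP.SYS
    FILES=30
    BUFFERS=20
    LASTDRIVE=Z

    REM AUTOEXEC.BAT -- start only the network client, loading what can be loaded high
    LH SMARTDRV.EXE
    NET INITIALIZE
    NET START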


Next day....

After a short stint of five hours' sleep, I got up and picked up where I left off the night before by installing the required video driver and motherboard drivers for the onboard devices. While doing this I noticed that the hard drive partition I had created to use as an installation drive was showing up in the Windows "My Computer" view. I didn't want this, so I figured it would be easy to simply boot into DOS and remove the partition, after first changing the Windows boot.ini file to point to the second partition (which actually held the installation files) as the boot partition. I even moved the necessary boot files over to the second partition to make sure it would boot properly. I knew I was taking a chance on the machine not booting up again, but I tried it just the same. I rebooted the machine and after several tense seconds was greeted by an "NTLDR is missing" error, which told me that the boot.ini change must not have been saved. I was now back at square one with a broken installation of Windows on the machine! It felt like I was in the Twilight Zone!
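For context, the change I intended amounts to pointing the ARC path in boot.ini at the second partition instead of the first, along these lines (these are the typical default entries, not my exact file):

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(2)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Windows Server 2003" /fastdetect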

I took a deep breath and began to set up the network installation of Windows all over again, this time fearing that it would fail to start for the same reason as before: no memory. Sure enough, the first time I tried it, it failed in exactly that way, but after clearing the partitions and formatting them with system files it worked. I installed Windows again, installed the drivers again, and for a second time tried to eliminate the DOS boot partition. This time, after a reboot...it worked!
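The cleanup that made the second attempt succeed was the standard FDISK/FORMAT routine from the boot disk, roughly as follows (drive letters are illustrative):

    REM from the DOS boot disk: delete and recreate the partitions interactively, then reboot
    FDISK

    REM format the small boot partition and copy the DOS system files to it so it can boot
    FORMAT C: /S

    REM format the partition that will hold the Windows installation
    FORMAT D: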

I spent the next three days reinstalling and configuring all of my software on the machine. Luckily, my application and data partitions were untouched and contained the most recent code up to the point that the last motherboard blew out. I was able to open Ghost Explorer and extract various files from the images that were part of the previous system. Here I am now on the new machine, all my development applications (VSS, XmlSpy, Advanced Installer, MySQL Query Browser, NetBeans IDE, Java SDK, etc.) installed and tested. My code, loaded from the backup, compiled and built a new distribution just fine...after two weeks of downtime I am back!

Lessons learned:

1) Ghost is great for restoring images to a machine so long as that machine's underlying hardware doesn't change significantly. I've restored Ghost images on a machine after changing the optical drive, video card, NIC or HDD, but none have ever survived a change to a new motherboard of a different brand...so always make physical file copies of your data AS WELL AS full partition or disk images using Ghost or a similar tool. Since I could use Ghost Explorer to pull files out of the images, I didn't strictly need a separate file copy, but if I didn't have Explorer those images would be useless to me. Copying the files makes the data more accessible, so that should be done as well.

2) The hardware-software interface is very, very kludgy even today and fraught with bad design choices. From the ridiculous "out of memory" errors that the Windows installation throws (the code should automatically figure out a way to use the available memory to get the job done, even if performance is impacted; a slow installation is better than no installation) to the madness of formatting partitions and setting active partitions, a great deal of complexity during the build stage could be eliminated with better designed lower level programs, particularly in the Windows installation code.

This experience is also a good example of being prepared for anything. I knew that the development box was last rebuilt over 5 years ago, so I knew something was about to go wrong, but that "something" is usually not a motherboard burn out. That said, had the hard drive burned out instead, I would have been worse off despite being able to recover from the event more easily using the Ghost images...the reason is that up to the motherboard burn out, the last Ghost image I'd created was two weeks old! In those two weeks I solved some major production issues (e-commerce integration with Amazon, permission token logic coded into the core code); it would have been quite a bit more taxing for me to reinvent those solutions than to install software on a fresh installation of Windows. So always keep your backups up to date! I originally was making Ghost images every Friday but fell off as I neared site launch. Wrong move! I am going to make an image as soon as I'm finished with the rest of the software installation on the box and be sure to keep it up to date every week.
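As a reminder to myself, creating that weekly partition image from the Ghost boot environment is a one-liner along these lines; the disk/partition numbering and destination path are illustrative and the switches are from memory, so they should be checked against the Ghost documentation before relying on them:

    REM dump partition 2 of disk 1 (the application/data partition) to an image on a mapped network drive
    ghost.exe -clone,mode=pdump,src=1:2,dst=Z:\images\devbox.gho -sure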
