I make a living convincing applications that were never "installed" to run. For this, I have many people to thank.
I now take this opportunity to personally thank:
- The inventor of the Windows Registry
- The inventors of COM and OLE and specifically the COM use of the registry for object registration! Awesome!
- The person who thought it would be a good idea to share C Runtime DLLs under the \Windows space to save RAM
- Application vendors that don't "get it" that user settings go in the "user" space and that you can't update \Program Files at runtime.
The fundamental item that keeps me happily working is that the majority of this stuff entails INSTALLATION TIME configuration activity that arguably shouldn't exist. Looking back to DOS 3.3, life was much simpler. Here's how I used to do it:
- The operating system goes onto drive C:
- Applications and data go on drive D:
Assume you need to reload the OS frequently – or that you move from machine to machine to machine all the time. Nicely, this hasn’t changed. After installing the operating system, you update PATH in the autoexec.bat and POOF! You’re DONE!
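The whole "installation time" configuration fit in a line or two of autoexec.bat. A minimal sketch of what I mean (the D:\APPS and D:\UTILS directory names are my own convention for illustration, not anything DOS required):

```
REM autoexec.bat -- the entire "install time" configuration, circa DOS 3.3.
REM Point PATH at the OS on C: and the applications on D:, and you're done.
PATH=C:\DOS;D:\APPS;D:\UTILS
SET TEMP=C:\TEMP
```

Reload the OS, restore this one file, reboot, and every application on D: works again.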
Today (I mean, really today – right now), I'm reloading my primary dev box. I've been reloading it for about a week, an hour here and an hour there. Writing this blog gives me something to do while the various programs re-install. Thankfully, all the "streamed" apps are instantly available.
Why reloading? Well, something got confused in the registry and it was no longer willing to work right. I'm a certified expert at this stuff, and the best I can come up with to describe the problem is that it is "no longer willing to work right". Yes, I could probably diagnose why that application install I ran blew the machine away, but it isn't worth it. I'll just reload the machine and be done. Machines seem to behave faster after a reload anyway.
HEY – This is one of the driving factors for application isolation. Prevent things from getting out of whack.
It's all my fault though – I messed with it. I installed something and the machine then requires a reload. Why is this my fault? I'm struggling with the advancements in computer science that now require even a single application installation to be database driven, to update system DLLs and executables as well as the application space itself, and to include numerous "registrations" such as COM. All of this stuff seems superfluous. In 20 years, I've gone from a 4.88 MHz machine that I could reload entirely in 30 minutes to a 2,000 MHz, four-processor machine that now takes about a week to get back into working shape.
Consider also that I have to be an "administrator" to install applications. Why is this? Most applications are just executables and data files. If I have a "My Documents" folder, why don't I also have a "My Applications" folder? My applications should be mine, located some place other than where the OS is, and they should be "installable" on another computer with nothing more than XCOPY /S /E.
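In that world, moving your applications to a new machine really would be one command. A hypothetical sketch ("My Applications" is my wished-for folder, and M: stands in for a drive mapped to the new computer; neither exists today):

```
REM Copy the whole application tree to the new machine.
REM /S copies subdirectories; /E includes the empty ones too.
XCOPY "D:\My Applications" "M:\My Applications" /S /E
```

No registry entries, no COM registrations, no shared DLLs to reconcile – just files.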
A Xen World
What I really want is a pristine Operating System image, with a bunch of applications streamed on top of that image, plus my user data. I don't really care how the applications get there. My administrator should just take care of this. All of it should be maintained by my administrator, because even as a techie, I don't want to deal with updating the applications or the operating system. If I were a "real user", I would have little patience for all this configuration stuff. Give me an icon and let me do my work. It should be that simple, but interestingly, getting to this centrally managed world with low cost and simple maintenance is a hard problem. It's such a hard problem that Citrix, VMware, Microsoft and all the vendors in this space are working on exactly it as the next big thing in the computer world.
As computers evolve, I dream back to the easy days of 20 years ago. Maybe in 10 more years, life will be as simple as the olden days.