What is Cloud Computing anyway? Citrix recently sponsored the Cloud Computing Pavilion at Oracle OpenWorld 2010 at the Moscone Center in San Francisco. I engaged in many interesting conversations, but the one that was interesting and funny at the same time was with a gentleman who walked up to the booth and asked what Cloud Computing was.
After a brief explanation, he got it. Sensing our shared tenure in the field, I directed his thoughts to the old computer timesharing model of the 1960s. Cloud Computing is like computer timesharing: you pay rent for compute time on computers that someone else owns, either because you can’t afford to build the datacenter yourself or because you are simply cost-conscious.
I remember back when I was going to school as a Computer Science major, and the only computers available to us were large shared machines. We had a PDP-11/70, a minicomputer made by Digital Equipment Corporation (DEC), that we did all of our coursework on. We logged in from a terminal room using monochrome terminals. The room that housed the machine was locked, and only certain people had access. When you printed something out, you had to go to the printout window on the 5th floor, where someone would hand you your print job when it was finished. When I started my career, I wrote code for the DEC VAX, another mainframe of sorts. At that time, the client-server model was just beginning to take shape, so these large shared machines were what constituted time-shared compute power. Servers running on personal computers (PCs) were just emerging. It was only a matter of time before the micros overtook the macros. This transition put computing power into the hands of many at a low cost.
The pleasant part of where we have evolved to is the richness and depth of the computing resources available to us. I remember the first Mac, Windows v2.0, and DOS 3.0 — those were dark days indeed. Back then, the Internet did not exist as a commercial resource, GUI was not a household word, and if you had a 9600-baud modem you were privileged. Color screens were a luxury, and word processors required “codes” to be placed in between the words to allow for special printing and formatting … hey, isn’t this HTML déjà vu?
Bit rates and bus sizes have gotten bigger. CPUs have gotten faster, and you can now own your own computer, vastly more powerful than the mainframes of yesteryear. But why do that when you can rent one in the cloud? The more things change, the more they stay the same. I am not sure how many people could have predicted that we would come full circle, once again renting time on someone else’s infrastructure – cloud computing.
Similarities of Cloud Computing vs. Computer Timesharing:
- Rent compute time
- Simple, Static screen
Differences of Cloud Computing vs. Computer Timesharing:
- For $20 a month, anyone can own a Virtual Computer, of any operating system flavor
- For a small fee, you can create your own server farm or datacenter
- Graphical User Interfaces (GUI)
- Pointing Devices (Mice)
- Full color virtual screens
- Client-Server technology
- Web Servers
- Applications & Databases
- Networking & Internet
- Virtual Machines
- Virtual Desktops
- Virtual Applications
Whatever era you spawned from, it is clear that computing is richer and deeper in features than it ever was. Today we have load balancing to improve server performance, caching to speed data delivery, compression to reduce bandwidth consumption, and SSL to encrypt and secure communications – concepts that I am sure didn’t exist back in the day. Now applications are being offered as “Software-as-a-Service” from computers in the cloud. Corporations can very cheaply give employees access to virtual applications using XenApp and virtual desktops using XenDesktop from the cloud – regardless of which personal computer operating system was requisitioned through the “Bring Your Own Computer” program.
Does anyone remember punch cards?
What similarities and differences can you think of with regard to “Cloud Computing” vs. “Timesharing”?