It’s been just over a year since I joined Citrix, and this evening I thought I would carve out a few hours and look back on an incredible year. I’ve been exposed to so many new experiences and great people who have helped evolve my thinking. I’ve witnessed major steps forward with the product, but rather than list every product accomplishment and bore you to death, I thought instead I would say thank you. Thank you to my colleagues who have made me feel so very welcome and put up with me pushing us forward. Thank you to the many people in the community who always keep me honest, who have sincerely opened up to share their insights with me, and who have given me the opportunity to share my thoughts with them. For this I am truly grateful.

Since I am in the mood for reflecting, I felt like scribbling down my thoughts on the evolution of desktop management as I have experienced it in my career, relaying some of the conflicts and thinking I have encountered while speaking with a broad customer base over the last year, and discussing how I think we need to move forward towards a better future.

The evolution of desktop management

In the early days of enterprise desktop computing, desktop images were built from floppy disks/CDs and deployed by sneakernet, with applications manually installed using native setup routines. In time these desktop images became personal to the user, who configured them as they pleased and installed whatever software they wanted. This approach never scaled: inventory was impossible to maintain in an enterprise environment, and personalization never persisted when users moved to different machines. This led to a series of innovations.

  • Image management technology that allowed you to clone and personalize a machine became available. Microsoft made it easier to automate image creation with Sysprep, cloning tools like Ghost became popular, and Altiris later extended the concept to delivering images over a network.
  • Microsoft added technologies to Windows, such as roaming profiles and folder redirection, to persist personalization across machines and enable roaming use cases.
  • Application setups started to become more standardized, and products like WinInstall gained early market share with a software packaging format that streamlined deployment.

As organizations began to acquire more PC assets, the need to manage the inventory of software and hardware became more important. This led to the creation of systems management tools that included inventory management, such as Microsoft Systems Management Server (now SCCM) and Marimba (acquired by BMC), and that also offered sophisticated machine-focused software deployment at scale.

With the plumbing in place, this set of tools powered many enterprises, but cracks started to appear as things scaled up and organizations became more complex. Packaging formats like WinInstall required manual admin intervention every time a setup program was prepared for enterprise deployment; I don’t recall ever seeing a WinInstall software package shipped by a 3rd party. There was no consistency in the industry, and application packaging was cumbersome. In addition, Windows itself had a problem known as DLL hell: applications that shared DLL components on the operating system would collide with each other, and the WinInstall packaging format could not reconcile these conflicts. There had to be a better way.

Two new application-packaging formats began to emerge: MSI and OSD, both with Microsoft involvement. In short, despite OSD being a very flexible format, MS Office shipped as a componentized MSI. While the MSI format did not solve DLL hell, it made conflicts easier to reconcile. MSI packages started to take hold within the broader ecosystem and became an accepted standard, and many 3rd parties began shipping their installers as MSI packages, which were much easier for IT to customize. I certainly recall repackaging many MSIs when migrating from Windows NT 4.0 to Windows XP, which I think was the big event that really stimulated the change. To make this migration easier, packaging authoring tools like InstallShield and Wise (not the thin client people) gained traction. Many enterprises and service providers standardized on the MSI format and authoring tools because they could simply modify a 3rd party MSI as opposed to having to repackage the entire application from scratch. The electronic software distribution (ESD) companies started to embrace enterprise deployment of MSI packages.
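To make that customization point concrete, here is a minimal sketch of a silent, customized install of a vendor MSI driven from Python rather than a full repackage. The msiexec switches shown (/i, TRANSFORMS=, /qn, /l*v) are standard Windows Installer options; the package, transform and log file names are hypothetical placeholders.

```python
# Minimal sketch: install a vendor-supplied MSI silently with an IT-authored
# transform (.mst) layered on top, instead of repackaging the application.
# File names and paths below are placeholders for illustration only.
import subprocess

result = subprocess.run(
    [
        "msiexec",
        "/i", r"\\fileserver\packages\app.msi",   # vendor MSI as shipped
        "TRANSFORMS=corp_settings.mst",            # IT customizations applied at install time
        "/qn",                                     # fully silent, no UI
        "/l*v", r"C:\logs\app_install.log",        # verbose install log for troubleshooting
    ],
    check=False,
)

# 0 means success; 3010 means success but a reboot is required.
print("msiexec exit code:", result.returncode)
```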

This model continued for several years and is still the dominant one in many organizations, but once again, after a few years of operation at scale, the flaws became evident. MSIs required high admin skill sets, outsourcing did not make packaging significantly cheaper, and DLL conflicts were still there. In addition, the amount of software in the market was growing, changes to existing software were occurring more rapidly, and security patching frequency was increasing to the point where zero-day vulnerabilities gave customers little if any time to regression test remediation patches for both the OS and applications.

The change management problem

As the number and frequency of application and OS changes grew alongside increasing organizational change, the distributed computing model required governance. With so many moving parts and interdependencies, making a change is complex and risky because one does not know how it will affect so many customized, distributed images. As a result, organizations have built up sophisticated change management processes and teams that police this complexity, trying to enforce some known state.

This usually means that updates are slowly managed into the environment, often going through rigorous testing, QA, UAT and phased production deployments. Exceptions are made for security-related changes, and accelerated processes are available. However, this increases the risk of poorly written security patches impacting production environments, as evidenced by the recent XP patches in February 2010 which affected many customers. As you increase the scope from security patches for the core OS to those for core applications, the complexity and risk become exponentially greater.

When a problem does occur, there is seldom a reliable way to roll back a change for security patches or MSIs, or to keep track of the environment’s state. (I still don’t know of any organization that is truly comfortable uninstalling MSI packages at mass scale.) This is often addressed with phased rollouts into production after the initial rounds of testing and verification. Even when you have a successful rollout, there is then the task of maintenance: you may have rolled out successfully, but you then need to go and verify the bits actually got there, and all of a sudden you are in the business of configuration management.

Expand the scope once again to include regular applications that require frequent changes, add the need to keep the environment stable, and the net result is poor flexibility and agility for the enterprise. Developers especially, in my experience, hated having to go through the whole packaging process and actively sought ways to get around a system that was too slow for them. Java Web Start, a simple ESD that pulls from a single source and requires no admin rights, started to spring up all over the place, and there was no real way to police it effectively without implementing yet another agent to deal with application whitelisting or blacklisting. That’s what the Java developers did; the Windows developers were stuck, and converting their apps to Web apps was, in 99% of use cases, a pipe dream.

The rise of XenApp and the profile guys

To help address the change management problem, one use case for our XenApp customer base is to offer environments that supplement the corporate desktop where rapid updates are required for a particular application. (A lot of my early experience with Citrix technologies came about because developers had an update problem.) For similar reasons, customers use low-cost hosted virtual desktops for groups of largely similar users to enable faster updates and simpler management. However, for both of these use cases there are limitations, as sharing an OS does restrict the ability for users to personalize. Sure, you can get a long way just using standard OS features such as roaming profiles and folder redirection. For many applications, especially internally developed ones, it was not a big deal because those applications would write their customizations somewhere within their own infrastructure if the native OS features were not good enough. However, it is not so easy for 3rd party applications. As a result, a number of vendors such as AppSense, RES and Tricerat have innovated to make the XenApp environment more customizable per user. In the case of Citrix, this is known as the User Profile Management feature, which we got from a company called Sepago. Even VMware acquired a company called RTO to help them with this problem, though that technology is currently not part of the VDI-only View offering.

However, while this seems obvious to XenApp-minded people, it’s not necessarily intuitive to regular desktop admins. A recent blog post helps illustrate this point: many desktop admins have yet to take the first basic step of enabling roaming profiles instead of local profiles, which do not let users change machines and keep some level of personalization.

So why is there this discrepancy?

To understand this, I believe it’s important to keep in mind that desktop users typically use one device and therefore roam less. With XenApp, users roam all the time from a myriad of devices and share an operating system, so the XenApp community is more sensitized to the challenges of personalization when roaming. As desktop virtualization becomes more common, it’s not a leap of faith to assume that desktops will face similar challenges due to user mobility and device diversity. Therefore I think it will be increasingly important to abstract user preferences from the desktop.

So application virtualization is the silver bullet, right?

Several years ago, various application virtualization solutions started to appear on the market; Softricity (acquired by Microsoft and rebranded as App-V), AppStream (acquired by Symantec), Thinstall (acquired by VMware and rebranded as ThinApp), InstallFree, Xenocode and Endeavors are amongst the most well known. At Citrix we have built our own, a feature of XenApp called Application Streaming, and XenApp is also compatible with App-V.

The value proposition of application virtualization was that applications could now be isolated from the operating system to solve DLL hell, simplify application packaging, enable applications to be delivered to users on demand and reduce support costs. In addition, the hope was that one common base image could now be applied to all users, with the differences between users delivered via application virtualization and user profiles.

OK, I will say it: we are not at that reality yet. Application virtualization does not offer 100% application compatibility, although this is getting better over time. Other challenges include inter-application communication, x64 support in some cases, the need to repackage all existing applications (just like the old days when MSI was adopted) and integration with existing, scaled software distribution infrastructures. Certainly that’s not true in all cases; I am speaking broadly about where things stand today, but I expect adoption to ramp up as people migrate to Windows 7. IMO, for many customers application virtualization to date has been an evolution, not a revolution, and they will continue to use existing application management practices until they have a catalyst like Windows 7 and desktop virtualization.

In addition, numerous customers have commented that while the benefits of desktop virtualization are many, adopting it should not mean having to rip out existing systems management processes and tools on day 1. These are sunk costs, and often political battlegrounds with teams outside the desktop or Citrix groups, which can become barriers to adoption. Cultural change takes time.

Hmm, so we’ll take this in steps

As a result, some customers simply say they will phase in desktop virtualization: first move to the data center, and then adopt systems management changes over time. Others, of course, are more aggressive. Neither is wrong; it’s a question of priorities. I have yet to meet a single customer who thinks that moving to a simpler, more efficient management paradigm for the desktop does not make sense.

People handle this differently: some simply deploy assigned desktops with existing machine-based management tools in place (usually meaning they have a clear business need), others brute-force it (the forward thinkers), while the unfortunate ones get into circular internal debates and endless POCs that go nowhere (usually because there is no business sponsor).

So user-installed apps are the silver bullet and the best thing since sliced bread?

Once again, much debate is brewing over a potential technology solution. The idea here is that a single image could be managed by IT while users retain the flexibility to install applications: kind of the ideal fix for the lack of current management standards and processes, business users not understanding why some controls are needed, and IT’s inability to provide flexibility. One side argues that a single-image solution, with the ability for a user to install apps on top, gives them the ultimate flexibility and management simplification they need. Others argue that this is a stupid idea because, legally, users can’t just accept click-through software agreements for liability reasons, admin rights are an open door for malware, and of course we come back to the fact that 100% application compatibility is still not possible with today’s technology. Yet others will argue that this is too complex, will take years to mature, and that the market for it is still not proven. There is validity in each of these arguments.

It sometimes feels overwhelming, so why not just stick with the status quo?

The scariest thought of all

Sticking with the status quo? Yikes! I don’t say that just because I work for Citrix. I fundamentally think the change management problem is only going to get worse over time, and the status quo is not a sustainable model for the long haul, just as the original desktop models were not. Distributed management of devices, OS, apps and users is, to me, like a dog chasing its tail: we will never catch it. It will always be complex, heavy and slow. It will not allow users to roam easily across many devices and connect to work from the places that suit them best. IT will not be agile. It represents a machine-centric view of the world, when we should be building towards a user-centric vision of the desktop. What will a desktop even be in the future? That’s a great discussion, but what I think will be the same as today is users wanting to get to their stuff, easily and quickly. They want that kind of technology, something simple. When I look at other models, it is simple: I want to search, I go to Google, Bing, Yahoo, etc. and it just works. I want to buy a book, Amazon. It’s simple stuff to consume as a service on many device form factors. Now, those are probably oversimplifications, and I don’t really account for the session management work the desktop OS does for us with Windows applications, but IMHO we need to create something users want.

I look at the amazing uptake and customer interest in the Citrix Receiver on the iPad, and I scratch my head sometimes and wonder why. Why do our enterprise customers really like it? To me it boils down to the fact that users want it; it’s not a question of need. Users want to connect to work in a different way, and I see that as an incredible shift that is becoming more real every day. We’ve talked about consumerization for a while, and I continue to believe that, beyond any vendor, it will reshape the landscape. I’ve spent a lot of time talking to customers struggling with how to deal with this shift.

In addition, many CIOs I speak with tell me that two-thirds of their budget goes to operational costs and the remainder to innovation. They want to reverse that balance and build solutions that users value and that will enable their business. They are asking themselves questions like: if cloud means that at some point IT services will be consumed centrally, how will my organization need to adapt to take advantage? These are transformational, strategic discussions being shaped by forces larger than any single vendor. For many of these folks, desktop virtualization is just part of a transformational strategy to offer new IT services efficiently. To accomplish this, they are seeking ways to abstract state from hard-coded infrastructure so that it can be created on demand and used more efficiently. I don’t think any of these shifts towards IT as an on-demand service, or towards consumerization, are trade secrets; I think everybody in the industry is finding their way with these changing dynamics. In the case of Citrix, we are actively working towards and committed to simplifying the desktop and enabling new ways for people and IT to work. I like to think of it as the stateless desktop. Sure, we are not perfect and we have work to do, and there will be bumps along the way. However, I truly believe this is a worthy goal that creates a better tomorrow. It would be far too easy for me to simply give up and stay with the status quo because it is familiar, or to become preoccupied with a niche use case constraint. Fortunately, I have had the pleasure of meeting so many leaders who understand the changing dynamics and the power of a stateless desktop architecture, and who are navigating their organizations accordingly. Those are some of the most rewarding conversations I have had, and they are why, one year in, I am still so excited about having an opportunity to help shape the future.