Recently I've been hearing a lot of talk about this topic. A few people have suggested that in order to optimise the enterprise desktop, you separate your hardware layer, your OS, your apps and your user state. You abstract all four things from each other, leaving the desktop environment to be assembled on the fly when you fire it up.
Let's say you implement the following:
- A VDI solution of some kind, hosted either on a server or locally on the desktop.
- Sequence all your corporate apps using App-V or something like that.
- Keep your data off the endpoint perhaps using folder redirection.
- Mobilise user settings with roaming profiles.
The idea is that the more you separate all this stuff out, the more flexible your environment becomes. A user can sit down at any device and get the same experience and data as on the device they used yesterday. When they log in, their settings are applied to the machine, their applications streamed down on demand, their data available seamlessly. Sounds great, doesn't it?
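The "assembled on the fly" idea above can be sketched in a few lines. This is purely illustrative; the function and layer names are made up, not any particular product's API.

```python
# Sketch of the layered desktop: hardware, OS, apps and user state are
# independent layers, composed into a session at logon. All names here
# are hypothetical, for illustration only.

def build_session(device, os_image, app_list, user_state):
    """Compose a desktop session from four independent layers."""
    return {
        "device": device,              # any endpoint the user sits at
        "os": os_image,                # commodity image, centrally managed
        "apps": sorted(app_list),      # delivered/streamed on demand
        "settings": dict(user_state),  # roamed user state
    }

# The same user on two different devices gets the same desktop.
user = {"wallpaper": "beach.jpg", "signature": "Regards, Jo"}
monday = build_session("laptop-042", "win7-gold", ["office", "acrobat"], user)
tuesday = build_session("thin-client-7", "win7-gold", ["office", "acrobat"], user)
# Everything except the physical device is identical.
```

The point of the separation is exactly what the comparison shows: strip the device out of the picture and the two sessions are indistinguishable.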
My opinion? It's being looked at in the wrong light. The user community has proven time and time again that what they want is a desktop environment that performs well, looks pretty and, above all else, is FAST. The more you virtualise, the more you centralise, the more you separate, the further you move away from that core requirement that our colleagues, our customers and our friends are REALLY looking for. IT organisations often get so tied up trying to solve the challenges put in front of them that they forget this fundamental truth about user behaviour.
Now, I'm not suggesting for a second that we, as IT pros, shouldn't aspire to build flexible and agile desktop environments. I just feel that we often skip right past some of the less aggressive approaches, straight into the path of the virtualisation freight train currently passing through. Here are a few thoughts on virtualising each of these desktop components.
Virtualising the OS
Consider the return on investment here. If we now propose to separate the user settings, the apps and the data from the OS, then the OS has just become a commodity. Now that we have stripped away everything that really makes this desktop different from the next, the OS is simply a component in the infrastructure needed to provide the desktop service. It should be easily replicated, easily deployed and easily managed. So why virtualise it? Why not just build a nice efficient zero-touch deployment, installing and re-installing it indiscriminately?
Virtualising the Applications
This is a fairly hefty topic; I could go on forever on this one, but I'll try not to. To start with, it's worth pointing out the difference between application virtualisation and application streaming. The two go hand in hand, but they are not the same. Application virtualisation is the process of encapsulating an app away from the underlying operating system on which it is executed. Application streaming is the process of delivering the virtualised application: sometimes pre-staged, usually not, the streaming client pulls the pieces of the program down as they are called for, caching them locally on the desktop.
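The pull-down-and-cache behaviour described above can be sketched in a few lines of Python. This is a toy model of the general technique, not App-V or any real streaming client; the block layout and class names are invented for illustration.

```python
# Illustrative sketch of application streaming: program "blocks" are
# pulled from a (hypothetical) server on demand and cached locally, so
# repeat access never touches the network again.

# Fake server side: the sequenced package, broken into blocks.
PACKAGE_BLOCKS = {0: b"launcher", 1: b"core", 2: b"spellcheck"}

class StreamingClient:
    def __init__(self, package):
        self.package = package      # stands in for the streaming server
        self.cache = {}             # local block cache on the desktop
        self.network_fetches = 0    # count round-trips for illustration

    def read_block(self, block_id):
        """Return a block, hitting the 'network' only on a cache miss."""
        if block_id not in self.cache:
            self.network_fetches += 1
            self.cache[block_id] = self.package[block_id]
        return self.cache[block_id]

client = StreamingClient(PACKAGE_BLOCKS)
client.read_block(0)    # cold start: fetched from the server
client.read_block(0)    # warm: served from the local cache, no fetch
```

The first launch pays the network cost block by block; subsequent launches of the same features run from the local cache.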
So, in pulling apart the commonly proposed use cases for these two techniques, a few things become apparent. I keep hearing IT leaders suggest plans to virtualise every application they support. Why? Adobe Acrobat Reader can be easily packaged and delivered through software deployment tools such as SCCM or Altiris. Why sequence it? Why stream it? You'll only run into trouble when it's required at the other end of a slow link.
Application streaming can be extremely useful where we need to update an application package frequently. It can be handy if we wish to make an application available to a user from any computer, in a short period of time, without having to deploy the application to various PCs. Great, let's use technology like App-V in these situations. Don't forget, however, that this flexibility is not without compromise. If our users rely on the availability of an application during periods of degraded or absent network connectivity, then it had better be there! If we are streaming on demand, this may not be the case. The impact on user experience cannot be ignored either: whether over a speedy gigabit LAN or a high-latency satellite link, a streamed application is going to load more slowly than if it were installed locally, considerably so on slow links.
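The availability trade-off is easy to demonstrate with a toy model. A fully pre-staged (pre-cached) app keeps working when the link drops; an on-demand one fails the moment it needs a block it never pulled. All names and the block layout are illustrative assumptions, not any real product's behaviour.

```python
# Sketch of the availability trade-off for streamed applications.

class StreamedApp:
    def __init__(self, server_blocks, prestage=False):
        self.server = server_blocks
        # Pre-staging copies every block into the local cache up front.
        self.cache = dict(server_blocks) if prestage else {}
        self.online = True

    def read_block(self, block_id):
        if block_id in self.cache:
            return self.cache[block_id]
        if not self.online:
            raise ConnectionError("block not cached and network unavailable")
        self.cache[block_id] = self.server[block_id]
        return self.cache[block_id]

blocks = {0: b"exe", 1: b"help"}
prestaged = StreamedApp(blocks, prestage=True)
on_demand = StreamedApp(blocks)

on_demand.read_block(0)                      # pulled while online
prestaged.online = on_demand.online = False  # the link goes down
prestaged.read_block(1)                      # fine: cached up front
# on_demand.read_block(1) would now raise ConnectionError
```

If your users work disconnected, pre-stage; if you stream purely on demand, the network is part of the application.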
My preference is to virtualise only when it's called for; core applications like MS Office or Adobe Acrobat Reader can often be more easily packaged and delivered using conventional methods.
Redirecting User Data
This is a fairly frequently employed tactic, generally delivered by redirecting 'My Documents' to a network location. Doing this usually achieves what it sets out to: it keeps user data off the desktop, ensuring it's available from any computer as well as being backed up. Back on XP, the biggest thing to watch out for was the distance to, or availability of, that network location. So many things, both in WinXP and in various apps, were tied to 'My Documents' that if it suddenly sat at the other end of a slow link, or wasn't available at all, the whole machine would be brought to its knees. The OS would slow down, applications linked to the location would hang or throw errors; it could get quite messy.
Under Vista the situation got better, as 'Offline Files & Folders' was completely rewritten, and rewritten well. Most sysadmins turned and ran after trying to use OF&F on WinXP; it was clunky, unreliable and hard to manage centrally. On Windows Vista, the process became smooth and seamless. When explaining it to colleagues, I often compared the new OF&F to the cached Exchange mode of Microsoft Outlook, replicating seamlessly in the background without you noticing or being involved at all. So, now that there is a working way to make your network location available offline, you can sync the folder that 'My Documents' is redirected to, which resolves any issues with that folder being unavailable.
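From the user's point of view, the cached-Exchange-mode comparison boils down to this: reads go to the server copy when the share is reachable and fall back to the locally synced copy when it isn't. The sketch below is a toy model of that behaviour, not the real client-side caching implementation; the class and file names are made up.

```python
# Rough model of Offline Files from the user's perspective: a background
# sync keeps a local copy, and reads fall back to it when disconnected.

class OfflineFolder:
    def __init__(self):
        self.server = {}      # authoritative copy on the file server
        self.local = {}       # synced cache on the laptop
        self.connected = True

    def sync(self):
        """Background replication, Outlook-cached-mode style."""
        if self.connected:
            self.local = dict(self.server)

    def read(self, name):
        if self.connected:
            return self.server[name]
        return self.local[name]  # seamless fallback when off the network

docs = OfflineFolder()
docs.server["report.docx"] = b"draft 1"
docs.sync()              # replicates silently in the background
docs.connected = False   # user leaves the office
docs.read("report.docx") # still available from the local cache
```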
Things get even better under Windows 7, with the introduction of libraries. Now 'My Documents' is no longer a link to a single defined folder, but rather a collection of folders. I'm not sure to what extent Microsoft did this on purpose, but intentionally or not, they removed Windows' reliance on being able to see that folder. Since your network location is merely one of several defined as part of the library (it may be the only one, but that is beside the point), if it's not available its content simply won't be displayed, and no other adverse effects will present themselves on the OS. Naturally, you can still employ OF&F on the folders in your Windows 7 library if you want offline access.
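The graceful-degradation behaviour described above is simple to model: a library is just a list of locations, and an unreachable location contributes nothing rather than hanging the shell. The folder names and the availability flag below are invented for illustration.

```python
# Toy model of a Windows 7 library: aggregate every reachable location,
# silently skip any that are offline.

def list_library(locations):
    """Return the combined contents of all reachable library locations."""
    items = []
    for available, contents in locations:
        if available:  # an offline share is simply skipped
            items.extend(contents)
    return items

library = [
    (True,  ["local/budget.xlsx"]),        # folder on the local disk
    (False, ["//server/docs/plan.docx"]),  # redirected share, offline
]
list_library(library)  # only the local content shows up; nothing hangs
```

Contrast this with the XP behaviour described earlier, where one unreachable 'My Documents' path could stall the whole machine.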
Another feature of Vista and Win7 that makes redirecting user data easier is the separation of 'My Pictures' and 'My Music' from 'My Documents'. Hands up if you remember the backup team or the server team getting upset because some silly user just installed iTunes on their work machine and synced 20 GB of music to their home drive. Whilst 'My Music' may have seemed the obvious default location for Apple to put their music library, they definitely didn't think that one through, did they? You may take the hardline approach and ban your users from using iTunes or saving music on their machines, which is OK, but not always possible. I've never really taken issue with this kind of behaviour; people travel with their computers, or they sit in front of them all day, so what's the big deal if they want to listen to the latest Kings of Leon album while they do it, as long as it doesn't end up on my server!

Anyway, back to the point. Microsoft separated out the different locations, enabling you to redirect 'My Documents' whilst leaving 'My Music' safely on the desktop, where it won't cause problems. On Windows 7, the new libraries go a step further: you may wish to link 'My Pictures', for example, to two locations, one local and one on the server, allowing users to store work-related pictures on the server where they can be backed up, while personal photos stay on the PC, where they remain private and won't cause grief for your backups.
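For the curious, the split shows up in the per-user shell folder settings. Redirection is normally driven through the Folder Redirection node in Group Policy rather than edited by hand, but the effective per-user paths land under the 'User Shell Folders' registry key, roughly like this (the server share and username below are hypothetical examples):

```
Windows Registry Editor Version 5.00

; Illustrative only: 'My Documents' redirected to a file server while
; 'My Music' stays on the local disk. Paths are invented examples.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Personal"="\\\\fileserver\\home\\jsmith\\Documents"
"My Music"="C:\\Users\\jsmith\\Music"
```

Because the two values are independent, the documents folder can live on the server while the music library never leaves the endpoint.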
Virtualising the User State
I think the biggest deployment blocker for roaming user state has been the lack of granularity in doing so. Under WinXP, if you enabled roaming profiles, the whole profile had to come across to each PC. This thuggish approach was only just good enough to get by on fast networks with little need for portability. Profiles could grow to gigabytes if left unchecked, making this method cumbersome and unwieldy to support. Bit-level replication of profiles under Vista improved things a little, but physics is physics, so it still isn't great.
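A quick back-of-the-envelope model shows why granularity matters: copying the whole profile at logon scales with total profile size, while syncing only what changed scales with the day's edits. The file names and sizes below are invented purely for illustration.

```python
# Toy comparison of whole-profile roaming vs. granular (delta) sync.

def full_copy_cost(profile):
    """XP-style roaming: the entire profile crosses the wire at logon."""
    return sum(profile.values())

def delta_sync_cost(profile, changed):
    """Granular roaming: only the modified files cross the wire."""
    return sum(size for name, size in profile.items() if name in changed)

# Sizes in MB, made up for the example. One huge file dominates.
profile = {"ntuser.dat": 5, "appdata/editor.cfg": 1, "desktop/big.pst": 2000}
changed = {"appdata/editor.cfg"}

full_copy_cost(profile)           # the whole 2006 MB, every logon
delta_sync_cost(profile, changed) # just the 1 MB that changed today
```

One oversized file in an unchecked profile dominates every logon under the whole-profile approach; with any kind of delta sync it only costs you when it actually changes.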