In two recent blogs I described testing I performed to validate a Citrix VDI-in-a-Box and HP ProLiant reference architecture. The solution goes a long way toward simplifying VDI deployment, defining a low-cost, cookie-cutter approach for rolling out VDI rapidly. Previously I ran a set of single-server scalability tests to validate 50- and 100-user configurations based on the reference architecture, as well as a series of stress tests.

During this testing, I thought about how to extend performance measurements beyond metrics for the infrastructure components; I wanted to incorporate client/user data and capture the overall user experience. To that end, I developed a method of recording user experience (UX) data while running scalability tests. This blog shares my results and data samples from recent VDI-in-a-Box UX testing. If you’d like to understand the UX testing methodology I used, I’ve written a separate blog on that topic.

Capturing VDI-in-a-Box User Experience

As in previous scalability testing, I ran a series of single-server tests using the Login VSI Medium with Flash workload, testing user capacities of 75 and 100 users. In short, my UX test harness uses a dedicated launcher and a physical client system to create a baseline that acts as a “canary in the coal mine”; the baseline provides a comparison point for the remaining user workloads. The test rig captures metrics from the beginning of the test through the entire Login VSI run. In addition, I recorded videos to illustrate the user experience and corroborate the collected data.
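To make the capture step more concrete, here is a minimal sketch of how the per-run metric sampling could be scripted with Python and the standard Windows typeperf utility. The counter paths, sampling interval, and output file name are illustrative assumptions (the Citrix ICA Session counter in particular depends on which agent counters are installed on the host); this is a sketch of the approach, not the actual harness I used.

    # ux_capture.py - hedged sketch of metric capture during a Login VSI run
    import subprocess

    # Counters to sample on the system under test. The Processor and PhysicalDisk
    # paths are standard Windows counters; the ICA Session latency counter is an
    # assumption about the Citrix counter set available on the host.
    COUNTERS = [
        r"\Processor(_Total)\% Processor Time",
        r"\PhysicalDisk(_Total)\Avg. Disk Queue Length",
        r"\ICA Session(*)\Latency - Session Average",  # assumed counter name
    ]

    def capture(samples=720, interval_s=5, out_csv="ux_run.csv"):
        """Sample the counters every interval_s seconds for the whole run
        (the defaults cover roughly an hour) and write the results to a CSV."""
        cmd = ["typeperf", *COUNTERS,
               "-si", str(interval_s),   # sample interval in seconds
               "-sc", str(samples),      # number of samples to collect
               "-f", "CSV", "-o", out_csv, "-y"]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        capture()

Starting a single counter log like this just before the launcher kicks off the baseline session keeps every subsequent loop inside one continuous sample set, which makes later comparisons against the baseline straightforward.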

Results of the 100-User Test

Here’s a comparison between the baseline and VSI workload loops during the 100-user VDI-in-a-Box testing. The graphs below show data for the baseline and four consecutive Login VSI workload loops:

  • Baseline Test (1 User)
  • 100-User Test
      ◦ 1st VSI Loop (25 Users)
      ◦ 2nd VSI Loop (50 Users)
      ◦ 3rd VSI Loop (75 Users)
      ◦ 4th VSI Loop (100 Users, with 200 seconds of steady-state activity)

(To understand what constitutes a VSI loop, please see the test video, which is discussed below.)

In the charts above, you can compare the baseline Login VSI test loop to subsequent loops to gauge the user experience during both the login/ramp-up and steady-state phases of the test. The patterns match closely, with only a small amount of deviation between loops, indicating that the user experience in subsequent test loops closely approximates that of the baseline.

The charts below graph the average ICA round-trip time (ICA RTT) and frames-per-second (FPS) metrics for the one-user baseline and subsequent test loops. Again, the data indicates that the tested VDI-in-a-Box configuration maintains close-to-baseline levels of performance.
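For readers who want to reproduce this kind of per-loop summary, the sketch below shows one way the loop averages could be computed from the captured samples. The file name, column names, and loop boundary times are assumptions chosen for illustration; they are not the actual output format of my test rig.

    # loop_averages.py - hedged sketch of per-loop metric averaging
    import csv
    from statistics import mean

    # Assumed loop windows in seconds from the start of the test (illustrative only).
    LOOPS = {
        "baseline": (0, 900),
        "loop1 (25 users)": (900, 1800),
        "loop2 (50 users)": (1800, 2700),
        "loop3 (75 users)": (2700, 3600),
        "loop4 (100 users)": (3600, 4500),
    }

    def loop_averages(path, metric):
        """Average the named metric column over each loop window."""
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        averages = {}
        for name, (start, end) in LOOPS.items():
            values = [float(r[metric]) for r in rows
                      if r[metric] and start <= float(r["elapsed_s"]) < end]
            averages[name] = mean(values) if values else float("nan")
        return averages

    print(loop_averages("ux_run.csv", "ica_rtt_ms"))  # assumed column names
    print(loop_averages("ux_run.csv", "fps"))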

Supporting Test Video

During the test runs, I recorded videos to validate the collected UX metrics. I created this short composite video to compare the baseline VDI-in-a-Box user experience with the last section of the 100-user test run, when the server is under peak load. The video further corroborates the collected data, illustrating how the UX of the fourth loop closely approximates that of the recorded baseline.

The UX video and associated metrics form one leg of a three-legged stool that represents my overall VDI-in-a-Box scalability testing; Login VSI tests (which measure application response times) and PerfMon scripts (which track infrastructure performance data) constitute the other two legs. The UX testing serves to validate results I observed in the other scalability test runs. For example, during one phase of the 100-user test, I observed spikes in some storage metrics, including average disk queue length (as shown below). However, the video and UX metrics confirmed that the configuration maintained satisfactory usability even during these storage spikes.

Average disk queue length (100-user test with eight 10,000 RPM SAS drives)
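As an illustration of the kind of post-processing applied to the PerfMon storage data, the sketch below flags sustained runs of samples where the average disk queue length exceeds a rule-of-thumb threshold (often quoted as a couple of outstanding I/Os per spindle; the array here has eight drives). The file name, column name, and threshold are assumptions, not the values used in my scripts.

    # disk_queue_spikes.py - hedged sketch of spike detection in PerfMon data
    import csv

    SPINDLES = 8
    THRESHOLD = SPINDLES * 2  # illustrative rule of thumb, not a hard limit

    def spike_windows(path, column="avg_disk_queue_length"):
        """Yield (start_index, length) for runs of consecutive samples above THRESHOLD."""
        run_start, run_len = None, 0
        with open(path, newline="") as f:
            for i, row in enumerate(csv.DictReader(f)):
                if float(row[column]) > THRESHOLD:
                    run_start = i if run_start is None else run_start
                    run_len += 1
                elif run_start is not None:
                    yield run_start, run_len
                    run_start, run_len = None, 0
        if run_start is not None:
            yield run_start, run_len

    for start, length in spike_windows("perfmon_storage.csv"):
        print(f"spike: samples {start}-{start + length - 1}")

Flagging the spike windows this way makes it easy to line them up against the UX metrics and the video timeline and confirm, as in this test, that the spikes did not translate into a degraded user experience.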

Summary

The aim of this UX testing was to examine how the VDI-in-a-Box solution scaled from the vantage point of a user. Even with higher-than-expected storage metrics, the video provided a qualitative visual reference to complement the quantitative UX data, confirming consistent quality in the overall user experience.

References