Make sure to also check out this blog post about a very handy tool named Remote Display Analyzer.
In this blog post I want to talk about Adaptive Display. This new HDX feature is now available in both XenDesktop 5.5/5.6 and XenApp 6.5 (through a hotfix). Adaptive Display is the successor of the highly successful Progressive Display SpeedScreen technology, and it is switched on by default. It's an awesome technology because it automatically adapts to changes in the available bandwidth.
There is not much information in the Citrix eDocs about fine-tuning Adaptive Display. This is mainly because it is auto-tuning; in various blog posts Citrix says the following:
“Progressive Display requires creating complex policy configurations to get it right making it a hard to use feature. Adaptive Display eliminates the need for such complex configurations and provides a fantastic out-of-the-box experience, making it zero configurations for Citrix Administrators”
OK, that's fine by me: we no longer have to create complex policies for LAN and WAN use cases, because Adaptive Display detects the available bandwidth and adjusts accordingly. Super!
But what is exactly going on inside the Thinwire channel?
Before I go further, let's summarize the default settings that are adjustable within Adaptive Display:
| Adaptive Display Setting | Default |
| --- | --- |
| Max Frames Per Second | 24 |
| Target Minimum Frame Rate | 10 |
| Minimum Image Quality | Normal |
| Moving Image Compression | On (enables or disables Adaptive Display) |
| Extra Color Compression | Off by default; enabled when bandwidth drops below 8192 KBps |
| Heavyweight Compression | Off by default |
| Lossy Compression Level | Medium by default; the default threshold is unlimited |
* Note: At this time, not all Adaptive Display policies can be configured using the XenApp 6.5 AppCenter console. Use Windows Group Policy Editor (gpedit.msc) instead.
OK, so now we know which settings are in the game of Adaptive Display, but how do these settings come together? To make this clearer, I made some drawings with explanations.
Let's begin with the Extra Color Compression setting. Color Compression takes advantage of the fact that the human eye is less sensitive to color information (chroma) than to luminance (luma). When images are encoded with less color information, the bandwidth savings are huge, yet the human eye still sees a very satisfactory picture. Most of today's digital cameras use this technique to save on storage space. Extra Color Compression is turned on by Adaptive Display when bandwidth drops below the default threshold of 8192 KBps. Let's picture this default behavior:
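As an illustrative sketch (my own simplification, not Citrix's actual implementation), the threshold behavior can be modeled as a simple comparison against the default 8192 KBps value:

```python
# Illustrative model of the Extra Color Compression decision: the feature
# is enabled once measured session bandwidth drops below the default
# threshold of 8192 KBps. This is a sketch, not Citrix's actual code.

EXTRA_COLOR_COMPRESSION_THRESHOLD_KBPS = 8192  # default threshold

def extra_color_compression_enabled(bandwidth_kbps: float) -> bool:
    """Return True when bandwidth is below the threshold, so chroma
    information is compressed more aggressively to save bandwidth."""
    return bandwidth_kbps < EXTRA_COLOR_COMPRESSION_THRESHOLD_KBPS

# On a fast LAN the feature stays off; on a constrained WAN it kicks in.
print(extra_color_compression_enabled(100_000))  # fast LAN -> False
print(extra_color_compression_enabled(2_048))    # slow WAN -> True
```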
Moving on to the Minimum Image Quality setting: this sets the JPEG quality floor.
In other words, it is the minimum acceptable JPEG quality. The following minimum quality levels can be set:
| Minimum Image Quality | JPEG Quality Floor |
| --- | --- |
| Ultra High | 80 (highest image quality, lowest compression) |
| Low | 15 (lowest image quality, highest compression) |
The Lossy Compression level sets the starting JPEG quality. Adaptive Display adjusts the JPEG quality between this starting point and the Minimum Image Quality, based on the available bandwidth, to try to keep the frame rate from decreasing. The default starting JPEG quality is 55 (Medium). Let's picture this combination:
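To make the interaction concrete, here is a rough sketch of that adjustment. The linear interpolation is my own simplification (Citrix does not publish the exact algorithm), and the floor of 15 is an example using the Low setting rather than the Normal default:

```python
# Sketch of Adaptive Display's quality adjustment: start at the Lossy
# Compression level's starting quality (Medium = 55) and reduce it as
# bandwidth gets scarce, but never below the Minimum Image Quality floor.
# The linear scaling below is an assumption for illustration only.

STARTING_QUALITY = 55   # Lossy Compression level: Medium (default)
MINIMUM_QUALITY = 15    # example floor: Minimum Image Quality set to Low

def adjusted_jpeg_quality(bandwidth_kbps: float,
                          bandwidth_needed_kbps: float) -> int:
    """Scale quality down toward the floor when available bandwidth
    falls short of what the current frame rate would need."""
    if bandwidth_kbps >= bandwidth_needed_kbps:
        return STARTING_QUALITY          # enough bandwidth: stay at the start
    ratio = bandwidth_kbps / bandwidth_needed_kbps
    quality = MINIMUM_QUALITY + (STARTING_QUALITY - MINIMUM_QUALITY) * ratio
    return max(MINIMUM_QUALITY, round(quality))

print(adjusted_jpeg_quality(10_000, 5_000))  # plenty of bandwidth -> 55
print(adjusted_jpeg_quality(2_500, 5_000))   # half the bandwidth -> 35
```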
Notice that the default Lossy Compression level is set to Medium and the threshold to enable Lossy compression is set to 2147483647 KBps (unlimited), which means that this setting is always on.
The following Lossy Compression levels can be configured:

| Lossy Compression Level | Starting JPEG Quality |
| --- | --- |
| None | 80 |
| Medium (default) | 55 |
OK, now we know what the default settings are and how the frame rate and compression are dynamically adjusted by Adaptive Display. So what about these default settings: should we change them or leave them alone?
As usual it depends on the use case 🙂 but read on….
There is a great tool from Citrix called HDX Monitor, which lets you see all the HDX aspects of an active ICA session. If you start HDX Monitor (with the default settings in place) you will see the following screen:
OK, looks good, but what's that big red cross? Let's find out:
Error: Image compression is not tuned to the available bandwidth. An Administrator can improve the user experience by creating a policy that optimizes image compression.
Ok so it looks like the HDX monitor engineering team is not happy with the Out-of-the-Box experience settings from the HDX Adaptive Display engineering team 🙂
I think the HDX Monitor engineering team is right, because if we connect over a fast LAN connection the default Medium compression is used, and the Windows flag background and other images look like this:
This is not the best experience you can get under LAN conditions.
Why did Citrix choose these default out-of-the-box settings? I think it is a combination of the following three points:
1: User Experience
2: Server Scalability
3: Bandwidth Scalability
The default settings also improve performance on the LAN when viewing high-resolution photos, and if you enabled Progressive Display in the past, your users might already be used to this compression level.
But we can consider improving the user experience for LAN scenarios by lowering the Lossy Compression level or turning it off. This can be done in three ways:
1: Configure a Lossy Compression maximum threshold
The default threshold for Lossy Compression is set to unlimited, so by default Medium compression is always used. We can change the maximum threshold by giving it a value in KBps; above that threshold Lossy Compression will be turned off. It looks like this:
As you can see, Lossy Compression is turned off once the maximum threshold is reached. For example, you can set this threshold at 75% of your LAN speed. The side effect is that you have no Lossy Compression at all when the bandwidth is above the maximum threshold, which can negatively impact your environment if a lot of LAN users view high-resolution photos.
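The decision this scenario configures can be sketched as follows. The 1 Gbps LAN figure and the resulting threshold are example values of my own, not recommendations:

```python
# Sketch of scenario 1 (assumed behavior): above the configured maximum
# threshold, Lossy Compression is turned off entirely; below it, the
# configured level (Medium here) still applies.

def lossy_compression_level(bandwidth_kbps: float,
                            max_threshold_kbps: float) -> str:
    """Return the Lossy Compression level in effect for the session."""
    return "Off" if bandwidth_kbps > max_threshold_kbps else "Medium"

# Example: a 1 Gbps LAN is roughly 125,000 KBps; take 75% as the threshold.
lan_speed_kbps = 125_000
threshold = 0.75 * lan_speed_kbps  # 93,750 KBps

print(lossy_compression_level(120_000, threshold))  # above -> Off
print(lossy_compression_level(50_000, threshold))   # below -> Medium
```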
2: Set the Default Lossy Compression to Low
If we want to improve the user experience on the LAN, we can also lower the Lossy Compression level to the lowest setting. It looks like this:
Keep in mind that Adaptive Display will try to maintain this starting JPEG quality for your WAN users as well.
3: Configure different Adaptive Display policies and filter them on IP address
This one is similar to what we configured for Progressive Display in the past.
Make a policy that applies a low level of Lossy Compression for your LAN users, filter it on internal IP ranges, and give it a higher priority than your default policy.
You can leave the default policy (which applies to all users) as it is, or change it to use higher compression levels. In this scenario we only give the LAN users a higher starting JPEG quality.
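Conceptually, this is what the IP-filtered policy selection does. The networks and level names below are example values; in practice the Citrix policy engine evaluates the filters, not your own code:

```python
# Conceptual sketch of scenario 3: choose the Adaptive Display policy by
# client IP, the way a higher-priority, IP-filtered Citrix policy would
# win over the default. Networks and levels are example values.
import ipaddress

LAN_RANGES = [ipaddress.ip_network("10.0.0.0/8"),
              ipaddress.ip_network("192.168.0.0/16")]

def lossy_level_for_client(client_ip: str) -> str:
    """LAN clients get the low-compression policy (higher starting JPEG
    quality); everyone else falls through to the default policy."""
    addr = ipaddress.ip_address(client_ip)
    if any(addr in net for net in LAN_RANGES):
        return "Low"      # higher-priority, IP-filtered policy
    return "Medium"       # default policy for all other users

print(lossy_level_for_client("192.168.1.20"))  # LAN client -> Low
print(lossy_level_for_client("8.8.8.8"))       # WAN client -> Medium
```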
Do we need to fine-tune Adaptive Display?
I think we need to take this into consideration depending on the use case. For example:
– Do your users need to see lossless images on the LAN?
– Is your environment (Network, Servers, Client Devices) fast and scalable enough?
I also think the default out-of-the-box configuration is fine for most environments, but as you can see there are ways to change the default behavior of Adaptive Display slightly to fit your needs.
You can do this by changing the compression levels, telling Adaptive Display what the starting JPEG quality and the minimum acceptable JPEG quality should be.
What do you think? Please leave a comment on your thoughts.
Please note that the information in this blog is provided as is, without warranty of any kind. It is a mix of my own research and information from the following sources:
– Citrix eDocs, "Configuring Adaptive Display" (contains wrong information about the default Lossy Compression level values)
– Citrix Blog “Dynamic Color Compression”
– Citrix Blog “Introducing Adaptive Display”
17 thoughts on “Adaptive Display, what’s in the game? And do we need to fine-tune?”
Great article Bram!
Wow! Thank you for the detailed article. This is the stuff Citrix should have in their eDocs, very well done!
Thanks for the feedback Dolph!
Question: in scenario 1 (Custom Lossy Compression Threshold), you show two starting JPEG qualities. I don't see where each of those two starting points (based on bandwidth) is set. I.e., the Lossy Compression level is "55", but where do you set the "80"?
80 is the default when no lossy compression is applied.
Okay….so is that settable? Or a hard coded thing? If settable, which setting is that (the 80)?
No, that's not settable; the values are predefined. You can find all the values in the tables above in the post.
Nevermind….the light just came on. Excellent article!
Hello Bram, we have a Citrix XenApp 6.5 farm with approx. 2000 Dell T10 thin clients running thin OS. The video performance is not very good and I would like to improve the situation. Your article was very helpful for me in understanding what's involved with Citrix video/image settings. I believe our thin clients cannot handle the default FPS setting of 24. I'm thinking that maybe I need to adjust the progressive compression level from low to high and also lower the default FPS. What do you think? Appreciate your input.
Changing the FPS can help. I would also try to fine-tune the HDX settings in the wnos.ini and disable the automatic network detection feature; I reckon this feature can slow down the overall session.
Changing the compression levels is also a possible fine-tune, but be aware that this will directly influence the user experience.
I have seen the T10 used as a basic thin client for normal day-to-day office work, but don't expect too much from it.
Bram, thanks for your reply. What are your thoughts on session reliability? Would you recommend having it on or off? Thanks again.