Use MPC Video Decoder or ffdshow DXVA to leverage hardware acceleration and get the best possible picture quality. See the forum post Display calibration for more details.
If you want to get really adventurous, then buy yourself an EyeOne Colorimeter and calibrate like a pro, including adjusting your grayscale. See here for more details on that: Greyscale & Colour Calibration For Dummies
In my experience, the way the monitor is connected to the computer makes a great difference.
Right away, an HDMI connection seemed to deliver a very high-contrast, maybe over-contrasted picture with lots of black in it, while DVI delivered less contrast and color. Also, the two offered different sets of image controls on the monitor. For example, the HDMI input offers a color saturation control, while DVI does not. That might depend on the monitor too.
Also, HDMI never let me use the native panel resolution; I always needed to tweak the image. I don't know HDMI well enough to point out a reason for this, but it occurred to me on both monitors. The 400DX's native resolution of 1366x768 is available only when using DVI. The 32WL58's native resolution, which is supposed to be 1366x768, did not give a good enough picture, so I had to settle for 720p. In both cases, the image edges were cut off for some reason, so the desktop had to be resized to fit the panel, further reducing image resolution. I ended up with something like 1088x684 for the 32WL58 and 1824x1016 for the 400DX.
As you see, despite all the drawbacks, I settled for HDMI in both cases. The only alternative the 32WL58 offers is analog VGA, and its image quality was unimpressive. The 400DX does offer DVI, but lacks the capability for black point adjustment - both brightness and contrast controls seem to modify the white point. I don't know whether this is a bug or not, but that ruled out DVI instantly on that device, although I could have gotten the native resolution without undersampling.
So now let's get into the procedure. The ambient light should be low during screen adjustment. We will be going through the signal chain from back to front, starting with the monitor. These three controls will be needed: brightness, contrast (backlight) and gamma.
Some monitors offer a gamma control; mine doesn't, so I used ATI's gamma adjustment in Catalyst Control Center (CCC).
First, let's adjust the black point. To me, this is the most important value, as it largely determines the perceived image contrast. Set your Windows desktop to solid black and hide the task bar, then use the brightness control on your monitor to find the threshold where the image starts to turn dark grey (45 out of 100 on the 400DX). Set a value just below the threshold, and try to keep the image as "black" as possible. Watch the screen from a close distance while you do this.
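If you'd rather use a test image than a solid desktop background, you can generate one yourself. A minimal sketch in Python (standard library only; the PPM format is just a text header plus raw pixel bytes, and the file names here are my own choice):

```python
# Write a solid-gray PPM test image; open it full screen in an image
# viewer to find the brightness threshold where black turns dark gray.
def write_gray_ppm(path, width=1366, height=768, level=0):
    """level 0 = pure black; try 16 to preview video-level 'black' too."""
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))  # binary PPM header
        f.write(bytes([level, level, level]) * width * height)

write_gray_ppm("black.ppm")              # reference black (0,0,0)
write_gray_ppm("gray16.ppm", level=16)   # video "black" (16,16,16)
```

Any viewer that reads PPM (IrfanView, GIMP, most Linux viewers) can display the result full screen.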
Next, adjust the white point. Most LCDs do this by setting the backlight intensity, so set this value as high as possible. The 400DX can be very bright, so I used 70 out of 100.
Finally, adjust the grays by setting the gamma value. This is a highly subjective task; some people prefer higher contrast than others. Also, in CCC there are different settings for "desktop", which applies to windowed display, and "3d", which applies to fullscreen video and VMR9 exclusive mode. Be sure to set the two equally.
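For intuition about what the gamma slider does, here is a sketch of the underlying curve. This is my own illustration of the math, not CCC's actual code; a gamma adjustment maps each input level through a power function, leaving black and white fixed while shifting the grays in between:

```python
def gamma_lut(gamma=2.2):
    """Build a 256-entry lookup table applying a 1/gamma correction curve.
    gamma > 1.0 brightens the midtones; the endpoints 0 and 255 stay fixed."""
    return [round(255 * (v / 255) ** (1.0 / gamma)) for v in range(256)]

lut = gamma_lut(2.2)
# Endpoints are unchanged, so gamma only redistributes the grays:
print(lut[0], lut[128], lut[255])
```

This is why adjusting gamma changes perceived contrast without touching the black and white points you set in the previous two steps.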
I found this resource to be helpful:
Here you will find all sorts of test patterns. First, use "Einser" and display it full screen.
At this point, you have set up your monitor to display the RGB desktop color space correctly. Still, if you watch a video or TV, you see black appear as gray. Why is that?
Although RGB values range from 0 to 255 for each channel, video content uses a restricted range: levels start at 16 and run up to 235, which is an industry standard for video. So while black should be represented as (0,0,0) in RGB color space, it is actually delivered as (16,16,16), which is dark gray in our desktop color space. The way around this is to transform the video color space into our now calibrated RGB desktop color space.
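The transform itself is simple arithmetic: the 219 steps between 16 and 235 get stretched over the full 0-255 range. A minimal sketch (the function name is my own; this is the standard expansion a renderer performs when converting video levels to PC levels):

```python
def video_to_pc(level):
    """Expand a 16-235 video level to the full 0-255 PC range."""
    level = min(max(level, 16), 235)        # clamp out-of-range values
    return round((level - 16) * 255 / 219)  # stretch 219 steps to 255

print(video_to_pc(16))   # video black -> 0 (true black)
print(video_to_pc(235))  # video white -> 255 (full white)
```

When a decoder or renderer skips this step, you get exactly the washed-out gray "black" described above.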
For all those playing high definition content, you should definitely grab a copy of Digital Video Essentials: HD Basics and calibrate your TV/monitor using the test patterns and instructions on that disc. It is available in both BluRay and HD-DVD formats.
If you don't have an HD-DVD or BluRay drive, then you can always grab the AVS calibration disc files (mount the ISO image) from AVS HD 709 - Blu-ray & HD DVD Calibration Disks - AVS Forum.
If you're using an ATI card outputting to a TV that expects video levels (16-235), then you should make the following changes to your setup:
- Make sure you have the UseBT601CSC registry hack for SD content. See ATI HD Registry Tweaks for more details.
- Set your brightness/contrast in ATI Catalyst Control Center -> Color (not the one under Avivo Video settings) to +31/73 respectively. This will output everything at 16-235 levels instead of 0-255. Under Avivo Video settings -> Basic, make sure the "Use application settings" checkbox is enabled.
- If you're using PowerDVD Ultra for BluRays or HD-DVDs, the brightness/contrast settings in Catalyst Control Center have no effect, and PowerDVD Ultra will still output HD content at 0-255 PC levels. To get PowerDVD Ultra to output HD content at 16-235 (video levels), you have to adjust the brightness/contrast in PowerDVD itself. While playing a BluRay or HD-DVD, right-click and select Configuration, go to the Video tab, hit Advanced, then go to the Color tab and set brightness to +19 and contrast to -5. Now if you calibrate using DVE: HD Basics or the AVS disc in PowerDVD Ultra, your display's brightness/contrast settings will be consistent across all videos you play (even the ones in MediaPortal).
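For reference, what those +31/73 brightness/contrast values accomplish is roughly the inverse of the earlier expansion: compressing 0-255 PC levels into the 16-235 range the TV expects. A sketch of that compression (my own illustration of the math, not ATI's actual code):

```python
def pc_to_video(level):
    """Compress a 0-255 PC level into the 16-235 video range."""
    return round(16 + level * 219 / 255)  # squeeze 255 steps into 219

print(pc_to_video(0))    # PC black -> 16 (video black)
print(pc_to_video(255))  # PC white -> 235 (video white)
```

Once everything on the PC side is forced into this range, the TV's own video-level calibration applies consistently to desktop, SD and HD content alike.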