As I see it, the "wide gamut LCDs are a bad idea" part of Karl's post
seems very well founded. To summarize:
1) Current DVI interfaces only offer an 8-bit datapath per stream, and
the current dual-link implementations are only geared to increasing
resolution, not bit depth. I don't know if driving LCDs via analog
(which can carry any bit depth) is a good idea; surely there are many
issues with it.
2) It doesn't matter how many bits you've got in the video card or in
the monitor's internals: the effective bit depth is that of the weakest
link in the chain, in this case the 8-bit DVI connection.
3) A wide gamut driven at 8 bits gives you coarse steps. This is akin
to the debate over 8-bit Lab: if step #100 is 5 dE away from step #101,
making a subtle 2 dE adjustment there is simply not possible.
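For anyone who wants to put rough numbers on point 3, here's a quick
Python sketch (mine, not from Karl's post) that estimates the CIE76
delta-E between two adjacent 8-bit code values on the green channel,
for sRGB primaries versus Adobe RGB primaries as a stand-in for a
wide-gamut panel. The matrices are the standard D65 ones; the flat 2.2
gamma and the choice of code values are simplifying assumptions:

```python
import math

D65 = (0.95047, 1.00000, 1.08883)  # reference white

# Standard D65 RGB-to-XYZ matrices for sRGB and Adobe RGB (1998).
SRGB_M = [(0.4124564, 0.3575761, 0.1804375),
          (0.2126729, 0.7151522, 0.0721750),
          (0.0193339, 0.1191920, 0.9503041)]

ADOBE_M = [(0.5767309, 0.1855540, 0.1881852),
           (0.2973769, 0.6273491, 0.0752741),
           (0.0270343, 0.0706872, 0.9911085)]

def rgb_to_lab(code_rgb, matrix, gamma=2.2):
    """8-bit device RGB -> CIE Lab, via a flat gamma and a 3x3 matrix."""
    lin = [(c / 255.0) ** gamma for c in code_rgb]
    xyz = [sum(m * c for m, c in zip(row, lin)) for row in matrix]
    def f(t):  # CIE Lab forward function
        return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = (f(v / n) for v, n in zip(xyz, D65))
    return (116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz))

def de76(a, b):
    """Plain Euclidean delta-E (CIE76) between two Lab values."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

for name, m in (("sRGB", SRGB_M), ("Adobe RGB", ADOBE_M)):
    step = de76(rgb_to_lab((0, 128, 0), m), rgb_to_lab((0, 129, 0), m))
    print(f"{name}: dE76 between green codes 128 and 129 = {step:.2f}")
```

With these assumptions, the per-step dE on the saturated green axis
comes out larger for the wider gamut, which is exactly the coarseness
problem: the same 256 codes have to span more colorimetric distance.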
On the other hand, what are those wide-gamut images destined for? If
they are destined for print, the best approach would be to pull the
monitor's gamut back to the print gamut using the display's internal
higher-bit LUTs. That way, your 8-bit DVI signal would be used to
maximum effect with respect to your final destination. But, as always,
the mismatched shapes of CMYK gamuts relative to RGB gamuts make this
difficult.
If destined for a wide-gamut printer, then I guess the ideal would be
to match the display's tone response curve to the printer's, be it via
16-bit calibration of the printer, via the display's internal
higher-bit LUTs, or both.
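As a back-of-the-envelope illustration of that kind of tone-curve
matching, here's a sketch that builds a 1-D correction LUT at 10-bit
internal precision, the way a display's higher-bit LUT would. The
gamma values are purely illustrative stand-ins (2.2 for the native
panel, 1.8 for a hypothetical print-matched target), not measured
curves:

```python
NATIVE_GAMMA = 2.2   # assumed native panel response (illustrative)
TARGET_GAMMA = 1.8   # assumed print-matched target response (illustrative)

def build_lut(entries=1024, out_levels=1023):
    """Build a 10-bit 1-D LUT remapping the native curve to the target."""
    lut = []
    for i in range(entries):
        x = i / (entries - 1)          # normalized input signal
        target = x ** TARGET_GAMMA     # luminance we want on screen
        # Invert the native response to find the drive level producing it.
        drive = target ** (1.0 / NATIVE_GAMMA)
        lut.append(round(drive * out_levels))
    return lut

lut = build_lut()
# Feeding LUT output through the native gamma reproduces the target:
#   (lut[i] / 1023) ** 2.2  ~=  (i / 1023) ** 1.8
```

The point of doing this at 10 (or more) bits inside the monitor, rather
than in the video card's 8-bit output, is that the rounding error per
entry stays well below one 8-bit step.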
Maybe we could apply GRACoL G7's tone response calibration concept to
displays too? :)