Thanks for this. For a bit of a novice, can you elaborate on a couple
of points? When you say the "frame buffer", how does this relate to the
front-side (PCI) bus? I thought that the latter is a real bottleneck and
that current OSs - well, Mac OS X Tiger at least - are >8 bit capable. Most
expect the next generation of Mac desktops to adopt at least PCI Express x16,
freeing up this bottleneck substantially.
Re DVI: anything I've read on capacity gets confusing quickly, so it would
be great if you could add a little clarity. For example, I understood DVI
and HDMI to have the same video bandwidth. hdmi.org claims that HDTV (with 8
channel 192 kHz 24 bit audio alongside the video) uses less than half of
HDMI's 5 Gbps bandwidth, and hence that there is plenty of headroom for
future upgrades. Various sources mention a single-link DVI bandwidth of a
"maximum of 165 MHz (1920x1080 at 60 Hz, 1280x1024 at 85 Hz)" without
mentioning bit depth. ("Dual-link DVI supports 2x165 MHz (2048x1536 at 60
Hz, 1920x1080 at 85 Hz)", again without mentioning bit depth.) It would seem
that, with consumer video electronics (HDTV and the like) embedding HDMI,
and with computer and display manufacturers grappling for ways to get past
the 8 bit colour issue, the two are heading in different directions again.
The new UDI is said to be compatible with HDMI, but are there going to be
bit depth bottlenecks due to bandwidth constraints? I guess my questions are
predicated on whether HDMI in fact has the same video bandwidth as DVI, or
whether it is already substantially better. Is UDI intended to address 16
bit per channel colour or better? How long do we have to wait for the
display path to catch up with our editing workflow?
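A rough back-of-the-envelope sketch of my understanding (the ~20% blanking overhead below is an assumption; real CVT/GTF timings vary by mode): single-link DVI's 165 MHz pixel clock drives 3 TMDS channels of 10 bit symbols (8b/10b coding), which works out to the same 4.95 Gbps raw figure that HDMI rounds up to "5 Gbps", suggesting the two links do carry the same video bandwidth:

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead=1.2):
    # Rough pixel-clock estimate; the ~20% blanking overhead is an
    # assumption (real CVT/GTF timings vary by mode).
    return h_active * v_active * refresh_hz * blanking_overhead / 1e6

# 1920x1080 at 60 Hz needs roughly:
clk = pixel_clock_mhz(1920, 1080, 60)
print(f"pixel clock ~ {clk:.0f} MHz")      # ~149 MHz, under the 165 MHz limit

# Raw single-link TMDS capacity: 3 data channels, 10 bit symbols
# (8b/10b coding), at the 165 MHz maximum clock:
tmds_gbps = 165e6 * 3 * 10 / 1e9
print(f"raw TMDS ~ {tmds_gbps:.2f} Gbps")  # 4.95 Gbps, i.e. HDMI's "5 Gbps"
```

If that arithmetic is right, it would also explain why the quoted mode lists never mention bit depth: on a single link the pixel format is fixed at 8 bits per channel regardless.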
Your thoughts and knowledge would be greatly appreciated.
Thanks in advance
> From: William Hollingworth <will@airm...>
> Date: Sun, 08 Jan 2006 00:14:16 -0600
> To: <colorsync-users@list...>
> Subject: Re: On SpectraView 2180WG LCD
> While the Matrox board has 10 bit DACs (as do ATIs), this is of course only
> useful for older analog monitors. Modern digital DVI monitors are not able
> to take advantage of this.
> Additionally, the data going into the 10 bit DACs is still 8 bit data from
> the frame buffer, but going through an 8 bit in x 10 bit out LUT. While a
> 10 bit LUT on the video card is a great step forward, it is useless on
> today's digital monitors, since single-link DVI is an 8 bit bottleneck.
> The ATI Avivo chipset mentioned, which features 10 and 16 bit output, is
> currently only really useful for motion video, since the video stream
> can be processed (gamma, de-interlacing, color conversion, scaling, etc.)
> within the card at higher than 8 bit depth.
> However, until the core OSs and applications are updated to support frame
> buffers of >8 bit depth, this is not of much use to those of us using
> Photoshop etc.
> FYI - There is a tech paper on the LCD2180WG-LED that explains some of the
> issues with color bit depths as related to the increased gamut size of the
> LED display, and why it is not quite such an issue as one individual has
> suggested it to be. Certainly none of the color professionals worldwide who
> have been involved with the display throughout its development have raised
> it as a concern.
> http://www.necdisplay.com/products/LCD2180WGLED_Techpaper.htm
> Will Hollingworth
> Manager of OEM Product Design & Development Engineering
> NEC Display Solutions of America, Inc.