Old June 1st 07, 09:17 PM posted to alt.comp.hardware
Paul
MAX resolution for a VGA connector

wrote:
>> The VGA connector has no limit. I have a CRT monitor that
>> will do 2048x1536, so no worries there.
>
> How does this relate to the fact that some 30'' LCD monitors
> require a dual connector in a higher end graphics card in
> order to run?

There are a couple of ways to interface to a monitor.

The analog way is via the 15-pin VGA connector. The
RGB signals on the VGA carry the three primary colors
to the screen.

As the resolution increases on the VGA interface, the
analog "bandwidth" requirement of the components has to
improve to keep up. Many video cards have a 400MHz DAC on
the VGA connector, which appears to be suitable for about
2048x1536. (At least that is a typical maximum number
listed.) The cabling used also has a bandwidth spec, which
is why, if you use a long cable, the maximum resolution
that works well will drop. Even the connectors play a part:
if you wanted a really high quality interface, you'd want a
set of coax cables instead of the VGA connector, because
the VGA connector is not a perfect transmission line
environment. So while VGA is theoretically unlimited, the
effects of PI filters, DAC bandwidth, cable bandwidth, and
connector quality all take their toll on image quality.
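
To put rough numbers on that, here is a quick pixel-clock
estimate (a sketch in Python; the 40% blanking overhead is
an assumed, GTF-like figure, not a spec value):

    # Rough pixel clock needed for an analog mode, blanking included.
    # The 40% blanking overhead is an assumption, typical of GTF timings.
    def pixel_clock_mhz(width, height, refresh_hz, overhead=0.40):
        active = width * height * refresh_hz   # visible pixels per second
        return active * (1 + overhead) / 1e6   # MHz, with blanking

    for w, h, hz in [(1280, 1024, 85), (1600, 1200, 85), (2048, 1536, 75)]:
        print("%dx%d @ %dHz needs ~%.0f MHz"
              % (w, h, hz, pixel_clock_mhz(w, h, hz)))

That prints roughly 156, 229 and 330 MHz, so all three modes
fit under a 400MHz DAC on paper, but the margin shrinks fast
once the cable and filter losses above start to pile on.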

You might not find the visual quality of a 2048x1536 analog
connection to be that good. One compromise is on the video
card itself: the designers use "PI filters" to remove
EMI from the video signals. Doing so degrades the
sharpness of the display, and this becomes more obvious
at higher resolutions. While 1280x1024 may look good, there
are no guarantees as you go higher.

So while the video card maker may claim 2048x1536 resolution
is possible over a VGA connection, the resultant image may
not be usable.

A second alternative is a digital connection via a DVI
connector. Here, all that matters is that the digital data
move across the cable without the bit values being
corrupted. As far as I know, DVI has no error correction,
so if some of the bits arrive flipped, you get colored
"snow" on the screen. That tells you that the DVI cable is
poor quality, or that the digital bandwidth of the output
interface is not sufficient for the job. There are articles
where various brands of video cards have their DVI digital
output checked; the test used is to look at the "eye
opening" of the signal. That measurement tells you whether
the receiver will be able to pick up the digital signal OK.
Good and bad "eyes" are shown in pictures here; the
six-sided blue area in the picture is the zone to stay
out of.

http://www.siliconimage.com/docs/Sit...-CTC_FINAL.pdf
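
To get a feel for why even rare bit errors are visible, here
is a toy estimate (Python; the one-in-a-billion error rate
is just an assumed number for illustration):

    # With no error correction, every flipped bit lands in some
    # pixel's color value and shows up on screen as a "snow" sparkle.
    w, h, hz = 1600, 1200, 60
    bits_per_sec   = w * h * 24 * hz   # 24-bit color payload per second
    bit_error_rate = 1e-9              # assumed link error rate
    print("~%.1f sparkles per second" % (bits_per_sec * bit_error_rate))

Even at one error per billion bits, that works out to a few
sparkles per second, so a marginal link shows up quickly.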

In terms of DVI resolution support, there are single and
dual link DVI connections. A dual link DVI carries twice the
data of a single link, by virtue of using more of the pins.

DVI has a "clock" spec of maximum around 165MHz. During one
clock period, something like 10 data bits are passed on the
cable. Which means the data rate on the cable, is something
like 1650Mbit/sec. This is a pretty high rate, and is why
the cable quality or length can make a difference.
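
The arithmetic behind those numbers, as a sketch (TMDS uses
three data pairs, one per color component, and the 8-to-10
bit expansion is part of its encoding):

    # Single-link DVI/TMDS data-rate arithmetic.
    clock_mhz      = 165   # maximum single-link TMDS clock
    bits_per_clock = 10    # 8 data bits expand to 10 on the wire
    pairs          = 3     # one TMDS pair each for R, G and B

    per_pair = clock_mhz * bits_per_clock   # 1650 Mbit/sec per pair
    total    = per_pair * pairs             # 4950 Mbit/sec on the cable
    payload  = clock_mhz * 8 * pairs        # 3960 Mbit/sec of pixel data
    print(per_pair, total, payload)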

There are some entries in a table on this page showing
possible resolution settings and the data rates involved.

http://en.wikipedia.org/wiki/Dvi

A dual link DVI card (you have to check the spec to see if
it is so equipped) can do 2048x1536 @ 75Hz. But to do it,
each link's "clock rate" is 170MHz, which demands a good
quality DVI output device on the video card. Also, a cheap
or longish cable could cause an attempt to run this way to
show "snow" on the screen.

QXGA (2048 × 1536) @ 75 Hz with GTF blanking (2×170 MHz)
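
You can sanity-check that entry with the same arithmetic (a
sketch; the 44% blanking figure is back-solved from the
2x170MHz number, so it is illustrative rather than
authoritative):

    # Dual link splits the pixel stream across two sets of TMDS
    # pairs, so each link carries half of the total pixel clock.
    w, h, hz = 2048, 1536, 75
    blanking = 0.44                # assumed GTF overhead for this mode
    total_mhz = w * h * hz * (1 + blanking) / 1e6   # ~340 MHz total
    per_link  = total_mhz / 2                       # ~170 MHz per link
    print("total %.0f MHz, %.0f MHz per link" % (total_mhz, per_link))

At 170MHz per link, the mode is above the 165MHz single
link limit, which is why it is out of reach for a single
link card or cable.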

But chances are, if you compare transmitting 2048x1536 over
analog versus via DVI, the DVI will win on visual quality.
To make the analog work, you'd probably have to take your
soldering iron and remove the PI filters from the RGB signal
path, in order to get a clean signal from the VGA connector.
The DVI should be sharper, as there is no analog bandwidth
degradation along the way. As long as a bit is received
digitally, without corruption, the monitor has exactly the
same info content as was present at the output connector of
the video card.

Paul