A computer components & hardware forum. HardwareBanter



help please. DVI - VGA adapters



 
 
#1 | purmar | December 17th 03, 12:46 AM
help please. DVI - VGA adapters

Hello,

I got a computer with an ATI Radeon 9200 video card (128 MB). I was going to
connect my two monitors, both with VGA plugs. At the time I did not know the
difference between DVI-D and DVI-I, so I thought I would simply buy a
DVI-VGA adapter for $5 and later, when I bought an LCD monitor, I would
simply unplug the adapter and be good to go.

Well, I have all the parts and the computer is working, but only on one monitor.
I got an adapter that connects DVI-I to VGA, but my video card has a DVI-D
connector! Please, somebody help me. I cannot find a DVI-D to VGA adapter
anywhere. What should I do? Does it even exist? Should I add another video
card? Replace the one I have?

Any ideas will be much appreciated.

Thanks

Purmar


#2 | Kent_Diego | December 17th 03, 04:35 PM

It's been my experience that the ATI cards with a DVI-D connector (like the
8500LE) did so because they only had one RAMDAC and could not support dual
monitors. All higher-end cards have DVI-I with analog output and support
dual monitors just fine.

You got screwed buying a low end card.

-Kent


#3 | Tony A. | December 17th 03, 05:39 PM

"purmar" wrote in message
k.net...
[...]
Well, I have all the parts and the computer is working, but only on one monitor.
I got an adapter that connects DVI-I to VGA, but my video card has a DVI-D
connector! Please, somebody help me. I cannot find a DVI-D to VGA adapter
anywhere. What should I do? Does it even exist? Should I add another video
card? Replace the one I have?


DVI-D = DVI-Digital. There is no analog signal in a DVI-D interface, so a
DVI-D to VGA adapter is not possible, I'm afraid. You can only use an LCD
with a digital input.
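Tony's rule of thumb can be put in a few lines of code. This is a hypothetical sketch, not from the thread: the connector names are the standard DVI variants, but the table and helper function are made up for illustration.

```python
# Which DVI connector variants carry the analog (VGA-compatible) signal,
# and hence work with a cheap passive DVI-VGA plug adapter.
DVI_VARIANTS = {
    "DVI-D": {"digital": True, "analog": False},   # digital only
    "DVI-A": {"digital": False, "analog": True},   # analog only
    "DVI-I": {"digital": True, "analog": True},    # integrated: both
}

def passive_vga_adapter_works(connector: str) -> bool:
    """A passive adapter only re-routes pins; it needs the analog signal."""
    return DVI_VARIANTS[connector]["analog"]

print(passive_vga_adapter_works("DVI-I"))  # True
print(passive_vga_adapter_works("DVI-D"))  # False
```

The point is that a $5 adapter contains no electronics at all; it simply wires the DVI-I analog pins to the VGA connector, which is why it does nothing on a DVI-D-only port.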

Tony


#4 | scott | December 17th 03, 05:46 PM

"Tony A." wrote in message
news
"purmar" wrote in message
k.net...
[...]

DVI-D = DVI-Digital. There is no analog signal in a DVI-D interface, so a
DVI-D to VGA adapter is not possible, I'm afraid. You can only use an LCD
with a digital input.


Which seems ironic as most LCD panels will convert the signal back to
analogue again anyway ;-)

You could, in theory, get a converter that consists of DACs to change it to
analogue. Not sure if that exists, but it would probably cost more than
the video card!

Scott


#5 | Tony A. | December 18th 03, 10:05 AM

"scott" wrote in message
...
[...]
DVI-D = DVI-Digital. There is no analog signal in a DVI-D interface, so a
DVI-D to VGA adapter is not possible, I'm afraid. You can only use an LCD
with a digital input.


Which seems ironic as most LCD panels will convert the signal back to
analogue again anyway ;-)

It's the other way round - LCDs are fundamentally digital, but most take an
analogue signal and convert it to digital, purely for compatibility with the
majority of video cards, which only have analogue output. DVI-D is very
sensible, getting rid of the whole digital-analogue-digital conversion; it's
a shame analogue VGA is so standard that it hasn't made much of an impact
yet.

Tony


#6 | scott | December 18th 03, 11:03 AM

"Tony A." wrote in message
...
"scott" wrote in message
...
[...]
DVI-D = DVI-Digital. There is no analog signal in a DVI-D interface, so a
DVI-D to VGA adapter is not possible, I'm afraid. You can only use an LCD
with a digital input.


Which seems ironic as most LCD panels will convert the signal back to
analogue again anyway ;-)

It's the other way round - LCDs are fundamentally digital, but most take an
analogue signal and convert it to digital, purely for compatibility with the
majority of vid cards which only have analogue output.


Believe me, the digital signal you give to your TFT LCD goes straight into a
DAC to produce an analogue voltage to drive the pixels at different
brightnesses. The analogue signal does not need to go through this DAC
(depending on design of the panel, some convert it to digital, then back to
analogue again to do the filtering and improve timing/sync).

DVI-D is very sensible, getting rid of the whole digital-analogue-digital
conversion; it's a shame analogue VGA is so standard that it hasn't made
much of an impact yet.


The point of DVI-D is that it gets rid of all the timing problems that you
normally need to set with monitors. Because the LCD panel has a digital
input, it can generate its own analogue signal that corresponds exactly to
each pixel, whereas with an analogue input it has to guess where each pixel
falls (in time) on the video signal. Hence you need to set the width, height
and offsets (mostly done automatically though) with analogue input, but
there is no need with digital.
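Scott's point about guessing where each pixel falls comes down to timing arithmetic. A quick back-of-envelope sketch; the numbers are the published VESA timing figures for 1024x768 at 60 Hz, not anything stated in the thread:

```python
# Why sampling pixels out of an analog VGA signal is fiddly:
# at 1024x768@60Hz the standard pixel clock is 65 MHz, so each pixel
# occupies only about 15 ns of the analog waveform. The totals include
# the horizontal and vertical blanking intervals (1344x806 per the
# VESA timing tables).
total_h, total_v = 1344, 806   # active 1024x768 plus blanking
refresh_hz = 60

pixel_clock_hz = total_h * total_v * refresh_hz
pixel_period_ns = 1e9 / pixel_clock_hz

print(f"pixel clock = {pixel_clock_hz / 1e6:.1f} MHz")   # 65.0 MHz
print(f"each pixel lasts {pixel_period_ns:.1f} ns")      # 15.4 ns
```

A sampling clock that is off by even a fraction of that 15 ns per pixel accumulates across a 1344-pixel line, which is exactly the width/offset adjustment an analog-input LCD has to make and a DVI-D input avoids entirely.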

DVI-D is better because the DAC is in the monitor, not the graphics card.
In most cases there is just one digital to analogue conversion, and that
either happens in your graphics card or in the monitor.

Scott


#7 | Tony A. | December 18th 03, 04:04 PM

"scott" wrote in message
...
[...]

Believe me, the digital signal you give to your TFT LCD goes straight into a
DAC to produce an analogue voltage to drive the pixels at different
brightnesses.


You're kind of right, but it doesn't quite go "straight into a DAC"; that's
somewhat simplistic, see below.


The analogue signal does not need to go through this DAC (depending on
design of the panel, some convert it to digital, then back to analogue
again to do the filtering and improve timing/sync).


It's not quite as simple as just applying a voltage to the pixel
proportional to the brightness you want. For example, liquid crystal
deteriorates due to DC stress, so each pixel needs to be alternately driven
positively then negatively on successive writes to minimise the net DC
stress. Your analogue VGA signal doesn't do that; that's one reason you
can't just apply your analogue input, or a simple derivation of it, directly
to the LCD pixels.
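The polarity inversion Tony describes can be sketched as a toy model. The voltage scale and grey-level mapping here are invented for illustration, and real panels typically use finer-grained line or dot inversion rather than whole-frame inversion:

```python
# Toy model of polarity inversion: to avoid net DC stress on the liquid
# crystal, each pixel is driven with alternating polarity on successive
# frames, so the long-term average (DC) voltage across the cell is zero.
def drive_voltage(grey_level: int, frame: int, v_max: float = 5.0) -> float:
    """Signed drive voltage for one pixel on a given frame."""
    v = v_max * grey_level / 255          # magnitude from the grey level
    return v if frame % 2 == 0 else -v    # flip polarity every frame

# Over any two consecutive frames the DC component cancels:
avg = (drive_voltage(200, 0) + drive_voltage(200, 1)) / 2
print(avg)  # 0.0
```

This is also why a raw analog VGA level can't be wired to the pixel electrodes directly: the VGA signal is always the same polarity, so something in the panel has to regenerate it with alternating sign.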

For that and other reasons, I'd be willing to bet that there aren't any
bare LCD panels that have analogue pixel voltage inputs, though as ever
ICBR.

Tony


#8 | scott | December 18th 03, 04:33 PM

"Tony A." wrote in message
...
"scott" wrote in message
...
[...]

Believe me, the digital signal you give to your TFT LCD goes straight into a
DAC to produce an analogue voltage to drive the pixels at different


You're kind of right, but it doesn't quite go "straight into a DAC"; that's
somewhat simplistic, see below.


OK, so it goes through a look-up table first, but then pretty much straight
into the DAC ;-)
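Scott's "look-up table first, then into the DAC" path can be sketched as a toy model. The gamma value and reference voltage below are illustrative assumptions, not figures from any panel datasheet:

```python
# Toy model of the LUT-then-DAC pixel path: the 8-bit input code is
# first remapped through a (here, gamma-shaped) look-up table, and the
# corrected code then sets the output voltage of an ideal 8-bit DAC.
GAMMA = 2.2    # illustrative response-correction curve
V_REF = 1.0    # illustrative full-scale DAC voltage

LUT = [round(255 * (code / 255) ** GAMMA) for code in range(256)]

def dac_out(code: int) -> float:
    """LUT correction followed by an ideal 8-bit DAC."""
    return V_REF * LUT[code] / 255

print(dac_out(255))  # full scale: 1.0
print(dac_out(0))    # 0.0
```

Doing the correction digitally in the LUT is cheap and exact, which is one reason (as the posts below note) it's easier to apply brightness/contrast adjustments before the DAC rather than on the analog side.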

brightnesses. The analogue signal does not need to go through this DAC
(depending on design of the panel, some convert it to digital, then back to
analogue again to do the filtering and improve timing/sync).


It's not quite as simple as just applying a voltage to the pixel
proportional to the brightness you want. For example liquid crystal
deteriorates due to dc stress, so each pixel needs to be alternately driven
positively then negatively on successive writes to minimise the net dc
stress.


Normally the voltage on the other side of the LC (opposite the signal
electrode) is altered each frame to help with that.

For that and other reasons, I'd be willing to bet that there aren't any
bare LCD panels that have analogue pixel voltage inputs, though as ever
ICBR.


Yeah, I see what you mean; I guess it's probably easier to alter the signal
digitally for brightness and contrast anyway!

Scott


#9 | jasmith | December 21st 03, 09:01 PM

Can't help butting in here, but this card is a good mid-level card and I
have two of them. Both of mine came with an adapter for a D-sub connection,
and it gives VGA output to be used with an analog LCD monitor or a digital
LCD. Why could you not hook it to an analog monitor?
"purmar" wrote in message
k.net...
[...]




#10 | Gary Tait | December 22nd 03, 02:55 PM

On Sun, 21 Dec 2003 21:01:52 GMT, "jasmith"
wrote:

Can't help butting in here, but this card is a good mid-level card and I
have two of them. Both of mine came with an adapter for a D-sub connection,
and it gives VGA output to be used with an analog LCD monitor or a digital
LCD. Why could you not hook it to an analog monitor?



Because there are two DVI standards. One concurrently outputs an analog
signal on the DVI port (which that little adaptor hooks up to); the other
does not, requiring an external DAC box to use an analog monitor. The
former uses a simple plug adaptor. The OP apparently has the latter type.


"purmar" wrote in message
nk.net...
[...]




 



