Monitor fed by VGA and DVI-D thru KVM
Tech Zero wrote:
> Reality folded in on itself, and somewhere the following words from "Jaz" appeared in history:
>
>> One last thing: when I mentioned "adapters" I was only referring to a connector format change from 15-pin D-sub to DVI-I. This simply lets me plug a VGA cable into a DVI-I port on the KVM switch. These adapters are everywhere -- one comes included with almost all DVI video cards.
>
> Just to add to Paul's comments, at the "bolts" side of things, as it were... The adapters that come with DVI-I cards have a male DVI-I connector with a female D-sub VGA connector at the other end, so to put it bluntly, the genders *don't match* there, mate.
>
> Simple rules:
> - Video cards have female connectors
> - Monitors have male connectors
> - Video cables only go in one direction
>
> This was originally done to differentiate serial ports from video ports back in the CGA days, since they used the same style of connectors. Video has evolved since then, but the gender differences remain. That said, there are DVI-A cables out there that may work; they have a male VGA connector at one end and a female DVI-A connector at the other... but I don't know if they have any issues...

Most HD15 VGA cables seem to be male-male. Some KVM switches are male (Tripp Lite), which requires an M-F cable, but... Anyway, I believe all video cables are functionally bidirectional -- if the gender doesn't match, then a gender-changer is in order.

I spoke with a very knowledgeable fellow at Lindy Computer Connection Technology Inc., specifically about their #41219 cable, and they said that what I'm trying to do should work, at least with their own KVM. They also said that the generic DVI-I-to-HD15 (VGA) adapters you typically use to convert a video card's DVI-I output to HD15 for a VGA monitor are bidirectional, and should 'do the right thing' on the KVM's output (testing with a VGA monitor -- without the splitter). I hope this is true, since I can imagine there could be lots of issues with various signals like you describe, such as monitor type/manufacturer, etc. Certainly that could be the culprit. Any takers?

Over the past week I emailed and phoned TRENDware to get the final word on whether this is a digital-only or analog/digital switch, and they maintain that it is indeed DVI-I analog/digital. I asked if it's possible that something was lost in translation with the Taiwanese manufacturer's specification sheet and that it may not actually be DVI-I but DVI-D instead. They said 'No'. Hmmm... I suppose that, aside from cabling problems, it's possible my switch is faulty and needs to be replaced.

Another problem I see is with the DVI source PC -- it starts up in analog mode if that port is not active at boot-up. Sheesh! Is there no end to these issues?! Might this be a shortcoming of the DVI standard? Shouldn't the KVM switch provide the needed signal to the DVI source so that it doesn't default to VGA mode? (And because this KVM doesn't pass analog, the monitor is blank.)

Well, perhaps the best thing to do at this point is to go all DVI-D/I. The reason I didn't in the first place is that two of the old PCs simply don't have the option of changing cards (built-in VGA), and they have needed stuff on them that I don't want to spend the effort moving to new systems.

--Jaz
(Please excuse the 'burp' when replying (b))
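(Aside, for anyone who wants to verify which link a card actually negotiated: a DVI source decides analog vs. digital from the EDID block the display -- or a KVM emulating one -- presents over DDC at power-up, which is exactly the signal a pass-through switch can fail to provide. Below is a minimal sketch, not from this thread, assuming a Linux box that exposes EDID blobs under /sys/class/drm; the connector paths it finds are examples, not anything specific to this KVM. In a standard EDID block, bit 7 of byte 20 says whether the display declared a digital input.)

    # Minimal sketch: report whether each connected output's EDID
    # declares a digital or analog input. Assumes Linux sysfs layout.
    import glob

    # Every valid EDID block starts with this fixed 8-byte header.
    EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    def link_type(edid: bytes) -> str:
        """Classify the link from EDID byte 20 (Video Input Definition)."""
        if len(edid) < 128 or edid[:8] != EDID_HEADER:
            return "invalid or missing EDID"
        # Bit 7 of byte 20 set => display declared a digital input.
        return "digital" if edid[20] & 0x80 else "analog"

    # e.g. /sys/class/drm/card0-DVI-I-1/edid (path names vary by system)
    for path in glob.glob("/sys/class/drm/card*-*/edid"):
        with open(path, "rb") as f:
            blob = f.read()
        print(f"{path}: {link_type(blob)}")

If the switch (or its pigtail cable) isn't passing EDID through, the blob will be empty and the card falls back to its default -- which matches the "starts up in analog mode" symptom described above.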