View Full Version : DVI Splitter Cable won't fit on my graphics card...


12-03-2005, 09:10 PM
Hi,

I just bought a "Molex DVI Splitter Cable" to plug into my graphics card, so I could then plug the 2 DVI cables for my 2 monitors into the splitter...

The DVI cables for both of my Dell LCD monitors fit into the female plugs on the Molex splitter, but the splitter itself won't fit into the graphics card's DVI port. There are way more pins than could possibly fit: the splitter plug has 4 rows of pins, while a normal DVI plug has only 3, so I don't get it... I counted, and there are 59 pins total on the Molex thing... there is no way it will fit onto the back of my card...
Is there another adapter I need, or do I need to get a new graphics card?

Please help! I need to start 8-tabling w/ no overlap ASAP! lol

Joe M.

12-03-2005, 10:11 PM
I think the 59-pin connector only fits certain NVIDIA graphics cards. From NVIDIA's support info:

[ QUOTE ]
Presently, all of our current GeForce and Quadro families of GPUs (Graphics Processing Units) support dual monitors. However, in order to support dual monitors, the graphics card must include two ports to allow two monitors to be connected to the PC or Macintosh. The only exception is the Quadro NVS family, which includes a special type of video connector called a DMS connector that carries a video signal for two monitors. The DMS connector hooks up to a special Y cable which splits to dual VGA or dual DVI. This cable will not work with a standard graphics card's DVI or VGA port. Therefore, standard graphics cards that do not have the special DMS port will not be able to send out two separate display signals to two different monitors.
[/ QUOTE ]

The $5 splitter I bought is only used on the Quadro NVS cards... the 59 pins (it's the DMS-59 connector) let it carry 2 digital signals for 2 monitors (and likewise 2 analog signals for VGA monitors)...

What my graphics card does have is a DVI-I port and a D-Sub (VGA) port, so I guess I have to plug 1 of my LCDs into each and just deal with the lower image quality on the analog connection for the second monitor... (I know that my card supports dual monitors...)
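
If anyone wants to double-check what outputs Windows actually sees on their card, here's a minimal Python sketch using the Win32 EnumDisplayDevices API via ctypes (assumes Python is installed on the Windows box; the list_outputs helper is just my own name for illustration):

[ CODE ]
import ctypes
from ctypes import wintypes

# Win32 DISPLAY_DEVICEW structure, per the EnumDisplayDevices documentation.
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1  # output is live on the desktop

def list_outputs():
    # Illustrative helper: print every display adapter/output Windows reports.
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    i = 0
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        active = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        print(f"{dev.DeviceName}: {dev.DeviceString} (active: {active})")
        i += 1

list_outputs()
[/ CODE ]

A card with working dual-monitor support should show two active entries once both displays are plugged in and enabled.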

Any help or advice anyone has - please let me know...! (aside from buying a new card! lol)

Nomad84
12-04-2005, 01:19 PM
[ QUOTE ]
What my graphics card does have is a DVI-I and a DSUB, so I guess I have to plug 1 of my LCDs into each and just deal with the lower image quality on the
second monitor...(I know that my card supports Dual monitors...)

Any help or advice anyone has - please let me know...! (aside from buying a new card! lol)

[/ QUOTE ]

Nope, that's about it. If you have one video signal coming out of your DVI connector, there's no way to split that one signal into two separate signals to drive two displays, so using a splitter on it won't help. Without buying a new video card, using one display on DVI and the other on the VGA connector (D-Sub) is your only option.

12-05-2005, 12:08 PM
Thanks - I kinda figured this out on my own...

The image quality on the VGA connector (D-Sub) isn't that bad... and what I did was download a couple of video drivers (one from Dell and the latest from nZone) that noticeably improved the image on the VGA output... (I got my computer in Aug. 2002 and hadn't downloaded a new video driver since then, so I guess I was overdue...)

I have a Dell 2005FPW as my second monitor, and it couldn't reach its 1680 x 1050 native resolution, but after I installed the new driver it can...
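
For anyone wanting to confirm that a resolution is actually in the driver's mode list before and after a driver update, here's a minimal Python sketch using the Win32 EnumDisplaySettings API via ctypes (the truncated DEVMODEW definition and the mode_supported helper are mine, just for illustration):

[ CODE ]
import ctypes
from ctypes import wintypes

# DEVMODEW truncated at dmDisplayFrequency; dmSize tells the API how much of
# the structure we supply, and the display-mode fields all come before that.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

def mode_supported(width, height):
    # Illustrative helper: walk the current display's mode list and report
    # whether the driver offers a width x height mode.
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(dm)
    i = 0
    while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        if dm.dmPelsWidth == width and dm.dmPelsHeight == height:
            return True
        i += 1
    return False

print("1680x1050 available:", mode_supported(1680, 1050))
[/ CODE ]

Presumably, with the old 2002-era driver, 1680 x 1050 wouldn't have shown up in this list at all.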

Question though: I have a Dell Dimension 8200 - 2.53 GHz, 533 MHz FSB, 1,024 MB PC800 RDRAM, 120 GB 7200 RPM hard drive, 128 MB NVIDIA GeForce4 Ti 4600 - about all the best stuff you could have gotten back in Aug. of '02... Does it make sense to try to cheaply update/upgrade this rig, or should I just put her out to pasture and buy a new PC???

Any quick 'n cheap upgrades you would recommend to tide me over until that next PC purchase??? Any particular weak points in my system that could cheaply be upgraded? What's "the weakest link" in my system? lol

Thanks again for the advice! Joe