Video card built into existing circuitry


jrbick
09-26-2005, 02:17 AM
I pulled my computer apart tonight and found that I don't have a physical video card to remove. With some research, I found that it seems to be built into the circuitry. Further research reveals that I'll probably have to get in there and disable that video circuitry w/ a switch or a jumper. So, how do I identify a jumper and what do I have to do to said jumper?

Also, is this a pretty sure indication that if I install a second video card (I have empty PCI slots), my computer won't recognize it until the primary one (built into the circuitry) is disabled?

It is an S3 Graphics ProSavage DDR "display adapter" (that's what it's listed as under "properties").

I just noticed that I have the ability to "enable" and "disable" "this device." Would this be sufficient when I install a new graphics card?
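
For anyone reading along later: you can pull the same "display adapter" string from the command line instead of clicking through Device Manager. A minimal sketch in Python, assuming Python is installed and that wmic.exe is on the PATH (it ships with Windows XP and later):

[ CODE ]
# List the display adapters Windows knows about -- the same names
# Device Manager shows under "Display adapters".
import subprocess

# Win32_VideoController is the standard WMI class behind that entry.
output = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller", "get", "name"],
    text=True,
)
print(output)  # e.g. "S3 Graphics ProSavage DDR"
[/ CODE ]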

09-26-2005, 02:39 AM
What you have is a graphics chip integrated into your motherboard. You can generally disable integrated video from the BIOS or, as you suggested, from Device Manager, but it's not always necessary; it wouldn't hurt, either.

You usually press F1, F2, or ESC when your computer is first powered on to get into the BIOS, but if you are running any recent operating system and you plug your monitor's VGA cable into the new video card, it will automatically pick it up. You will just have to install drivers from the CD that comes with the new video card, or preferably get the latest drivers from the manufacturer's web site (ati.com, nvidia.com, etc.).
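
If you want to double-check afterward that Windows sees the new card and which driver each adapter is running, here's a rough Python sketch (assumes Python is installed; Name and DriverVersion are standard Win32_VideoController fields, though the exact output formatting varies by Windows version):

[ CODE ]
# After installing the new card, confirm Windows detects it and
# show which driver version each video adapter is using.
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller",
     "get", "name,driverversion", "/format:list"],
    text=True,
)
for line in output.splitlines():
    if line.strip():            # wmic pads its output with blank lines
        print(line.strip())     # e.g. "Name=S3 Graphics ProSavage DDR"
[/ CODE ]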

jrbick
09-26-2005, 02:44 AM
[ QUOTE ]
What you have is a graphics chip integrated into your motherboard. You can generally disable integrated video from the BIOS or, as you suggested, from Device Manager, but it's not always necessary; it wouldn't hurt, either.

You usually press F1, F2, or ESC when your computer is first powered on to get into the BIOS, but if you are running any recent operating system and you plug your monitor's VGA cable into the new video card, it will automatically pick it up. You will just have to install drivers from the CD that comes with the new video card, or preferably get the latest drivers from the manufacturer's web site (ati.com, nvidia.com, etc.).

[/ QUOTE ]

Awesome. Thanks for the help/confirmation. Yeah, it'd be ideal to be able to just plug in the new card and still use the existing one. What are my chances, 1-10?

jrbick
09-26-2005, 10:58 PM
K so I just discovered that I actually have an AGP slot (though it looks like it'd be a weird fit out the back of my computer). How do I know if it's 2x/4x or 4x/8x? My computer was bought in 2003 and runs an AMD Athlon 2600+ processor.

So:

1.) Does it matter which slot I use, AGP or PCI?

2.) If it does matter and I have to use AGP, 2x/4x or 4x/8x?

3.) Are most cards going to be fine using both VGA and DVI outputs at the same time?

jrbick
09-27-2005, 01:33 AM
FYI, I've run several Google searches and searched all threads in this forum with "video card" and couldn't find an answer on whether AGP is required over PCI or anything like that. Does this mean it basically doesn't matter? PCI looks like more for the money, and I did read TT say that it will require less of my motherboard. Yay, nay?

09-28-2005, 12:49 PM
PCI cards share the PCI bus with every other PCI card and run at 33 MHz; their traffic goes through the Southbridge chipset and then on to the Northbridge chipset.

AGP cards are connected directly to the Northbridge chipset at a higher speed, and therefore have greater bandwidth.

In other words, AGP cards are usually "better" cards.

So long as you buy an AGP card, it will clock itself down to whatever speed your slot supports (an 8x card will run at 4x, for example). Provided your motherboard isn't 1x/2x only, any current card will run fine.
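
To put rough numbers on the difference: both buses are 32 bits wide, but plain PCI runs at 33 MHz and is shared among all PCI cards, while AGP runs at 66 MHz and transfers 2, 4, or 8 times per clock in its higher modes. A back-of-the-envelope sketch in Python (theoretical peak figures only; real-world throughput is lower):

[ CODE ]
# Theoretical peak bandwidth = (bus width in bytes) * clock * transfers per clock.
def peak_mb_per_s(width_bits, clock_mhz, transfers_per_clock=1):
    return width_bits / 8 * clock_mhz * transfers_per_clock

print("PCI    :", peak_mb_per_s(32, 33.33))     # ~133 MB/s, shared by all PCI cards
print("AGP 1x :", peak_mb_per_s(32, 66.66))     # ~266 MB/s
print("AGP 2x :", peak_mb_per_s(32, 66.66, 2))  # ~533 MB/s
print("AGP 4x :", peak_mb_per_s(32, 66.66, 4))  # ~1066 MB/s (about 1 GB/s)
print("AGP 8x :", peak_mb_per_s(32, 66.66, 8))  # ~2133 MB/s (about 2.1 GB/s)
[/ CODE ]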