+224|5257|Some where huntin in Wisconsin
I know I'll probably get flamed for posting this, but what's the difference between running dual video cards and dual video cards with SLI? I have a friend who runs dual cards (I don't know whether he uses SLI), and I was wondering because I'm looking at buying a new computer; the one I have right now sucks at life. I think I'm getting 30-40 FPS, and on low settings it's about 50-60.
So yeah, it's rough. Maybe I could borrow some of your knowledge.
The Magnetic Bullet Attractor
There are two different ways to run dual video cards: SLI and CrossFire. SLI is for NVIDIA graphics cards, and CrossFire is for ATI cards. I don't know much about the difference between the two myself, except that SLI is much more popular for some reason. Can someone else here better explain the difference?

PitViper401 wrote:

There are two different ways to run dual video cards: SLI and CrossFire. SLI is for NVIDIA graphics cards, and CrossFire is for ATI cards. I don't know much about the difference between the two myself, except that SLI is much more popular for some reason. Can someone else here better explain the difference?
SLI is just more mature than CrossFire, but CrossFire is getting a lot better now. Unless you're super rich and want the latest tech, though, I think SLI and CrossFire are useless unless you'll buy your second card within a few weeks of the first. Since one next-gen card can run as fast as or faster than one last-gen card, the price on last-gen cards might drop by, say, 90 bucks, but would you spend 400 USD on another card for SLI, or 490 on a much better card with new features? Tushers, if you're looking to upgrade, get a 7900GT, since it's one of the best bang-for-the-buck cards out there.
Decepticon Geek
+50|5248|Planet Seibertron ;)
Basically it started out with an old and since-assimilated company called 3dfx, which created the Voodoo series of 3D accelerator cards. 3dfx made it possible to combine two of these Voodoo cards to render games at higher resolutions using a technology called Scan-Line Interleaving. One card would render every odd-numbered line of a full frame of, say, 1024x768, while the other card rendered every even-numbered line. The cards would then take these two half-images, combine them, and send the result out to your monitor. This was the first mainstream method of using multiple 3D accelerators together to increase overall 3D performance.
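The interleaving scheme above can be sketched as a toy Python snippet. This is purely illustrative: the "cards" here are just labels filled into rows, not real GPU work, and the function names are made up for the example.

```python
# Toy illustration of 3dfx-style Scan-Line Interleaving:
# card A renders the odd-numbered rows, card B the even-numbered rows,
# and the final frame is assembled by interleaving the two halves.

def render_rows(card, rows, width):
    # Stand-in for a GPU rendering a subset of scan lines:
    # each rendered row is just tagged with the card's name.
    return {r: [card] * width for r in rows}

def sli_frame(height, width):
    odd = render_rows("A", range(1, height, 2), width)   # card A: odd rows
    even = render_rows("B", range(0, height, 2), width)  # card B: even rows
    merged = {**odd, **even}
    # Interleave the halves back into one top-to-bottom frame
    return [merged[r] for r in range(height)]

frame = sli_frame(4, 2)
# Rows alternate between the two cards: B, A, B, A
```

Each card only touches half the scan lines, which is why the scheme roughly halved the per-card workload.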

After NVIDIA assimilated 3dfx and its technology, word spread of the possibility of bringing SLI to NVIDIA-based chips. This wouldn't become possible until the introduction of PCI Express, which made it possible to allocate the necessary bandwidth for multiple PCI Express video cards without the performance bottlenecks present in AGP and legacy PCI. NVIDIA then unveiled the GeForce 6 series, its first scalable graphics chipset, and coined its scalable technology SLI, or Scalable Link Interface. While not exactly the same as Scan-Line Interleaving from 3dfx, it offers alternate rendering methods that enable the use of both graphics chips to enhance performance. NVIDIA's SLI uses two PCI Express x16 slots with an SLI bridge connector that lets the two cards communicate with each other, and it must be enabled in the drivers to work. For SLI to operate properly, two cards of the same model must be used, although the brands may differ. In other words, SLI will work with two GeForce 6800s from different manufacturers. However, the pair will run at the speed of the slower card.
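A back-of-the-envelope sketch of that last point, assuming an alternate-frame-rendering scheme where the two GPUs take turns producing whole frames (a simplification; real SLI drivers choose between several rendering modes):

```python
# Why a mismatched pair runs at the slower card's speed:
# if the GPUs alternate frames (A, B, A, B, ...), the display can never
# outpace the slower card, so the pair delivers at most 2x its frame rate.

def afr_fps(fps_card_a, fps_card_b):
    return 2 * min(fps_card_a, fps_card_b)

print(afr_fps(60, 45))  # pair delivers ~90 FPS, not 60 + 45 = 105
```

The same reasoning is why matching identical models matters: pairing a fast card with a slow one wastes the fast card's headroom.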

ATI's answer to SLI is called CrossFire, which likewise requires a motherboard with two PCI Express x16 slots. The difference becomes obvious here: you are required to have one "Master Card" and a matching slave card. CrossFire renders in much the same way as SLI, but the technology behind it differs. Output is sent directly to the Master Card, whose hardware differs slightly in that it has an onboard compositing chip that handles both receiving data from the slave card and assembling the final image. The interconnect is a DVI cable over which the slave card sends its output digitally to the Master Card. The Master Card takes that data and, depending on the rendering mode used, either "stitches" the image together or passes it through to the monitor directly. A slight advantage of ATI's solution is that any compatible video card can be used as the slave card, but you must purchase a Master Card, marketed as a "CrossFire Edition," to enable the setup. However, because the CrossFire Edition is so specialized, it is not in great supply.

In terms of cost, CrossFire may end up being more expensive to set up than SLI, as you must purchase that special CrossFire Edition video card. Despite the slight advantage ATI has with its CrossFire technology, NVIDIA's SLI solution is seemingly more cost effective and more modular in ways that benefit some users. High-end users are basically divided between those who buy two NVIDIA cards at the same time and those who buy one mid-range to high-end NVIDIA card now and acquire the second one later. Some will also say that a dual video card setup is a waste overall, as technology and prices fluctuate too rapidly for it to be a safe investment in a gaming system. In any case, you'll find many people who agree that SLI is the more cost-effective system, friendly to users and techies alike.

This ought to cover most of what there is to know about dual video cards. I've left maybe one or two details out, but that's just down to lack of knowledge. Inquire away if I need to explain something further.

(edit: me fail english?  that's unpossible!)

Last edited by sixshot (2006-04-30 00:26:20)

You might want to clean up your post a little... the wording is confusing.
The guys above me covered SLI and crossfire, but...
Dual (or more) video cards have been around for years; that used to be the only way to get multiple-monitor support. I think CAD workstations were probably the forerunners of this setup. AutoCAD would allow for two video cards: one showing your drawing(s) on a big screen (21" CRT) and a second showing your text screen (usually a smaller 15" or 17"). This in no way improved performance; it just allowed for dual monitors. Several years ago most video card makers started building two video outputs into each card so you could run two monitors without the expense of a second card. Given how few people actually run two displays, I still wonder why this became the standard.
You can still run two video cards in any system without SLI, but for performance, SLI links the two cards together so they render faster to one display.
Did that make sense?
Damn Command and Conquer Generals...
+62|5302|Rochester, NY
Two graphics cards are mostly used for higher resolutions on big monitors and to render things faster, maybe by 5 to 10 percent. For everything else, a dual graphics card setup isn't really necessary unless you have the money for it. I say stick with one, unless you're rich, have a huge 30" monitor, or run some really complex graphics-intensive programs.

