I've heard people say you should never even consider an integrated card. Are they really that bad? How bad are they compared to dedicated ones?
Also, can someone explain technically why they're so much worse than dedicated ones?
I don't think it's a case of one being universally worse than the other. It all depends on how you use your computer.
A computer with 2GB of RAM and an integrated graphics chip can theoretically run games better than the same computer with a 1GB dedicated card. Under ideal conditions, the integrated chip could devote almost all 2GB of system memory to graphics, whereas the dedicated card is capped at 1GB no matter what the rest of the computer's memory is doing.
However, if you have background programs running (which most computers do), then large chunks of integrated RAM can be lost in the middle of a game if a program of higher importance decides it wants to do something while you're playing (antivirus programs are particular bastards for this).
It all comes down to how well you can manage your computer's memory. Left to its own devices, an integrated system will often give you unreliable or worse graphics. Integrated systems are potentially better on paper, but they're such a pain to tune to their full potential that in 99% of cases a dedicated card is more useful IRL.
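The memory trade-off described above can be sketched as a toy model. All the numbers here are illustrative assumptions, not real hardware specs, and the function name is made up for this example:

```python
# Toy model: how much memory can the graphics workload actually use?
# (Illustrative numbers only; real drivers reserve shared memory in
# more complicated ways.)

def graphics_memory_available(total_ram_gb, other_programs_gb, dedicated_vram_gb=0):
    """Return GB of memory available for graphics."""
    if dedicated_vram_gb > 0:
        # Dedicated card: VRAM is reserved, background programs can't touch it.
        return dedicated_vram_gb
    # Integrated: graphics shares system RAM with everything else running.
    return max(total_ram_gb - other_programs_gb, 0)

# Ideal case: almost nothing running, integrated wins on raw capacity.
print(graphics_memory_available(2.0, 0.2))                       # 1.8
print(graphics_memory_available(2.0, 0.2, dedicated_vram_gb=1))  # 1

# Realistic case: antivirus and background programs eat the shared RAM.
print(graphics_memory_available(2.0, 1.5))                       # 0.5
print(graphics_memory_available(2.0, 1.5, dedicated_vram_gb=1))  # 1
```

Same machine, same game, but the integrated setup's usable graphics memory swings wildly with whatever else is running, which is exactly the unreliability being described.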
.... Is using "IRL" to describe computer graphics an oxymoron? idk.
Oh, and one thing I forgot:
Something that often gets overlooked is the DDR generation of the RAM you're using. My old computer and my current one both had 2GB dedicated graphics cards. The difference is that my previous one used DDR1 and my current one uses DDR3. The result is that modern game graphics (e.g. Skyrim) are a lot smoother and the frame rate is a lot less choppy. For these two computers, the DDR upgrade was like adding at least an extra 1GB of memory.
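The back-of-the-envelope math behind that difference: peak bandwidth is roughly transfer rate times bytes moved per transfer. The specific speed grades below (DDR-400, DDR3-1600) are just typical examples for each generation, not the actual modules in either machine:

```python
# Rough peak-bandwidth arithmetic for a single 64-bit memory channel.
# Speed grades are assumed typical values for each DDR generation.

def peak_bandwidth_gb_s(megatransfers_per_sec, bus_width_bits=64):
    """Peak bandwidth in GB/s = transfers/sec * bytes per transfer."""
    bytes_per_transfer = bus_width_bits // 8
    return megatransfers_per_sec * 1_000_000 * bytes_per_transfer / 1e9

print(peak_bandwidth_gb_s(400))   # DDR-400:   3.2 GB/s
print(peak_bandwidth_gb_s(1600))  # DDR3-1600: 12.8 GB/s
```

So even with identical capacity, the newer generation can feed data to the GPU roughly 4x faster, which shows up as smoother frame rates rather than more "room".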
At 1/7/13 07:30 PM, Sheizenhammer wrote: It all comes down to how well you can manage your computer's memory for it. If left to its own devices it'll often result in unreliable / worse graphics if it uses an integrated system. Integrated systems are potentially better, but are so much of a pain to set up to their full potential that in 99% of cases a dedicated one is more useful IRL.
The only thing is, video cards aren't just about RAM. Onboard graphics are usually handled by the CPU in newer computers, so the CPU has to multitask between running the program and processing graphics. A video card has its own dedicated shaders, processors, and video memory. Integrated systems don't really compare to dedicated ones because they generally have fewer and weaker vertex and pixel shaders, and they don't get their own dedicated RAM reserved just for video processing.
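That multitasking point can be shown with a toy frame-time model. The millisecond numbers are made up purely for illustration; real pipelines overlap work in far more complex ways:

```python
# Toy frame-time model (made-up millisecond numbers).
# Integrated: the CPU does game logic AND rendering one after the other.
# Dedicated: the GPU renders in parallel with the CPU, so a frame takes
# roughly as long as the slower of the two parts, not their sum.

def frame_time_ms(logic_ms, render_ms, dedicated_gpu=False):
    if dedicated_gpu:
        return max(logic_ms, render_ms)  # CPU and GPU work overlap
    return logic_ms + render_ms          # one chip does everything in turn

logic, render = 10, 15
print(1000 / frame_time_ms(logic, render))                      # integrated: 40.0 fps
print(1000 / frame_time_ms(logic, render, dedicated_gpu=True))  # dedicated: ~66.7 fps
```

The dedicated setup wins here even though no extra memory is involved at all, which is the shader/processor side of the argument.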
My old computer ran on built-in graphics and couldn't run games like Left 4 Dead except on the lowest graphics settings. Now that I have an actual card, I can run everything I've tried so far on maximum graphics.