There was once a time when each component of a computer was separate. Even rudimentary functions like audio, video and hard drive storage interfaces had their own physical cards that plugged into a master bus. This “mother-daughter” card arrangement ruled the mainframe systems of the eighties, allowing system owners to add and remove features as necessary.
But as miniaturization advanced and part design became more efficient, manufacturers began to combine systems together. Motherboards, which were once used simply to connect components to one another, now include dozens of discrete functions. Many of the functions that began on daughter cards, like networking and audio, are now built into the motherboard.
While motherboards can handle a lot, they can’t handle everything. The same integration trend that made motherboards multipurpose machines has also taken hold in CPUs. Today, many CPUs have graphics processing built in. This combination of CPU and GPU is called integrated graphics.
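If you’re curious which kind of graphics a machine is using, tools like `lspci` on Linux print a description line for the VGA device. The sketch below is a rough illustration of telling the two apart from such a line; the vendor keywords are assumptions for illustration, not an exhaustive list:

```python
def classify_gpu(lspci_line: str) -> str:
    """Guess whether an lspci-style VGA line describes integrated or dedicated graphics."""
    line = lspci_line.lower()
    # Check dedicated hints first so product lines like "Radeon RX" aren't
    # mistaken for a CPU vendor's on-die graphics.
    dedicated_hints = ("nvidia", "geforce", "radeon rx", "quadro")
    integrated_hints = ("intel", "uhd graphics", "iris", "radeon graphics")
    if any(hint in line for hint in dedicated_hints):
        return "dedicated"
    if any(hint in line for hint in integrated_hints):
        return "integrated"
    return "unknown"

print(classify_gpu("00:02.0 VGA compatible controller: Intel Corporation UHD Graphics 620"))
# prints: integrated
```

This is only a heuristic; real product names blur the lines (Intel’s Arc cards, for example, are dedicated), so treat it as a starting point rather than a definitive check.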
Because integrated graphics are built into the CPU, they can’t be as powerful as dedicated graphics cards. There are a few reasons for this.
Building an efficient graphics processing unit is expensive, and that cost has to be passed on to the consumer. Adding high-powered graphics to a CPU would drive its cost up far beyond simply pairing that CPU with a dedicated graphics card. When folks use integrated graphics, they’re looking for a system that can get the basics done with minimal cost and complexity. They want graphics that can display the operating system’s interface, handle basic animations and play videos, not something that can handle heavy-duty 3D rendering. So there’s no incentive to produce an expensive, high-powered version of integrated graphics.
But even if there were demand for rocket-powered integrated graphics, physical limitations would make implementing such a solution challenging. Graphics processing produces significant heat and requires substantial power. Dedicated cards often draw more power than processors and require their own cooling systems to manage the heat they produce. And heat is the mortal enemy of high-performance silicon like CPUs. Dropping a huge source of heat next to the processor would degrade its performance and shorten its lifespan.
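To put that power gap in perspective, here’s a back-of-the-envelope power-budget sketch. All wattages below are ballpark assumptions for illustration, not manufacturer specifications:

```python
# Rough power budget for a system with a dedicated card.
# Every figure here is an illustrative assumption, not a spec'd value.
cpu_tdp = 65          # mainstream desktop CPU
gpu_tdp = 320         # high-end dedicated card -- often several times the CPU's draw
rest_of_system = 75   # motherboard, drives, fans, RAM
headroom = 1.3        # ~30% margin commonly suggested when sizing a power supply

recommended_psu = (cpu_tdp + gpu_tdp + rest_of_system) * headroom
print(f"Recommended PSU: {recommended_psu:.0f} W")
# prints: Recommended PSU: 598 W
```

Even with conservative numbers, the dedicated card dominates the budget, which is exactly why cramming that much power draw (and its heat) onto the CPU die isn’t practical.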
Freed from these limitations, dedicated cards can be significantly more powerful than integrated graphics. If you purchase a dedicated graphics card, you’ll find that it’s probably more expensive than your processor. That has a lot to do with market forces, but it’s also driven by research and manufacturing costs. A higher retail price means more money for design and development, which lets companies push the envelope, designing better GPUs for a market hungry to buy them.
Most consumer-grade dedicated graphics cards also include their own active cooling system. These vary in quality, ranging from loud, cheap fans to expensive, well-designed units. Typically, the more expensive the card, the better its cooling system.
Dedicated cards also allow for customization. If you want to overclock your graphics card, you’ll need to increase its cooling capacity. Add-on water-cooling blocks or more powerful fan assemblies make this possible.
Conclusion: Which should you use?
It depends on how much power you need from your graphics card. If you want to do any gaming, 3D rendering or video editing, you’ll want the most powerful graphics card you can afford and your system can handle. But if you just need a computer to handle nine-to-five tasks like spreadsheets, web browsing and email, there’s no reason to use a dedicated card.
Image credit: Nick Stathas