Most people don’t need a dedicated graphics card and are fine using the iGPU built into their computer’s CPU. But a dedicated GPU is a must for demanding gaming and GPU-enhanced tasks like video editing, graphic design, and AI projects.
There’s always a lot of chatter about computer graphics cards, whether it’s about stock shortages, prices, or new innovations. But it’s not always clear who actually needs one. Let’s dig in and talk about the basics, when you need one, and when you don’t.
What’s the Difference Between Integrated and Dedicated GPUs?
The headline of this article is a bit of a trick question. Every computer needs a GPU (Graphics Processing Unit) of some sort. Without a GPU, there would be no way to output an image to your display.
The real crux of our inquiry today isn’t whether you need a GPU, but whether you need a dedicated (or discrete) GPU, better known as a graphics card.
Integrated GPUs: Money for Nothing and Our Pixels for Free
In the early days of PCs, computers always came with a dedicated graphics card. Functions like network connectivity, video display, and sound output were each handled by their own dedicated expansion cards.
In the mid-1990s, some workstation computers started shipping with the graphics card function integrated into the motherboard. Motherboard-integrated graphics persisted well into the 2000s, but the arrival of Intel’s Clarksdale processors (2010) and AMD’s Llano processors (2011) ushered in the age of iGPUs—graphics processors integrated into the CPU itself.
Integrated GPUs are great because they’re free (and hassle-free). You don’t even have to think about them: just combine a consumer-class motherboard and CPU (or buy a pre-assembled computer from a retailer like Dell or Best Buy), and boom, you’ve got somewhere to plug in your monitor.
Integrated graphics are also very power efficient, since they draw little power beyond what the CPU was already using. And, thanks to their standardization, you’ll rarely run into any issues with drivers or compatibility. On a modern Windows machine, everything will just be taken care of for you.
For most activities like browsing the web, writing email, using office and productivity software, basic image editing, and so on, the iGPU is more than powerful enough. Integrated graphics are even sufficient for light gaming (and getting better every generation).
While you won’t be able to play AAA game titles with the graphics set to Ultra Supreme Face Meltingly Realistic using your iGPU, you’ll have no problem playing best-selling titles like Stardew Valley, Hades, Disco Elysium, Celeste, or any other number of award-winning but undemanding titles.
Of course, integrated graphics aren’t without their downsides. An integrated GPU shares resources with the CPU, including your pool of system RAM. This means any graphics-heavy task you throw at the integrated system, like rendering video or playing a current-generation 3D video game, will consume a hefty chunk of your system resources, and there might not be enough to go around.
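If you’re curious what your machine is working with, Windows will happily report the display adapters it sees. Here’s a minimal sketch in Python (it assumes a Windows PC with PowerShell available; note that the AdapterRAM field is a 32-bit value, so it tops out at 4 GB no matter how much memory the adapter actually has):

```python
import subprocess

# Ask Windows for every display adapter it knows about, plus the
# memory it reports for each (a 32-bit field, so it caps at 4 GB).
result = subprocess.run(
    ["powershell", "-Command",
     "Get-CimInstance Win32_VideoController | Select-Object Name, AdapterRAM"],
    capture_output=True, text=True,
)
print(result.stdout)
```

On an iGPU-only system, you’ll see a single entry for Intel or AMD integrated graphics, and the memory it uses comes out of your regular system RAM.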
If you’re not running into issues playing your favorite games or struggling to run GPU-intensive applications you need for your work or hobbies, it’s probably a good idea to save money and stick with the iGPU until a pressing need arises. And hey, if your pressing need is a desire to play around with AI-art generation, we’re here to help.
Dedicated GPUs: Premium Pixel Pushing at a Premium Price
On the opposite side of the GPU spectrum, in terms of both price and performance, you’ll find dedicated GPUs. Dedicated GPUs, as the name implies, are separate pieces of hardware devoted exclusively to handling graphics processing.
When you hear someone say, “I bought a new video card for my computer,” or “I need a new graphics card to play Super Soldier Simulator Shoot Shoot 9000,” they’re talking about a dedicated GPU.
The biggest benefit of a dedicated GPU is performance. Not only does a dedicated graphics card have a sophisticated computer chip designed explicitly for graphics rendering, but it also has dedicated Video RAM (VRAM)—which is faster and better optimized for the task than your general system RAM.
This offers a performance boost not just in obvious applications, like playing demanding video games, but also in applications that can take advantage of the additional processing power and resources the dedicated GPU provides.
In addition to a radical performance increase, dedicated GPU cards offer a better array of ports than a motherboard alone, making it easy to set up multiple monitors (or plug in a VR headset).
Sounds good, right? Way better performance, ports, ports, and more ports. What could be better? While all those things are awesome, there’s no such thing as a free lunch. First and foremost, there’s the matter of cost. A midrange GPU can run anywhere from $250 to $500, and cutting-edge models can cost $1,000 or more.
On top of that, if you’re upgrading an existing PC, you’ll have to contend with the demands the new dedicated GPU will impose. GPUs are more powerful than ever and certainly more power-hungry—it’s not unusual to find you need to upgrade your computer’s PSU to support your new GPU. And if you’re upgrading a pre-built PC, it’s almost a guarantee you’ll need a new PSU. (If you find yourself upgrading your PSU, be sure not to reuse the old PSU cables!)
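How much PSU you need depends entirely on the parts in your build, but the back-of-the-envelope math is simple: add up the big power draws and leave yourself comfortable headroom. Here’s a minimal sketch with purely hypothetical numbers (check your actual card’s rated board power and the manufacturer’s PSU recommendation before buying anything):

```python
# Rough PSU sizing with made-up example figures; every number
# here is hypothetical and should be replaced with your own parts'.
gpu_board_power_watts = 320   # hypothetical rated board power for the new GPU
rest_of_system_watts = 250    # hypothetical CPU, drives, fans, etc.
headroom = 1.3                # ~30% margin so the PSU isn't running at its limit

recommended = (gpu_board_power_watts + rest_of_system_watts) * headroom
print(f"Look for a PSU of at least {recommended:.0f} W")  # ~741 W, so a 750 W unit
```

GPU makers publish recommended PSU wattages for exactly this reason, and their numbers already bake in similar headroom.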
Speaking of power use, increased power draw in electronics means increased heat. There’s a reason high-end GPUs have huge heatsinks and fans to keep them cool. Be prepared for more noise and more heat. You may even need to upgrade your case and/or case fans to keep things cooler.
Even if you don’t need to upgrade your case for airflow, you may need to upgrade your case just for space: the last GPU we purchased just barely fit in our mid-tower PC case, and even a fraction of an inch of extra length on the GPU heatsink would have necessitated an upgrade.
Also, many modern GPUs are so big and heavy that they suffer from “GPU sag” and need support brackets. And you may find, when upgrading your GPU, that your CPU is now bottlenecking your performance, and you need to upgrade the rest of your computer, too! There’s clearly a lot to keep in mind when considering moving to a dedicated GPU.
So Do I Need a Dedicated GPU?
So now you know how a dedicated GPU compares to its integrated cousin, but when should you make the jump to a dedicated graphics card?
The process of picking one specific graphics card over another is fairly complex, and you may spend quite a bit of time comparing stats and wringing your hands, hoping you’re getting the best possible deal—but the process of deciding whether you need a dedicated GPU in the first place is pretty darn simple. Let’s look at the only two questions that really matter in the decision-making process.
Can Your Current Setup Handle the Games and Apps You Use?
The biggest reason most people get a dedicated GPU is gaming. You don’t need a dedicated GPU for watching video (even razor-sharp HD and 4K video). You don’t need a dedicated GPU for email, word processing, or any Office suite-type apps. You don’t even need a GPU for playing older games, as today’s integrated graphics are far better than the dedicated video cards of decades past.
However, you need a dedicated GPU for playing calculation-intensive modern 3D titles in all their silky smooth glory, especially if you want to play at 1440p or 4K resolutions on your nice new gaming monitor. It’s also worth mentioning that dedicated GPUs are useful for game streamers, even those who aren’t streaming demanding games, because popular streaming and capture software can offload video encoding to the GPU.
Graphics cards are useful for some non-gamers, too. If you do a lot of photo editing (not just cropping and white-balance-type fixes, but intense Photoshop work), video editing, or any kind of rendering (3D art, design, etc.), you’ll certainly get a boost from a dedicated GPU.
And if you’re at all interested in playing around with machine learning, AI-art, or with tools in any other GPU computing-based fields, you’ll need a dedicated GPU.
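As a quick illustration of why, consider how the most popular machine-learning frameworks behave: they look for a CUDA-capable dedicated card and fall back to the much slower CPU if they don’t find one. Here’s a minimal sketch using PyTorch (it assumes you have PyTorch installed; also note CUDA is NVIDIA-specific, and AMD cards use other backends such as ROCm):

```python
import torch

# Report whether PyTorch can see a CUDA-capable dedicated GPU,
# and if so, which card it is and how much VRAM it has.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GB")
else:
    print("No CUDA GPU found; this workload will crawl along on the CPU.")
```

Run that on a typical iGPU-only machine and you’ll get the second message, which is exactly why GPU-compute hobbies usually start with a graphics card purchase.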
Can Your Current Setup Support the Number of Monitors You Want?
Although most people buy a GPU for gaming, a sizable (albeit much smaller) number of people buy a dedicated graphics card to expand how many monitors their computer will support to boost their productivity.
Without a dedicated graphics card, adding extra monitors to your computer is kind of a crapshoot. Some motherboards support using multiple video ports at once—such as the DVI and HDMI ports for two different monitors—but most motherboards don’t. Some motherboards will allow you to keep the integrated graphics turned on and add in a lower-end dedicated GPU so you can score an extra port, but many don’t (and even when that trick works, it’s often a giant headache to get two GPU chipsets working in parallel).
But if you want to run two or more monitors without taxing your computer, fiddling with BIOS settings, or resorting to animal sacrifice to make your monitor dreams a reality, the easiest way is to simply buy a card that supports your monitor setup right out of the box.
The best part is that unless you’re both a multi-monitor lover and a gamer, you don’t need a premium card. If you simply want a lot of monitors, you can get away with an older GPU model or a card designed specifically for multi-monitor setups.