CPU vs. GPU: What’s the Difference?

    A CPU sitting on top of a GPU.

    If you’ve ever built your own computer—or even just read about it—you’ll have figured out that a CPU and a GPU are two very different things. But what’s the difference exactly, and how does this work in practice?

    What Are CPUs and GPUs?

    The short answer is that the CPU, short for central processing unit and often just called the “processor,” runs your computer. It’s the central hub of your device and manages all the processes that make it tick. If you don’t have a CPU, you don’t have a computer, just a state-of-the-art paperweight.

    The GPU, or graphics processing unit, also called a “graphics card,” runs the graphics displayed on your screen. GPUs are also vital to the operation of your computer; without one, nothing would show on your screen. That said, the GPU doesn’t always have to be a discrete, or separate, unit; many CPUs, especially laptop CPUs, have a GPU built in.

    These integrated GPUs, though, don’t have a lot of oomph. If you want to run high-end graphics for games or advanced graphical software like 3D modelers, you’re going to need a discrete GPU, which packs a lot more power.

    Where You’ll Find Them

    Because a CPU is so, well, central, it’s ubiquitous: there isn’t a single digital device that doesn’t have one. Smartphones and smart devices generally have very small ones that don’t put out a lot of computing power, while supercomputers have massive networks of CPUs that can do calculations that would make your phone belch smoke within minutes.

    Discrete GPUs are a lot more specialized. They’re generally only found in laptops and PCs marketed at gamers; in fact, gamers are the biggest market, as most top-of-the-line games nowadays require serious graphical computing power. Visual artists are the other big buyers of GPUs, as they need to render images quickly and in detail, something a GPU integrated into a CPU can’t do anywhere near as well.

    However, it’s not just gamers and artists that use GPUs. They’re also used a lot in machine learning and in crypto mining, for reasons we’ll get into shortly.

    How a CPU Works vs. a GPU

    The CPU and GPU do different things because of the way they’re built. A CPU runs processes serially—in other words, one after the other—on each of its cores. Most processors have four to eight cores, though high-end CPUs can have up to 64.
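    You can check how many logical cores your own machine exposes with a few lines of Python (standard library only; the exact number will of course depend on your hardware):

```python
import os

# Ask the operating system how many logical cores the CPU exposes.
# Hyperthreaded chips report two logical cores per physical core,
# so this can be double the physical core count. Returns None if
# the OS can't determine it.
cores = os.cpu_count()
print(f"Logical cores available: {cores}")
```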

    When the computer is running, each core will run a process more or less by itself, like registering your keystrokes while typing. While it does that, other cores will be handling all the other processes you see in your Windows Task Manager (or waiting for their turn to run). Because the CPU manages tasks serially and dedicates a large share of its processing power to each one, it runs each process, and switches between processes, at lightning speed.

    A GPU approaches computing differently. When given a task, a GPU will subdivide it into thousands of smaller tasks and then process them all at once, concurrently rather than serially. This makes GPUs much more suitable for handling large jobs that are made up of many small, independent parts, like 3D graphics.

    For example, in a game what you see is basically a field of polygons. Each polygon is filled in individually by the GPU at the same time, and, considering there can be thousands of them, it’s actually pretty impressive how fluidly GPUs can do it. You can even see it for yourself when your GPU malfunctions while gaming, as you get large blocks of textures on your screen.
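    The serial-versus-concurrent split can be sketched in a few lines of Python. This is a toy illustration only: the `shade` function and the small thread pool stand in for the real shading math and thousands of hardware cores a GPU would actually use.

```python
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    # Toy "work unit": brighten one pixel value, capped at 255.
    # A stand-in for the real shading math a GPU core would run.
    return min(pixel + 50, 255)

pixels = list(range(0, 256, 32))  # a tiny "image" of 8 pixel values

# CPU-style: one result after another, on a single core.
serial = [shade(p) for p in pixels]

# GPU-style: the same work split into many tiny independent tasks.
# (Python threads only illustrate the programming model here; a real
# GPU runs thousands of these tasks truly in parallel in hardware.)
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(shade, pixels))

print(serial == parallel)  # same answer, different execution strategy
```

    The key property is that each pixel's result doesn't depend on any other pixel's, so the order of execution doesn't matter; that independence is what makes graphics such a natural fit for concurrent hardware.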

    When to Use a CPU vs. a GPU

    Because they work so differently, CPUs and GPUs have very different applications. Serial processing is what makes a computer tick. If you tried to run a PC on concurrent processing alone, it wouldn’t work very well, as it’s hard to subdivide tasks like typing out an essay or running a browser. CPUs dedicate a lot of power to just a handful of tasks at a time and, as a result, execute those tasks a lot faster.

    GPUs, on the other hand, are a lot more efficient than CPUs at this kind of parallel work and are thus better for large, complex tasks with a lot of repetition, like putting thousands of polygons onto the screen. If you tried to do that with a CPU, it would just stall out, if it worked at all.

    GPUs Aren’t Just About Graphics

    The idea that the CPU runs the computer while the GPU runs the graphics was set in stone until a few years ago. Before then, you rarely saw a graphics card used for anything other than games or visual processing (3D graphics or image and video editing).

    However, that’s undergone a drastic shift in the last few years thanks to two important changes in the way we use computers. The first is machine learning (also called deep learning), which requires intensive concurrent processing because of the way it manages data.

    Put simply, every bit of information that’s processed by a deep learning algorithm passes through many layers of numerical filters, called weights. Considering there are a lot of filters and a lot of data points, running all of this through a CPU would take forever. A GPU is much better suited to the task.
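    A rough sketch of why that is: here is a toy neural-network layer in plain Python (the input and weight values are made up for illustration). Every output is an independent weighted sum, and a real network computes millions of these, which is exactly the kind of work a GPU can do thousands of at once.

```python
inputs = [0.5, -1.0, 2.0]        # one data point with 3 features
weights = [
    [0.2, 0.8, -0.5],            # weights feeding neuron 0
    [1.0, -0.3, 0.7],            # weights feeding neuron 1
]

def dense_layer(x, w):
    # Each output is an independent weighted sum of the inputs.
    # On a GPU, these sums run concurrently rather than one by one.
    return [sum(xi * wi for xi, wi in zip(x, row)) for row in w]

outputs = dense_layer(inputs, weights)
print(outputs)
```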

    GPUs and Crypto Mining

    GPUs are also popular for mining cryptocurrency, for a similar reason. To earn new coins, you usually need to solve a complicated cryptographic puzzle that unlocks the next block of the blockchain. Brute force is the keyword here: the more processing power you throw at one of these puzzles, the better your chance of solving it quickly.
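    Here’s a toy proof-of-work sketch in Python, loosely modeled on Bitcoin-style mining (the data string and difficulty level are made up for illustration). The loop just tries nonce after nonce until a hash with enough leading zeros turns up; since every attempt is independent, miners spread this search across a GPU’s many cores.

```python
import hashlib

def mine(data: str, difficulty: int = 3) -> int:
    # Brute-force search: try nonce after nonce until the SHA-256
    # hash of (data + nonce) starts with `difficulty` zero hex digits.
    # Higher difficulty means exponentially more attempts on average.
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

nonce = mine("example block", difficulty=3)
print(f"Found nonce: {nonce}")
```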

    GPUs have a two-fold advantage over CPUs here: not only can they bring more processing power to bear on this kind of parallel work, they’re also packed with thousands of specialized math processors called arithmetic logic units (ALUs). ALUs help graphics render more quickly, but they’re also a godsend for anybody looking to grind through huge numbers of mathematical operations.

    In fact, GPUs became so popular among crypto miners that they caused a worldwide shortage of graphics cards, one that’s barely easing at the time of writing in December 2021. We’ve come a long way since the days when graphics cards were only used by gamers.
