Graphics Processing Unit (GPU)
What Is a Graphics Processing Unit (GPU)?
A Graphics Processing Unit (GPU) is a chip or electronic circuit capable of rendering graphics for display on an electronic device. The GPU was introduced to the wider market in 1999 and is best known for its use in providing the smooth graphics that consumers expect in modern videos and games.
Key Takeaways
- The term graphics processing unit (GPU) refers to a chip or electronic circuit capable of rendering graphics for display on an electronic device.
- The term “GPU” is often used interchangeably with “graphics card,” though the two are different.
- Although GPUs were initially popular with video editing and computer gaming enthusiasts, the rapid growth of cryptocurrencies has created a new market for them.
- GPUs, first introduced to the wider market in 1999, are perhaps best known for their use in providing the smooth graphics that consumers expect in modern videos and video games.
- There has been a shortage of GPUs recently thanks to their application in the mining of cryptocurrencies.
How a Graphics Processing Unit (GPU) Works
The graphics in videos and games consist of polygonal coordinates that are converted into bitmaps (a process called “rendering”) and then into signals that are shown on a screen. This conversion requires the Graphics Processing Unit (GPU) to have a lot of processing power, which also makes GPUs useful in machine learning, artificial intelligence, and other tasks that require a large number of complex and sophisticated computations.
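To make the rendering step concrete, here is a minimal, illustrative sketch of how polygon coordinates become a bitmap. It is written in plain Python (so it runs on a CPU and is vastly slower than real GPU hardware), and the function names are invented for illustration:

```python
def edge(ax, ay, bx, by, px, py):
    # Sign of the 2D cross product: tells which side of edge A->B point P is on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(verts, width, height):
    """Fill a width x height bitmap (rows of 0/1) for one triangle."""
    (x0, y0), (x1, y1), (x2, y2) = verts
    bitmap = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Sample at the pixel center.
            px, py = x + 0.5, y + 0.5
            w0 = edge(x1, y1, x2, y2, px, py)
            w1 = edge(x2, y2, x0, y0, px, py)
            w2 = edge(x0, y0, x1, y1, px, py)
            # The pixel is inside the triangle if all edge tests agree in sign.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                bitmap[y][x] = 1
    return bitmap
```

Note that every pixel is tested independently of every other pixel, which is exactly why a GPU, with thousands of cores, can do this work so much faster than a CPU looping over pixels one at a time.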
History of the Graphics Processing Unit (GPU)
In 1999, Nvidia introduced the GeForce 256, the first widely available GPU. Nvidia defined a GPU as a “single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.” The GeForce 256 improved on the technology of earlier processors by optimizing 3D gaming performance.
While Nvidia still reigns supreme in the GPU market, the technology has improved dramatically. In the 2000s, Nvidia released its GeForce 8800 GTX, which had a texture fill rate of a whopping 36.8 billion texels per second.1 Today, GPUs have seen a resurgence in popularity. Their use has been extended into new industries thanks to the advent of artificial intelligence and cryptocurrencies. GPUs have also played a role in establishing wider access to higher-quality virtual reality gaming.
Before the arrival of GPUs in the late 1990s, graphics rendering was handled by the Central Processing Unit (CPU). When used in conjunction with a CPU, a GPU can increase computer performance by taking over computationally intensive functions, such as rendering, from the CPU. This speeds up applications because the GPU can perform many calculations simultaneously. This shift also allowed for the development of more advanced and resource-intensive software.
Processing data in a GPU or a CPU is handled by cores. The more cores a processing unit has, the faster (and potentially more efficiently) a computer can complete tasks. GPUs use thousands of cores to process tasks in parallel. This parallel structure differs from that of the CPU, which uses fewer cores to process tasks sequentially. A CPU can complete an individual calculation faster than a GPU, which makes it better suited to basic, sequential tasks.
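The parallel-versus-sequential distinction can be illustrated with a small sketch. Here Python worker threads stand in for cores, and the function names are hypothetical; real GPU work is dispatched through APIs such as CUDA or OpenCL rather than like this:

```python
from concurrent.futures import ThreadPoolExecutor

def sum_chunk(chunk):
    # One "core" handles one slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the data into chunks and hand the chunks out in parallel,
    # the way a GPU splits a big job across many cores.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum_chunk, chunks))
```

The result is identical to looping over the data sequentially; the point is only that independent pieces of work can be computed at the same time.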
The term “GPU” is often used interchangeably with “graphics card,” though the two are different. A graphics card is a piece of hardware that contains one or more GPUs, a daughterboard, and other electronic components that allow the graphics card to function.
A GPU can, however, be integrated into the motherboard or be found on the daughterboard of a graphics card. Initially, only high-end computers featured graphics cards. Today, most desktop computers typically use a separate graphics card with a GPU for increased performance, rather than relying on a GPU built into the motherboard.
GPUs and Cryptocurrency Mining
While GPUs were initially popular with video editing and computer gaming enthusiasts, the rapid growth of cryptocurrencies created a new market. This is because cryptocurrency mining requires thousands of calculations in order to add transactions to a blockchain, which can be profitable with access to a GPU and an inexpensive supply of electricity.
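A toy illustration of the kind of repeated hashing that mining involves is shown below. This is a heavily simplified stand-in for real proof-of-work (the function name and difficulty scheme are invented for illustration), but it shows why miners value hardware that can attempt many hashes at once:

```python
import hashlib

def mine(block_data, difficulty=2):
    """Try nonces until SHA-256(block_data + nonce) starts with
    `difficulty` zero hex digits -- a toy version of proof-of-work."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1
```

Each nonce can be tested independently of every other nonce, so the search parallelizes naturally across thousands of GPU cores (or, more efficiently still, across an ASIC built for nothing else).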
In recent years, two prominent graphics card manufacturers, Nvidia Corp. (NVDA) and Advanced Micro Devices Inc. (AMD), have experienced a rapid increase in sales and revenue as a result of cryptocurrency mining.
This had the side effect of frustrating non-mining customers, who saw prices increase and supply dry up. As a result, retailers occasionally limited the number of graphics cards that an individual could purchase. While miners of the more popular cryptocurrencies, such as bitcoin, have shifted to using specialized and more cost-effective chipsets called application-specific integrated circuits (ASICs), GPUs are still used to mine lesser-known currencies.
The rise in the popularity of cryptocurrencies has caused a massive shortage of GPUs. Reporting from The Verge calculated that GPUs are being sold for two to three times their street price on sites like eBay.2
Examples of GPU Companies
Advanced Micro Devices (AMD)
AMD is one of the most trusted producers of graphics cards. The manufacturer began as a startup in Silicon Valley in 1969 and develops high-performance computing and visualization products.1
AMD entered the GPU market in 2006 when it acquired leading video card maker ATI. Since then, AMD and Nvidia have been the dominant players in the GPU market. As of May 2021, AMD has a market cap of $97.3 billion. AMD has shipped over 500 million GPUs since 2013 and controls 17% of the GPU market share.3
AMD places its focus in the GPU market on PC gaming and is a favorite among gamers worldwide.
Nvidia
Nvidia was the very first company to bring GPUs to market, in 1999. The first GPU in history was the GeForce 256. That year, Nvidia also launched its initial public offering (IPO) at $12 per share.4 As of May 2021, the stock is trading around $645 per share.
Nvidia has a market cap of $404.8 billion and controls 13% of the GPU market share.56 Nvidia has considerable reach in the advanced GPU market. According to Nvidia's website, "eight of the world’s top 10 supercomputers now use NVIDIA GPUs, InfiniBand networking, or both. NVIDIA powers 346 of the overall TOP500 systems on the latest list." Nvidia's own supercomputer, named Selene, is ranked fifth in the world and is the world's fastest industrial supercomputer.7
Graphics Processing Unit (GPU) FAQs
What Is the Difference Between a GPU and VGA?
Whereas a GPU is a chip or electronic circuit used to render graphics for display on an electronic device, a VGA (video graphics array) connector is a physical device used to transfer video signals and computer video output.
How Do You Overclock Your GPU?
Before overclocking, make sure you thoroughly clean your device and install any updates and bug fixes for your software. Thanks to updates in technology, overclocking is fairly simple: install software such as MSI Afterburner and let it go to work. After the installation is complete, run a gaming benchmark to test the new settings.
What Is GPU Scaling?
GPU scaling is a feature that enables users to adjust the aspect ratio of a game based on their monitor’s resolution. Some users believe that adjusting the aspect ratio will further enhance the image quality of the display.
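The arithmetic behind aspect-ratio-preserving scaling can be sketched as follows. This is a hypothetical helper for illustration, not any vendor's actual implementation:

```python
def scale_to_fit(src_w, src_h, screen_w, screen_h):
    # Pick the largest uniform scale factor that fits the source image
    # on screen without distorting its aspect ratio.
    scale = min(screen_w / src_w, screen_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Any leftover screen area becomes black bars (letterbox/pillarbox).
    return out_w, out_h
```

For example, a 4:3 game rendered at 800x600 on a 1920x1080 monitor scales to 1440x1080, with black bars filling the remaining width.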