Friday, August 13, 2010

Computers 101: Graphics Card


Graphics cards, also known as video cards, house the hardware needed to get all those nice images your computer produces to display on your monitor.  Graphics cards connect to your computer through the motherboard, on one of its expansion slots.  On modern graphics cards and motherboards, this is a PCI Express 2.0 slot.

The main function of a graphics card is to render the 2D and 3D graphics sent to it from the CPU and then display them.  The program or game you are running sends information to the CPU and requests that an image be displayed on your monitor.  The CPU forwards that information, which is still binary data at this point, to the graphics card and requests that an image be rendered from it.  Let's make the requested image a 3D image, as this is where the graphics card shines.


The graphics card takes the data it receives and first creates a bunch of straight lines from it.  The straight lines are then formed into a wire frame that resembles the image.



From there, the graphics card fills in the missing pixels through a process called rasterization.  After the wire frame has its "skin" put on it, other effects are added to the image.



These effects include color, lighting, and textures.  We won't delve into great detail about all the different effects, but they are all very cool and add a lot to the image.  All of this rendering must be done very quickly, especially when playing a game, and requires a lot of computing power.  Without a dedicated graphics card, this strain would fall on the CPU alone, which would be impossible for it to handle.
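If you're curious, here's a toy sketch of the wire frame and rasterization steps in Python.  Real GPUs do this in massively parallel hardware; the triangle coordinates, the Bresenham line-drawing algorithm, and the plain white "shading" are just illustration choices of mine, not anything from a real card.

```python
def bresenham_line(x0, y0, x1, y1):
    """Rasterize one wire-frame edge: return the pixels along a straight line."""
    pixels = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        pixels.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return pixels

# A "wire frame" is just a list of edges between vertices.
triangle_edges = [((0, 0), (8, 0)), ((8, 0), (4, 6)), ((4, 6), (0, 0))]

# Rasterize every edge into pixels, then apply an "effect" to each pixel.
frame = {}
for (x0, y0), (x1, y1) in triangle_edges:
    for px in bresenham_line(x0, y0, x1, y1):
        frame[px] = (255, 255, 255)  # effects stage: here, just plain white

print(len(frame), "pixels lit")
```

A real effects stage would compute lighting and texture per pixel instead of assigning a flat color, but the flow is the same: lines, then a wire frame, then filled-in pixels.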


So what makes up these awesome rendering components that bring us such joy?  Let's take a look.

A graphics card is very similar to an entire computer in and of itself.  It has its own graphics processing unit (GPU) and its own memory (RAM), both mounted on a circuit board.  It also has its own BIOS, which controls many aspects of the card and governs how the other components interact with it.

The GPU (Graphics Processing Unit) is similar to a CPU in that it is also a microprocessor.  The difference is that the GPU is built specifically to handle the floating point calculations (mathematical and geometric calculations) associated with graphics processing.  ATI (owned by AMD) and nVidia are the two main producers of GPUs today.  The main attributes of a GPU are its core clock frequency, measured in MHz or GHz, and its number of pipelines, which translate the 3D image described by vertices and lines inside the graphics card into the 2D image formed by pixels on your screen.  As with most things computer related, the higher the MHz/GHz and the higher the number of pipelines, the faster the graphics card.  Most motherboards also have an integrated GPU.  While these are usually fine for normal computer use, if there is to be any type of 3D rendering done at all, a dedicated graphics card is advised.
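To give a feel for what a pipeline does, here's a tiny Python sketch of one of its jobs: projecting a 3D vertex down to a 2D pixel position on the screen.  The focal length and screen size are made-up illustration values, not taken from any real GPU.

```python
def project(x, y, z, focal=1.0, width=640, height=480):
    """Perspective projection: points farther away (bigger z) land
    closer to the center of the screen."""
    sx = (x * focal / z) * (width / 2) + width / 2
    sy = (-y * focal / z) * (height / 2) + height / 2
    return int(sx), int(sy)

# A vertex straight ahead of the camera lands in the middle of the screen.
print(project(0.0, 0.0, 5.0))  # (320, 240)
```

A real pipeline runs this kind of math (plus much more) for millions of vertices per frame, which is why the number of pipelines matters so much for speed.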

The GPU creates the images requested by the CPU and needs some place to store the data until it is displayed on the screen.  This is where the RAM on the graphics card comes in.  The GPU stores the information about each pixel of the image, and where it will be displayed on the screen, in this RAM.  This RAM is very similar to the RAM in your computer in that it is very fast, and data can be written to and read from it at the same time.  Once the image is stored, it is ready to be displayed.
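Conceptually, this pixel storage (often called a frame buffer) is just a grid holding one color per screen position.  Here's a toy Python sketch; the resolution and colors are made-up illustration values, and real video RAM is of course dedicated hardware, not a Python list.

```python
WIDTH, HEIGHT = 640, 480

# One (R, G, B) triple per pixel, row by row -- that's all a frame buffer is.
framebuffer = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]

def write_pixel(x, y, color):
    """The GPU writes rendered pixels into the buffer..."""
    framebuffer[y][x] = color

def read_pixel(x, y):
    """...and the display side reads them back out when it's time to draw."""
    return framebuffer[y][x]

write_pixel(100, 50, (255, 0, 0))   # a red pixel at column 100, row 50
print(read_pixel(100, 50))          # (255, 0, 0)
```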

Most graphics cards today have two outputs: a VGA output for an analog signal and a DVI output for a digital signal.  If you are using a VGA monitor (the big, heavy, old CRT monitors), the data from the RAM is sent to a RAMDAC, which converts the digital data in the graphics card to an analog signal so it can be displayed on the monitor.  Modern monitors/TVs use a digital connection, so the conversion is not needed and a higher quality image is maintained.
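Here's a toy Python sketch of the conversion a RAMDAC performs on each color channel: turning an 8-bit digital value into an analog voltage.  VGA analog video levels run from roughly 0 to 0.7 volts; treat that figure as an illustrative assumption rather than gospel.

```python
def dac(value, v_max=0.7):
    """Convert one 8-bit color value (0-255) to an analog voltage,
    assuming a 0 to 0.7 V VGA-style signal range."""
    return (value / 255) * v_max

print(dac(0))    # 0.0 -> black
print(dac(255))  # 0.7 -> full brightness
```

A DVI output skips this step entirely and ships the digital values as-is, which is why no quality is lost along the way.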

The future of the graphics card and the GPU is a bright one...and that's an understatement.  For decades, gaming has been a main driving force for the entire computer industry, and as such, the GPU has been very important in fueling the gaming industry.  Advances in the GPU's power have allowed newer and more beautiful games to be developed.  As games became more and more detailed, more powerful computers were needed to keep up.  The GPU became a very powerful player in the entire scheme of a computer, but the CPU was always the most important part, as it handled the majority of all processing done by the computer.  This is starting to change.

Since 2008, both ATI and nVidia have been pushing technologies that utilize the GPU for much more than just graphics processing.  ATI with their Fusion technology and nVidia with their CUDA technology are making a push for the GPU to handle more and more of the CPU's workload when the demand for graphics processing is low.

This has actually started a bit of a rift between nVidia as a GPU developer and Intel as a CPU developer, as they move in on each other's turf.  Who knew computers could be so violent?!

This wraps up the Computers 101 session.  I'll have a short recap post in the next few days.  Hopefully you've learned a little bit about computers while reading over this.  Now the blog takes a turn towards some more recent and updated topics.  Stay tuned for all your tech needs and again if you have any questions at all, make sure you ask away!  Until next time, stay uber my friends.
