A video card, also called a graphics card (among other names), is responsible for processing the data that comes from the main processor (the CPU) and converting it into information that can be represented on devices such as monitors and televisions. It is worth mentioning that this component can have a great variety of architectures, although they are all commonly referred to by the same name, even when talking about a video chip integrated into a motherboard; in that case, it is more correct to speak of the GPU (Graphics Processing Unit).

Since their inception, graphics cards have included various features and functions, such as the ability to tune in to television broadcasts or to capture video sequences from an external device. It is important to note that this is not a component found exclusively in today's computers: graphics cards have existed for over four decades and are now also an indispensable part of video game consoles, both handheld and home systems.

Their creation dates back to the end of the 1960s, when printers were abandoned as the means of visualizing computer activity and monitors came into use. At first, resolutions were negligible compared to the high definition we all know today. It was thanks to Motorola's research and development work that these chips became more sophisticated, and its products helped standardize the name "video card".

As personal computers and the first video game consoles became popular, manufacturers opted to integrate graphics chips into motherboards, since this considerably reduced manufacturing costs. At first glance, this presents a clear disadvantage: the inability to upgrade the equipment. However, these were closed systems, built with each and every one of their components in mind, so that the final product was consistent and offered the highest possible performance.

It should be noted that this is still the case with consoles today, and it is thanks to this type of unalterable design that, after a few years, developers obtain results far superior to their first experiments. This is not possible on a PC, however powerful, since a software company cannot account for every possible combination of hardware its customers may own. In addition, the architecture of a computer has weak points precisely because its parts are interchangeable, the most notable being the distance between the memory, the graphics card and the main processor.

In the early 1980s, IBM drew on the design of the unforgettable Apple II and popularized the interchangeable video card, although in its case it only offered the ability to display characters on screen. It was an adapter with a modest 4 KB of memory (today's cards can have 2 GB, more than 500,000 times as much) and it was used with a monochrome monitor. This was the starting point, and improvements were not long in coming.
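To see how a text-only adapter could get by with so little memory, here is a rough back-of-the-envelope sketch. The 80 x 25 character grid and the two bytes per cell (one for the character, one for its attributes) are typical of text modes of that era and are used here purely as illustrative assumptions:

```python
# Rough estimate of the memory a character-only (text-mode) adapter needs.
COLUMNS, ROWS = 80, 25      # assumed text grid of the era
BYTES_PER_CELL = 2          # assumed: one byte for the character, one for attributes

text_buffer = COLUMNS * ROWS * BYTES_PER_CELL
print(f"Text buffer: {text_buffer} bytes (~{text_buffer / 1024:.1f} KB)")
# -> 4000 bytes (~3.9 KB), which fits in the adapter's 4 KB

modern_card = 2 * 1024**3   # the 2 GB figure mentioned in the text
print(f"2 GB is roughly {modern_card // (4 * 1024):,} times 4 KB")
# -> roughly 524,288 times
```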

Some time later, IBM standardized the term VGA, which refers both to a video card technology capable of offering a resolution of 640 pixels wide by 480 high and to the monitors that could display those images and the connector needed to use them. After the work of several companies dedicated exclusively to graphics, Super VGA (also known as SVGA) appeared, increasing the available resolution (to 1024 x 768) as well as the number of colors that could be displayed simultaneously (from 16 colors at 640 x 480 to 256 colors at 1024 x 768).
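A quick calculation shows why higher resolutions and more simultaneous colors demand more video memory. The bit depths below (4 bits per pixel for 16 colors, 8 bits per pixel for 256 colors) follow directly from the number of colors and are assumed here only for illustration:

```python
import math

def framebuffer_bytes(width, height, colors):
    """Memory needed for one uncompressed frame at the given palette size."""
    bits_per_pixel = math.ceil(math.log2(colors))
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(640, 480, 16))    # VGA:  153600 bytes (~150 KB)
print(framebuffer_bytes(1024, 768, 256))  # SVGA: 786432 bytes (768 KB)
```

Each step up in resolution or color depth multiplies the memory required, which is why video cards grew from a few kilobytes to the gigabytes mentioned above.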
