Monday, December 29, 2014

VGA vs. DVI

You are free to choose either a VGA or a DVI interface to connect a monitor to your computer. However, before you decide, it is worth understanding how the two differ. The most important difference is that VGA is an analog standard while DVI is digital.

In most cases, DVI is the wiser choice for connecting your computer to your monitor. If you use the VGA port, the original digital signals are converted into analog signals before transmission, which can degrade quality: the image and video you see may not be as clear and vivid as the originals. VGA was nevertheless very useful when it was introduced, because all monitors at that time were CRTs, which are analog in nature. Back then it made sense for the graphics card to convert its digital image data to analog, since the CRT could only accept an analog signal anyway.

With the emergence of LCDs, however, the data is digital at both the source and the destination. In this case VGA adds nothing: the signal is converted from digital to analog and then back to digital, and that extra round trip can degrade the image to some extent.

DVI was created to solve this problem. When both the graphics card and the LCD monitor use DVI, the digital data is transmitted without any modification or conversion, so the image stays as clear and vivid as the original. As a result, DVI has gradually become more and more popular among consumers.
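To make the difference concrete, here is a small Python sketch of the two paths. It is only an illustration: each pixel is treated as a number from 0 to 255, and the 0.7 V signal range and the noise level are assumptions chosen for the example, not measurements of real hardware.

import random

def vga_round_trip(pixels, noise=0.8):
    """Simulate the VGA path: each digital pixel value is converted to an
    analog voltage, picks up a little noise on the cable (assumed level),
    and is re-digitized by the LCD, so values can drift slightly."""
    recovered = []
    for value in pixels:
        voltage = value / 255 * 0.7                      # DAC: 0-255 mapped to 0-0.7 V
        voltage += random.gauss(0, noise / 255 * 0.7)    # analog noise on the cable
        redigitized = round(voltage / 0.7 * 255)         # ADC inside the LCD
        recovered.append(max(0, min(255, redigitized)))
    return recovered

def dvi_pass_through(pixels):
    """Simulate the DVI path: the digital values arrive unchanged."""
    return list(pixels)

original = [random.randrange(256) for _ in range(10_000)]
via_vga = vga_round_trip(original)
via_dvi = dvi_pass_through(original)

vga_changed = sum(a != b for a, b in zip(original, via_vga))
dvi_changed = sum(a != b for a, b in zip(original, via_dvi))
print(f"pixels changed by the VGA round trip: {vga_changed} of {len(original)}")
print(f"pixels changed by the DVI path:       {dvi_changed} of {len(original)}")

Running this typically shows that a noticeable share of the pixels come back slightly changed over the simulated VGA round trip, while the simulated DVI path returns every value exactly as it was sent.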


To sum up, DVI is digital while VGA is analog. VGA suits CRT monitors, while DVI is best for LCDs, where VGA can cause quality loss because of the extra conversions.