From what I've read, if you already have two DVI-capable devices then DVI is the way to go: DVI is digital and VGA is analog, which is how the paragraph below compares them. Here is an answer to the same question taken from Experts Exchange:
Watzman:
I have to disagree with some of the comments by Simkiss and also (and this is a rarity) Callandor. I was a product manager and engineer for displays (both CRT and LCD) for 7 years, but I've been involved with displays and video since 1965 (40 years); in addition to my computer work, I also did television broadcast engineering even as a teenager (I have an FCC license).

The difference between analog and DVI can be very pronounced even at 1024x768, and even with a cable length of 5 feet or less. Analog quality is critically dependent on the quality of the cable, and also on the adjustment of the dot clock. Conversely, it is very difficult to get quality degradation on a DVI interface that is still working at all.

You do not get the same type of "pixelization" and loss of quality on a DVI cable that you get with dropouts in an MPEG data stream (DVD or satellite transmission). MPEG is compressed: its data stream uses "key frames," which are complete, and between them only the changes are transmitted. That is nothing at all like DVI, which is an uncompressed, pure digital transmission of a sequential list of pixels.

Now as to whether or not you will notice what you are missing, I can't say; that will depend on your particular hardware, and on how critical a viewer you are. But the difference may well end up being very, very substantial. Even at 1024x768, and even with a cable length of only 5 feet or less.
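To put rough numbers on why an uncompressed pixel stream behaves so differently from MPEG, here is a quick back-of-the-envelope sketch in Python. The DVD bitrate cap and TMDS encoding overhead are approximations from general spec knowledge, not figures from the quoted answer:

    # Rough comparison: uncompressed DVI pixel stream vs. a typical
    # compressed MPEG-2 stream (e.g., DVD-Video). Figures are illustrative.

    width, height = 1024, 768   # the resolution discussed above
    refresh_hz = 60             # a typical refresh rate
    bits_per_pixel = 24         # 8 bits per color channel

    # DVI sends every pixel of every frame, uncompressed.
    dvi_bps = width * height * refresh_hz * bits_per_pixel
    print(f"Uncompressed DVI payload: {dvi_bps / 1e6:.0f} Mbit/s")
    # -> roughly 1132 Mbit/s (the actual TMDS link rate is higher still,
    #    due to blanking intervals and the 10-bit-per-8-bit TMDS encoding)

    # DVD-Video MPEG-2 tops out around 9.8 Mbit/s.
    mpeg_bps = 9.8e6
    print(f"Ratio vs. DVD MPEG-2: ~{dvi_bps / mpeg_bps:.0f}:1")

That roughly 100:1 gap is only possible because MPEG sends complete key frames and then just the changes between them, which is why a dropout can corrupt whole regions of the picture until the next key frame arrives, while an error on an uncompressed DVI link affects only individual pixels.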