Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: Captain Virgil Hilts on March 29, 2007, 08:32:30 PM
-
My new video card, a PNY Verto 7600GS, has a DVI-I connector, and the IBM ThinkVision 22" monitor I picked up used also has a DVI-I connector. They both also have VGA-type connectors. Is the DVI-I worth it? It's supposed to be faster and higher definition. Does anyone have DVI-I set up and see a difference?
-
From what I've read, if you already have two DVI-capable devices then DVI is the way to go. DVI is digital and VGA is analog; that's how the paragraph below compares them. Here is an answer to the same question taken off Experts Exchange:
Watzman:
I have to disagree with some of the comments by Simkiss and also (and this is a rarity) Callandor. I was a product manager and engineer for displays (both CRT and LCD) for 7 years, but I've been involved with displays and video since 1965 (40 years), as in addition to my computer work, I also did television broadcast engineering even as a teenager (I have an FCC license).

The difference between analog and DVI can be very pronounced even at 1024x768, and even with a cable length of 5 feet or less. Analog quality is critically dependent on the quality of the cable, and also on the adjustment of the dot clock. Conversely, it is very difficult to get quality degradation on a DVI interface that is continuing to work. You do not get the same type of "pixelization" with loss of quality in a DVI cable that you get with dropouts in an MPEG data stream (DVD or satellite transmission). MPEG is compressed, and uses a data stream with "key fields", which are complete, and between them only the changes are transmitted; that is nothing at all like DVI, which is an uncompressed, pure digital transmission of a sequential list of pixels.

Now as to whether or not you will notice what you are missing, I can't say... that will depend on your particular hardware, and it also depends on how critical a viewer you are. But the difference may well end up being very, very substantial. Even at 1024x768, and even with a cable length of only 5 feet or less.
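If you want a feel for how much raw data an uncompressed DVI pixel stream actually carries (versus a compressed MPEG stream), here's a quick back-of-the-envelope Python sketch. The 60 Hz refresh, 24-bit color, and roughly 25% blanking overhead are my own assumed numbers for illustration, not anything from Watzman's post:

# Back-of-the-envelope raw data rate for an uncompressed DVI pixel stream.
# Assumptions (not from the post): 60 Hz refresh, 24-bit color, and a
# rough 25% blanking overhead on top of the visible pixels.

def dvi_raw_gbit_per_sec(width, height, refresh_hz=60, bits_per_pixel=24,
                         blanking=1.25):
    """Approximate uncompressed pixel data rate in Gbit/s."""
    return width * height * refresh_hz * blanking * bits_per_pixel / 1e9

for w, h in [(1024, 768), (1280, 1024), (1920, 1080)]:
    print(f"{w}x{h} @ 60 Hz: ~{dvi_raw_gbit_per_sec(w, h):.2f} Gbit/s uncompressed")

Even 1024x768 works out to well over a gigabit per second of pixel data, which gives a sense of why DVI sends it uncompressed over dedicated digital pairs rather than compressing it the way MPEG does.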
-
Great explanation
I have a Hyundai LCD 19 inch L90D+
1280x1024, 8 ms response time, .294 mm pixel pitch
DVI monitor to DVI video card
I don't know what it would look like with VGA, as I've always used DVI. I can tell you this setup is crystal clear without any problems, and from what I have read in reviews, if you have DVI capability, always use it. That is, if you can afford the cable.
Is it worth it to me......................Yep
-
I have a few dual-monitor setups on computers at work, and I can tell you that the DVI-connected monitors always look better, color-wise, than the VGA-connected monitors. Each computer has a DVI and a VGA output on the video card. All the monitors are the same model, with connections for both types. When you see them sitting side by side, you'll notice the difference.
-
A high-def link needs dual DVI in order to work. DVI has a limited bandwidth capability.
-
That is if you can afford the cable.
I got a DVI cable from Newegg for $15.00 with shipping. RadioShack wanted $72.
NOT
-
Originally posted by MrRiplEy[H]
A high-def link needs dual DVI in order to work. DVI has a limited bandwidth capability.
:huh
-
WHAT ARE SINGLE AND DUAL LINKS?
The digital formats are available in DVI-D Single-Link and Dual-Link as well as DVI-I Single-Link and Dual-Link format connectors. These DVI cables send information using a digital format called TMDS (transition-minimized differential signaling). Single-link cables use one 165 MHz TMDS transmitter, while dual-link cables use two. The extra dual-link DVI pins effectively double the transmission capacity and provide an increase in speed and signal quality; i.e. a single-link DVI 60 Hz LCD can display a resolution of 1920 x 1080, while a dual-link DVI can display a resolution of 2048 x 1536.
http://www.datapro.net/techinfo/dvi_info.html
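To sanity-check the single- vs dual-link limits quoted above, here's a rough Python sketch. The 165 MHz per-link pixel clock comes from the quote; the 60 Hz refresh and the roughly 25% blanking overhead are my own assumptions for illustration, and real monitors (especially ones using reduced blanking) will come out a bit different:

# Rough sketch of the single- vs dual-link DVI limit described above.
# The 165 MHz per-link pixel clock is from the quoted page; the 60 Hz
# refresh and ~25% blanking overhead are assumptions for illustration.

SINGLE_LINK_MHZ = 165.0               # one TMDS transmitter
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ   # two TMDS transmitters

def required_pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
    """Approximate pixel clock (MHz) a given mode needs."""
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1280, 1024), (1920, 1080), (2048, 1536)]:
    clock = required_pixel_clock_mhz(w, h)
    if clock <= SINGLE_LINK_MHZ:
        link = "fits single link"
    elif clock <= DUAL_LINK_MHZ:
        link = "needs dual link"
    else:
        link = "beyond dual link"
    print(f"{w}x{h} @ 60 Hz: ~{clock:.0f} MHz -> {link}")

With these numbers, 1920 x 1080 comes in under 165 MHz, so 1080p fits within a single link; it's the higher resolutions like 2048 x 1536 that need dual link.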
-
I'm familiar with DVI technology. I just never heard anyone say you need two links for high-definition transmissions.
1080p = 1920x1080
Last time I checked, this was the highest HD resolution. The term "high definition" implies HDTV standard resolutions. If you meant higher resolution, then okay.
I know you know about DVI so any help with this confusion I have would be great.
-
Originally posted by Kermit de frog
I'm familiar with DVI technology. I just never heard anyone say you need two links for high-definition transmissions.
1080p = 1920x1080
Last time I checked, this was the highest HD resolution. The term "high definition" implies HDTV standard resolutions. If you meant higher resolution, then okay.
I know you know about DVI so any help with this confusion I have would be great.
Yep, I should have said highest-resolution monitors, not high-def.