Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: 1Boner on September 27, 2010, 09:47:21 AM
-
I have a new system coming in a few days.
From what I've read, there aren't any video quality differences between DVI and HDMI.
However, if I plug the HDMI cable into the graphics card's output, will the sound quality be affected?
Does the graphics card get its audio from the computer's THX audio, or does it have its own lower-quality sound on board?
Thanks, Boner :salute
-
Short answer: you need HDMI to play Blu-ray discs on your computer.
-
Short answer: you need HDMI to play Blu-ray discs on your computer.
:headscratch:
The only difference between DVI and HDMI is that HDMI carries audio. The picture quality is the same. But your post is completely incorrect.
-
Both are digital video signals, and for the most part both are somewhat compatible with each other (you just need a converter). But, as Karaya said, the primary difference is that HDMI can also transmit audio signals, while DVI does not.
-
Hmm, sorry. I always thought you needed HDMI for Blu-ray to play, for copyright reasons.
-
Hmm, sorry. I always thought you needed HDMI for Blu-ray to play, for copyright reasons.
The only way to benefit from BD is to have a BD player and a 1080p TV. The connection (even analog) does not matter. HDMI is simply an excellent way to keep an HT "clean" with fewer wires (connections) and eliminate the spaghetti normally associated with it.
-
My esteemed colleague has once again cleared the fog from my brain. :D
-
Where does the sound originate from if I am using an HDMI cord directly to the graphics card?
Does the graphics card have its own sound processor built in? Or is it somehow connected to the computer's onboard sound?
Or would it be better to run DVI from the graphics card and run the sound with simple component cables?
-
You need to connect the GPU's sound input to the motherboard's S/PDIF out; if your GPU has HDMI, you should have been supplied a small cable to do this.
-
You need to connect the GPU's sound input to the motherboard's S/PDIF out; if your GPU has HDMI, you should have been supplied a small cable to do this.
Even if the GPU has an onboard sound chip and he's using HDMI? Wouldn't it bypass anything onboard then?
-
Hmm, sorry. I always thought you needed HDMI for Blu-ray to play, for copyright reasons.
That's HDCP.
The only difference between DVI and HDMI is that HDMI carries audio. The picture quality is the same. But your post is completely incorrect.
That depends solely on the DVI type. DVI-A, for example, carries an analog signal; DVI-D is digital; DVI-I carries both. DVI-A is rare, though...
-
I used HDMI on my TV and the difference was significant, or seemed that way. At least it was going from component to HDMI. Not sure about computers.
-
Where does the sound originate from if I am using an HDMI cord directly to the graphics card?
Does the graphics card have its own sound processor built in? Or is it somehow connected to the computer's onboard sound?
Or would it be better to run DVI from the graphics card and run the sound with simple component cables?
The GPU has a built-in audio device (some have more than one). Usually the high-definition link passes from the motherboard to the GPU, and I think the whole purpose is syncing video and audio so that the display, the display's speakers, and any external speakers are all in sync. Otherwise you would probably experience audio lag between the monitor and the external speakers. So if you want to use just a monitor that has its own speakers, then DVI or HDMI (either one) will do; if you want to use the monitor plus external speakers, then you will need the HD S/PDIF connection.
I have never tried to use the monitor audio because it just doesn't seem realistic that it would be high quality.
-
The GPU has a built-in audio device (some have more than one). Usually the high-definition link passes from the motherboard to the GPU, and I think the whole purpose is syncing video and audio so that the display, the display's speakers, and any external speakers are all in sync. Otherwise you would probably experience audio lag between the monitor and the external speakers. So if you want to use just a monitor that has its own speakers, then DVI or HDMI (either one) will do; if you want to use the monitor plus external speakers, then you will need the HD S/PDIF connection.
I have never tried to use the monitor audio because it just doesn't seem realistic that it would be high quality.
My 5850 doesn't require an additional cable for sound. In fact, it doesn't have a plug for an audio cable like you're mentioning above. Is this something on older cards (older being very relative here...)?
-
I thought BD wouldn't play on connections that don't support HDCP? Does DVI support HDCP (like HDMI obviously does)?
-
My 5850 doesn't require an additional cable for sound. In fact, it doesn't have a plug for an audio cable like you're mentioning above. Is this something on older cards (older being very relative here...)?
No, it's for S/PDIF use (high definition), as on the new GTX 480s.
-
I thought BD wouldn't play on connections that don't support HDCP? Does DVI support HDCP (like HDMI obviously does)?
I don't think the cable itself cares. If your TV/monitor is not HDCP compliant, then you won't see any content. Yes, there is such a thing as DVI/HDCP and HDMI/HDCP, but no VGA/HDCP, since VGA is analog.
-
No, it's for S/PDIF use (high definition), as on the new GTX 480s.
Gotcha on that.
But if he wants to use the GPU's onboard sound through an HDMI cable, he doesn't need to do anything except choose that audio device in Windows. Wasn't that his second question? Or am I missing something unstated here?
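If you want to see which playback devices Windows actually knows about before picking one, a rough sketch like the one below does it. This is just my own illustration (it assumes a Windows box with the wmic tool on the PATH; the device names in the comment are typical examples, not from this thread):

import subprocess

# Ask WMI for the sound devices the OS has enumerated. The GPU's HDMI audio
# device should show up alongside the motherboard sound chip.
out = subprocess.run(["wmic", "sounddev", "get", "caption"],
                     capture_output=True, text=True, check=True)
for line in out.stdout.splitlines():
    line = line.strip()
    if line and line.lower() != "caption":
        # e.g. "ATI High Definition Audio Device", "Realtek High Definition Audio"
        print(line)

Then just set the HDMI audio device as the default playback device in the Windows sound control panel.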
-
Ladies and gentlemen of the AH community.
I am but a simple caveman who after being frozen in ice millions of years ago, was brought back to life by your scientists.
Your technological jargon and abbreviations confuse and frighten me.
I have decided to just go with what you call the "hdmi" and hope that this magical cord will allow me to both watch AND hear the Flintstones.
I am just a simple caveman with simple tastes, and I thank you all for your strange and magical advice!
Boner :salute
(http://img291.imageshack.us/img291/2717/unfrozencavemanlawyer.jpg)
-
haha.
RIP Phil.
-
HDMI's actually an older format that's making the rounds in popularity, but lacking in actual capability. [EDIT: P.S. I'm editorializing on this one, fair warning!] I'm not an expert, but from my own personal experiences trying to get an HDMI cable to show my computer on a 1080p TV more than once, it acts almost as if it's analog, scaling up and down and not displaying pixels nearly as clearly as it should.
IMO, DVI is far, far better quality, if you can pull it off. It depends on the source and the destination and what cables you have. I have heard that DVI cables CAN carry audio (they have the pathways), but that almost none of the cards actually use that part of the interface. [EDIT2: DVI may be better simply because of more widespread compatibility, whereas HDMI interfaces on certain laptops and on certain TVs do not support all resolutions -- it's less standardized]
Having had several bad experiences with a VERY expensive interface, I won't do that again. In the future I'll look for better ways of connecting to HD TVs.
P.S. Maybe the reason your sound went funny was that the onboard sound was being pumped through the HDMI cable rather than your sound card's sound? As mentioned above, you need the 2-wire cable plugged into your vid card from your sound card to get HDMI audio out. I know an HD 3650 comes with its own onboard sound card (how funky is that??) and tries to use it in HDMI mode instead of the default.
-
HDMI's actually an older format that's making the rounds in popularity, but lacking in actual capability. [EDIT: P.S. I'm editorializing on this one, fair warning!] I'm not an expert, but from my own personal experiences trying to get an HDMI cable to show my computer on a 1080p TV more than once, it acts almost as if it's analog, scaling up and down and not displaying pixels nearly as clearly as it should.
Krusty, that could be a function of your television, or of your video card's output in conjunction with the television. A true 1080p signal over HDMI 1.2 or higher should in theory be just as clean and crisp as DVI. However, I have seen cases where a TV won't recognize HDMI as a PC signal and will attempt to use TV-style filtering on it, including not using the full color space available (a normal TV/DVD signal only broadcasts 16-235 per color, whereas PC is 0-255, I believe) and attempting to internally scale to what the TV thinks is the appropriate resolution. Perhaps that is what you are seeing.
If the TV and graphics card are both doing their jobs, you should see no difference between HDMI and DVI, except at distance, which higher-grade HDMI cables handle much better.
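For what it's worth, the range difference is simple arithmetic. This is just my own sketch of the mapping (not something pulled from the card or TV drivers), showing how a full-range PC value gets squeezed into the limited "video" range:

def full_to_limited(value):
    # Map full-range PC levels (0-255) onto limited video levels (16-235),
    # i.e. 256 input steps onto 220 output levels.
    return round(16 + value * 219 / 255)

print(full_to_limited(0))    # 16  -> video "black"
print(full_to_limited(255))  # 235 -> video "white"

If either end of the chain applies that mapping when it shouldn't (or skips it when it should), you get the washed-out or crushed picture people complain about.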
-
HDMI's actually an older format that's making the rounds in popularity, but lacking in actual capability. [EDIT: P.S. I'm editorializing on this one, fair warning!] I'm not an expert, but from my own personal experiences trying to get an HDMI cable to show my computer on a 1080p TV more than once, it acts almost as if it's analog, scaling up and down and not displaying pixels nearly as clearly as it should.
Because you are not using the "PC input", or your TV does not have a PC input to begin with. In most cases today that input is DVI.
Krusty, that could be a function of your television, or of your video card's output in conjunction with the television. A true 1080p signal over HDMI 1.2 or higher should in theory be just as clean and crisp as DVI. However, I have seen cases where a TV won't recognize HDMI as a PC signal and will attempt to use TV-style filtering on it, including not using the full color space available (a normal TV/DVD signal only broadcasts 16-235 per color, whereas PC is 0-255, I believe) and attempting to internally scale to what the TV thinks is the appropriate resolution. Perhaps that is what you are seeing.
If the TV and graphics card are both doing their jobs, you should see no difference between HDMI and DVI, except at distance, which higher-grade HDMI cables handle much better.
You saved me a ton of typing.
-
Knite, I believe you are correct, but in researching online what the problem is, it appears that MANY different computers and MANY different TVs have many random irregularities and incompatibilities....
Even a perfectly normal PC can still have issues with an otherwise perfectly normal TV with HDMI inputs.
There are many people on the internet searching, asking, and looking for help on the same problem I had, and the main answer is "Sorry, it's just not working because of X, or because of Y" -- whereas you rarely hear that about DVI inputs.
I'm up in the air, I guess. I wish they'd come up with better interfaces, what with higher and higher resolutions, Eyefinity taking off, etc.
-
There is a better interface for computer displays. It is called "DisplayPort". Much higher bandwidth than HDMI/DVI. It supports multiple devices on one bus.
One thing to note: DVI is an acronym covering several different interfaces. The computer version of DVI is not the same as the television DVI connection. The pin-outs are the same, but the signaling is not.
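To put rough numbers on "much higher bandwidth", here is a back-of-the-envelope sketch. The figures are my own approximate effective data rates, not something quoted in this thread:

# What 1920x1080 at 60 Hz, 24 bits per pixel needs, using the common
# 148.5 MHz pixel clock (which already includes blanking intervals).
needed_gbps = 148.5e6 * 24 / 1e9          # roughly 3.6 Gbit/s

links = {
    "DVI single-link (165 MHz)": 165e6 * 24 / 1e9,   # ~3.96 Gbit/s
    "HDMI 1.2 (165 MHz)":        165e6 * 24 / 1e9,   # ~3.96 Gbit/s
    "DisplayPort 1.1 (4 lanes)": 4 * 2.16,           # ~8.64 Gbit/s after 8b/10b overhead
}

print(f"1080p60 needs about {needed_gbps:.1f} Gbit/s")
for name, rate in links.items():
    print(f"{name}: ~{rate:.2f} Gbit/s")

So a single 1080p60 monitor already sits close to the ceiling of single-link DVI or older HDMI, while DisplayPort has room to spare for higher resolutions or more devices on one connection.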
-
I found an old 55-inch rear-projection TV on the street. It has a DVI input; I got it running 1080i and it looks almost as good as my roommate's 42-inch 1080p TV.
-
There is a better interface for computer displays. It is called "DisplayPort". Much higher bandwidth than HDMI/DVI. It supports multiple devices on one bus.
One thing to note: DVI is an acronym covering several different interfaces. The computer version of DVI is not the same as the television DVI connection. The pin-outs are the same, but the signaling is not.
What kind of cord do you use to hook it up?
Is it commonly found on most computers and displays?
-
I found an old 55-inch rear-projection TV on the street. It has a DVI input; I got it running 1080i and it looks almost as good as my roommate's 42-inch 1080p TV.
It depends on the HD source. I've seen some "High Def Cable" signals that are atrocious.
-
We will probably see DisplayPort win out on the PC side and HDMI take the television side, but it's a little early to call. Back in the day I would have thought Betamax would win over VHS, but that didn't happen.
-
Well, the latest Eyefinity cards have Mini DisplayPorts on them, but you have to buy a gawd-awfully expensive adapter to convert that to DVI to actually plug a monitor in (something like $60 per adapter?!!).
Do any mainstream (i.e. "not costing 2,000 dollars") cards even have DisplayPort connectors yet?