Computer only detecting one HDMI display out of a monitor and HDTV.

publicenemy219

Distinguished
Nov 27, 2008
Hello,

I recently bought an HDMI splitter (http://www.newegg.com/Product/Product.aspx?Item=N82E16812270376) in order to use both my HDTV and my monitor. Both are connected via HDMI, but only the monitor shows up in the Nvidia control panel. I want to extend these displays, but regardless of what settings I choose, all it does is clone them. I've tried going into the Windows 7 resolution settings to see if I could detect the second display there, but to no avail. I have a GTX 570 with a single Mini HDMI port, which I'm using to feed the HDMI splitter into the monitor and the TV. Is it possible to get both displays detected using this method? I want to use HDMI on both displays since my monitor is 3D capable and the TV doesn't take DVI.

Any information would be greatly appreciated.

I'd also like to note that the monitor is 120Hz, but the Nvidia control panel is only allowing 60Hz.
 

jcoultas98

Distinguished
Feb 19, 2009
You can't use a splitter to do that; a splitter just duplicates a single output, so the card only handshakes with one display and both screens show the same cloned picture. You need a dedicated output per monitor. Also, TVs don't actually have a 120Hz input. The only exception is when the TV is running in 3D, in which case it's receiving 60Hz per eye. The 120Hz your TV advertises is interpolated (fake).
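If you want to confirm that the splitter really only presents one display to the PC, here's a rough Python sketch (just an illustration, assuming a Windows machine) that lists the display outputs Windows enumerates using the Win32 EnumDisplayDevices call; behind a splitter you should only see one entry attached to the desktop:

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

# Mirrors the Win32 DISPLAY_DEVICEW structure used by EnumDisplayDevicesW.
class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

ATTACHED_TO_DESKTOP = 0x00000001  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP

i = 0
while True:
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)
    # Enumerate display adapters/outputs until the call reports no more.
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    attached = "attached" if dev.StateFlags & ATTACHED_TO_DESKTOP else "not attached"
    print(f"{dev.DeviceName}: {dev.DeviceString} ({attached})")
    i += 1
```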
 

publicenemy219

Distinguished
Nov 27, 2008
I wanted to be able to use the HDMI ports for both the monitor and the TV. Is there another alternative I could use that won't degrade the picture? Are there specialized DVI cables that support 3D? I'm not too informed on the matter, and I greatly appreciate you clearing that up for me.
 

jcoultas98

Distinguished
Feb 19, 2009
On newer video cards, the DVI connector actually outputs HDMI signals, so you simply need a DVI to HDMI cable. Your solution is easy: connect your TV via Mini HDMI, and connect your monitor via a DVI to HDMI cable. This passes audio too, but only one device will act as the audio device; HDMI will only hand the full signal, audio included, to one connected display at a time.

If your displays are 3D, you need an HDMI 1.3a or faster (High Speed) cable. You'll also need the Nvidia 3D Vision software. Check Nvidia.com for a list of supported displays.
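To put rough numbers on why the cable and link type matter (my own back-of-the-envelope arithmetic, not Nvidia's figures): 1080p at 120Hz needs roughly twice the pixel clock of 1080p at 60Hz, which is why single-link DVI can't do it while dual-link DVI and High Speed HDMI can. A quick Python sketch of that arithmetic, using an approximate 25% blanking overhead:

```python
# Back-of-the-envelope pixel-clock arithmetic for 1080p.
# The 25% blanking overhead is an approximation; real timings vary.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    return width * height * refresh_hz * blanking / 1e6

LINK_LIMITS_MHZ = {
    "single-link DVI": 165,
    "dual-link DVI": 330,
    "HDMI 1.3/1.4 (High Speed)": 340,
}

for hz in (60, 120):
    needed = approx_pixel_clock_mhz(1920, 1080, hz)
    print(f"1080p @ {hz} Hz needs ~{needed:.0f} MHz pixel clock")
    for link, limit in LINK_LIMITS_MHZ.items():
        verdict = "ok" if needed <= limit else "too slow"
        print(f"  {link}: {limit} MHz -> {verdict}")
```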
 

publicenemy219

Distinguished
Nov 27, 2008
http://www.amazon.com/SANOXY-Plated-Female-DVI-D-Adaptor/dp/B0035B4LJM/ref=sr_1_43?s=electronics&ie=UTF8&qid=1345697660&sr=1-43&keywords=DVI+to+HDMI+connector+1.4

http://www.amazon.com/Mediabridge-Ultra-Ethernet-Certified-Available/dp/B0019EHU8G/ref=sr_1_2?s=electronics&ie=UTF8&qid=1345700242&sr=1-2&keywords=hdmi+1.4

Does it matter what kind of connector I buy? I could just buy one on Amazon for $1, no biggie. Or does it have to be 1.4 compliant, if that even exists?

EDIT: I get the option for a 120Hz refresh rate with the dual-link DVI connection, and only 60Hz with the HDMI hookup. Since I purchased the Nvidia software, I can enable 3D on the monitor, but I can't view it correctly (very distorted), which I believe is because it's over DVI. It doesn't even give me the option to adjust the stereoscopic settings when the HDMI hookup is connected, so I'm going to assume it's an older HDMI cord.
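For reference, one quick way to double-check what refresh rate Windows thinks the primary display is actually running at (independent of the Nvidia panel) is to ask the OS directly. This is just a rough sketch using the Win32 GetDeviceCaps call, and it only reports the primary display:

```python
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

# Be explicit about handle types so this also works on 64-bit Python.
user32.GetDC.restype = ctypes.c_void_p
user32.ReleaseDC.argtypes = (ctypes.c_void_p, ctypes.c_void_p)
gdi32.GetDeviceCaps.argtypes = (ctypes.c_void_p, ctypes.c_int)

VREFRESH = 116  # GetDeviceCaps index for the current vertical refresh rate (Hz)

hdc = user32.GetDC(None)  # device context for the primary display
try:
    refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)
finally:
    user32.ReleaseDC(None, hdc)

print(f"Primary display is currently running at {refresh} Hz")
```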