Solved! Using HDTV for both TV and Monitor - Detecting the PC automatically

hostricity

I had a Dynex 1080p 19" TV which I used both as a TV and as a computer monitor. It worked perfectly, right out of the box. Then I upgraded to a larger Dynex and ran into the problem where the PC at 1080p doesn't fill the screen. It turned out that TV was defective. (The computer controls on the TV wouldn't activate when connected to a PC, and there was no way to activate them manually.)

I returned it and bought a Sharp LED LCD. My PC didn't fill the screen on that one, either. That's when I discovered the overscan setting for my Radeon HD 5450 graphics card. Making that adjustment in the Radeon control panel filled the screen beautifully.

(It turns out the Dynex actually was defective, but that had nothing to do with filling the screen or the overscan setting on the PC graphics card. It's odd that I never had this issue with my old 1080p Dynex, but did with both the new Dynex and the Sharp... Maybe someone can explain why I didn't need to set the overscan on my old 1080p TV, but did on both of the new ones!)
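For anyone else who hits the not-filling-the-screen part: as far as I can tell, the Radeon driver defaults its HDMI output to a bit of underscan to compensate for TVs that overscan, and that's what the slider adjusts. The exact default seems to vary by driver version; the little sketch below just runs the arithmetic with an assumed 10% figure to show why the desktop ends up with black borders and soft text until it's set to 0%.

```python
# Illustration only: why an underscanned HDMI output leaves black borders and
# soft text on a 1080p panel. The 10% figure is an assumption; the real default
# varies by driver and is what the overscan/underscan slider in the Radeon
# control panel adjusts.
PANEL_W, PANEL_H = 1920, 1080
underscan = 0.10  # assumed driver default, for illustration only

visible_w = int(round(PANEL_W * (1 - underscan)))
visible_h = int(round(PANEL_H * (1 - underscan)))
border_x = (PANEL_W - visible_w) // 2
border_y = (PANEL_H - visible_h) // 2

print("Desktop drawn into {}x{} of the {}x{} panel".format(
    visible_w, visible_h, PANEL_W, PANEL_H))
print("Black borders: about {} px left/right and {} px top/bottom".format(
    border_x, border_y))
print("Text is rescaled instead of mapped 1:1, so it stays soft until the")
print("slider is set to 0%.")
```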

When I switch between the Blu-ray player and the PC on two separate HDMI inputs, the TV doesn't detect that it's connected to a PC, so it doesn't switch from video to graphics mode, which is required for it to look good as a computer monitor.

Since the TV doesn't recognize the PC and auto-switch to graphics mode, when I switch from the Blu-ray to the PC I have to dig down into the menu and switch the HDMI mode to graphics manually, and then back again when I go to the Blu-ray. It's a major pain if I'm switching back and forth.

So, I have 2 questions:

1. Do any of the HDTVs out there let you set the various controls for each input individually and have them remembered, so I can have HDMI 1 set to video and HDMI 2 set to graphics? I've searched, and either the feature doesn't exist, I don't know what it's called, or no one bothers to mention it in their specs.

2. Would having a monitor-specific driver instead of the generic Windows 7 monitor driver allow the TV to detect the PC and switch to graphics mode when the HDMI mode is set to Auto? Right now, Auto just stays in video mode all the time.

I'm using Windows 7 64-bit Home Premium with the Radeon HD 5450 on a Sharp LC-26sv490u TV.

3. Any other tricks to get the HDMI Auto mode on this TV to actually detect the PC and switch to graphics mode?

This TV is only a temporary measure. I plan to get a different model soon, so any sage technical or shopping advice would be greatly appreciated.

Geoff
 

hostricity



The TV has three HDMI settings: Auto, Graphics, and Video

When I set the TV to Graphics, it works beautifully as a monitor.

When I set the TV to Auto, it uses the Video mode instead of switching to the Graphics mode.

The problem isn't brightness, contrast, etc. In the Video mode, I get halos around the fonts. When I set the TV to Graphics, everything is perfect on the monitor.

I haven't found a sharpness setting in Catalyst Control Center for my HD 5450.
 

hostricity



Under desktop mode > desktop properties, it lists Basic and HDTV desktop area settings, which do change the resolution and refresh rate.

The Basic settings only go up to 1680x1050, so I am using the HDTV setting for 1920x1080.

I think you have a valid point. What seems to be happening is that the card is presenting the PC as an HDTV instead of as a PC, but I see no way to get the card to tell the TV that it is a PC and not an HD video source (like a Blu-ray player).
 
OK. Looking at the manual (found here), it shows the PC connection via HDMI goes to HDMI 1, on the side of the HDTV (see page 22). Is that the HDMI port you're using?

Also, the manual is a bit confusing as to what resolution options are available (see page 47). It tells me that the PC connection only goes up to 1366x768, but I don't know if that's just for the 15-pin D-Sub connection.

If you can connect a VGA cable between the PC and the HDTV, try that and see what resolutions are available in Catalyst Control Center (CCC). If you have a DVI-to-HDMI cable, try that as well.

If your PC isn't connected to the side HDMI port, give that a go and see if other resolutions become available to you.

Besides that, I'm pretty much at a loss.

-Wolf sends
 
Solution

hostricity



Thanks for these suggestions.

I agree the manual is confusing.

I saw the same things in the manual and tried HDMI input 1; I also tried a D-Sub connection. The resolutions you mention only apply to the D-Sub connection. Using HDMI 1 makes no difference.

There are "basic" resolutions and "HDTV" resolutions. The highest basic resolution is 1680x1050 on the HDMI connection.

But, the TV is supposed to detect that it is connected to a PC via HDMI and automatically switch from Video to Graphics mode.

To me, that means I need a monitor driver or I need to do something so that the video card tells the TV to switch to Graphics mode. I'm going to check with ATI and see if there's a way to write a configuration file of some kind that will tell the TV to switch to Graphics mode when HDMI mode is set to Auto.

 
The Basic settings only go up to 1680x1050, so I am using the HDTV setting for 1920x1080.

Here's the thing. Since your graphics card is set to HDTV 1920x1080, it seems logical that it's sending out an HDTV signal (and not a PC signal), which could be why your HDTV isn't automatically switching to graphics mode.

The HD 5450 should support PC 1920x1080. If it only goes up to 1680x1050, that tells me there is an issue with the driver.
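One quick way to check what the driver is actually exposing is to list every mode Windows knows about for the display. This is just a rough sketch (assuming Python is installed) that calls the standard EnumDisplaySettings Windows API; CCC should show the same list, this is just independent of it.

```python
# List every display mode the graphics driver reports to Windows.
# If no 1920x1080 entry shows up here, the driver/EDID combination is
# the problem rather than the TV's menu settings.
import ctypes
from ctypes import wintypes

class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODE()
mode.dmSize = ctypes.sizeof(DEVMODE)

modes = set()
i = 0
# Enumerate modes for the primary display until the API reports no more.
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    modes.add((mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print("{}x{} @ {} Hz".format(w, h, hz))
```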

Just thinking out loud here.... err... typing out loud... oh whatever!

-Wolf sends
 

hostricity



Yes.

I'm investigating the video card driver, the possibility of a monitor driver, and a way to configure or write a profile that would do this.

I'm also investigating what "graphics" mode is. According to the TV's manual, graphics mode causes a "full scan". I don't know if that means in contrast to interlaced, or something else...
 

hostricity



Yes, I agree. Although I saw somewhere that there are descriptor files for HDTVs and monitors. If you know anything about those, I'm interested. I'm guessing it's something like an XML file that contains configuration information about the TV.
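Following up on my own guess: from what I've read since, those descriptors are most likely EDID data rather than XML. The display itself sends the EDID block over the cable, Windows caches it in the registry, and a "monitor driver" INF is mostly just a way to override it. Below is a rough sketch (my own, not anything from ATI or Sharp) that dumps the cached EDID for attached displays and prints the monitor name it contains, just to see what the TV is actually reporting to the PC.

```python
# Rough sketch: dump the EDID blocks Windows has cached for attached displays
# and print the monitor name each one reports. Assumes Windows and the usual
# registry layout under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY.
import winreg

def monitor_name(edid):
    # The base EDID block holds four 18-byte descriptors starting at byte 54.
    # A descriptor starting 00 00 00 FC carries the monitor name as text.
    for offset in (54, 72, 90, 108):
        block = edid[offset:offset + 18]
        if len(block) == 18 and block[0:4] == b"\x00\x00\x00\xfc":
            return block[5:18].split(b"\n")[0].decode("ascii", "replace").strip()
    return "(no name descriptor)"

root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                      r"SYSTEM\CurrentControlSet\Enum\DISPLAY")
for i in range(winreg.QueryInfoKey(root)[0]):            # each monitor model
    model = winreg.EnumKey(root, i)
    model_key = winreg.OpenKey(root, model)
    for j in range(winreg.QueryInfoKey(model_key)[0]):   # each instance seen
        instance = winreg.EnumKey(model_key, j)
        try:
            params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
            edid, _ = winreg.QueryValueEx(params, "EDID")
            print("{} {} -> {}".format(model, instance, monitor_name(edid)))
        except OSError:
            pass  # some entries have no cached EDID
```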