Connecting an HDTV to a computer

elchico08

Distinguished
Jan 20, 2006
2
0
18,510
In order to connect an HDTV to a computer, what will I need? Will I just need a graphics card that can display 1080 (1920x1080) or 720 (1280x720) resolution and has a DVI output? Or is it not that simple?
 

Dafaad

Distinguished
Jan 18, 2006
5
0
18,510
It should be simpler than it is, but it isn't; you are right. The first time I tried to connect to a regular TV (non-HDTV), I couldn't see anything, because the monitor refresh rate is much higher than the refresh rate of a TV. After fiddling with it for a good part of a day, I was finally able to sort of see very poor video. Text was impossible to read; it was just a real disaster.

Two weeks ago, I again tried to connect my computer to an HDTV, figuring that with the hardware updates and the better TV we would surely be able to make it work. I use an SLI computer and connected the HDTV out connector from the card to the RGB in on the HDTV. This should have worked, but no... the screen was a bleached-out green mess.

I noticed that when I started the computer (I rebooted many times trying to get it to work), the VGA part of Windows (no Nvidia drivers loaded) would give me the nice colorful look you'd expect on a desktop. However, once the Nvidia drivers were loaded, the screen was crap. If I remove the drivers, the screen looks as it should, except very grainy and pixelated. Loading the drivers again makes the screen a bleached-out green. So the drivers are **** and the default VGA driver from Windows makes the screen as ugly as it was years ago.

I say this in hopes someone will tell me where I went wrong, and possibly to get a discussion going on the correct way to make $600 to $1,200 cards work with a $3,000 to $10,000 HDTV. Why the heck can't we make this work, folks?!

Help us please!
 

quicsilver

Distinguished
Mar 1, 2006
2
0
18,510
I too have had the same issue with my HDTV. But one thing we need in order to figure this out is your exact configuration: TV make, model, and screen size; then your video card's make, model, and specs; and which connections you use between the video card and the TV. Here's what I have.

TV-Panasonic 53" 1080i HDTV Tube TV, only a year old.

Video Card-eVGA Nvidia 6600gt AGP with optional component (B+G+R) output.

The first thing I tried was a DVI to HDMI cable, but the output was too large for the screen and really fuzzy. I tried playing around with the monitor settings in Display Properties and couldn't get anything to work. The weird thing was that it showed up as a Panasonic TV in the monitor settings. There may have been one or two settings I missed, but I think I tried pretty much every screen resolution, refresh rate, and setting. It was still too big for the screen area, and too fuzzy.

Then I tried the optional HDTV component hookup. Display Properties showed an option for an HDTV monitor and gave me HD screen resolutions, and the picture was better. The highest setting, which I believe was the 1080i resolution, cut off the Start menu and taskbar, but the second-highest almost filled the whole screen. When watching movies, though, blacks were splotchy and still fuzzy, but reasonable. When browsing the internet, the whites would wash out the black letters and I couldn't read anything. This was the last thing I tried before I gave up.

Now I know it's possible with an HDTV this size; I've seen it before at another guy's house. The only difference I knew of right away was that his was a DLP, but I didn't ask how everything else was set up or what hardware he was using. His screen looked perfect and filled the area completely. Maybe one day I'll have to find this guy again and ask him, if we can't figure this out here. I also know there was no setting on my TV I could change to make it any better; I tried, and using the component and HDMI inputs it had no extra video settings other than the normal contrast and brightness. Please, someone help with this!!!
 

Dafaad

Distinguished
Jan 18, 2006
5
0
18,510
Ok, here goes. This may get confusing, as I am mostly confused. This should be simpler.

The TV is a rear-projection Pioneer SD-582HD5, purchased about 4 or 5 years ago for the purpose of using the computer with the large monitor (HDTV). Nvidia was tooting their HDTV horn at that time to sell video cards for HDTV gaming. This is what I wanted, so I jumped in. I purchased a TV that had both S-Video and Component In connections.

S-Video doesn't work as well as a Component connection, in my opinion.

The computer/display adapter I had at the time only had S-Video out (an Nvidia Ti 4200, I believe). This actually worked, but it was so blurry you couldn't read text on the screen. It definitely was not high definition, or even close. It was so bad I disconnected it and forgot the whole idea for a few years. At the time, I also thought S-Video was the problem.

Now time has passed, I've upgraded my computer and video, and Nvidia again is tooting their horn about HDTV. I bought a new computer with two PNY Technologies SLI PCI Express GeForce 6800s with 128MB DDR each, and each came with a Component adapter (though only one adapter should be needed). It's made for Nvidia (it has their logo) by Compupack, P/N 5511A001-001-LF Rev G. The end of the adapter that connects to the display card looks like an S-Video connector, but it is more than that: it has many more pins, which are needed for all the connections on the other end, including RGB Component, S-Video, and, I believe, composite (yellow).

The logical way to connect it would be to plug the adapter into the same video card that the existing monitor is connected to, then connect it to the Component In on the back of the TV, RGB to RGB, and it should work if the monitor is detected. That is what it looks like up front; however, there is much more to it. (Where is an electrical engineer when you need one?)

First things first: since the TV doesn't give you any information when it doesn't work, you need the computer monitor connected to get some feedback on what is going on. So I connected the normal computer monitor where it has always been, and connected the TV to the secondary video adapter. This way I could see whether the HDTV was being detected correctly. Indeed, it was showing up as the third display, so this seemed like it was going to work. Knowing that the refresh rate of TVs is no higher than 60 Hz, whereas the refresh rate of a computer monitor can go into the hundreds, you have to set the refresh rate of the computer display adapter to 60 Hz for compatibility with the TV. Otherwise, it might be hard on something, and I'm not really ready to go buy a new HDTV because of a mess-up like that. I also dropped the resolution of the computer monitor setting to 1024x768, because my TV says it can only go to 1080i. This is where I think the problem is occurring: the conversion of the display adapter's resolution to the HDTV resolution is confusing. Why didn't they set a standard? Is there a conversion program somewhere?

My TV's format compatibility is (it's an older HDTV):
480i: horizontal refresh 15.734 kHz; vertical refresh 60 Hz, as mentioned
1080i: horizontal refresh 33.75 kHz; vertical refresh 60 Hz
480p: horizontal refresh 31.468 kHz; vertical refresh 60 Hz

I know that the "i" stands for interlaced. Interlaced means the display draws every other line, then comes back for a second pass and draws the lines it skipped. This normally causes flicker and is not as pure a picture as the "p" (progressive) format, which is non-interlaced: it draws every line of pixels on every pass, giving less flicker and a better-quality picture.
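As a side note, the scan rates in the compatibility table above follow directly from this: an interlaced mode only draws half the frame's lines per pass. A minimal sketch of the arithmetic (the total line counts and field rates here are the standard NTSC/ATSC values, not something from this thread):

```python
# Horizontal scan rate in kHz = scanlines drawn per second / 1000.
# An interlaced mode draws only half of the frame's lines per pass (field).
modes = {
    # name: (total lines incl. blanking, fields-or-frames per second, interlaced?)
    "480i":  (525,  59.94, True),
    "480p":  (525,  59.94, False),
    "1080i": (1125, 60.0,  True),
}

scan_rates = {}
for name, (total_lines, rate, interlaced) in modes.items():
    lines_per_pass = total_lines / 2 if interlaced else total_lines
    scan_rates[name] = lines_per_pass * rate / 1000  # kHz
    print(f"{name}: {scan_rates[name]:.3f} kHz horizontal scan")
```

These come out at roughly 15.734, 31.469, and 33.750 kHz, which lines up with the TV's quoted compatibility numbers.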

So if my computer monitor is running 1024x768 progressive (it is not interlaced), how do I change that to a 1080 interlaced or 480 progressive signal? This is where I think the issue may lie. If a computer can detect an HDTV set, and it should, it should have an HDTV driver that converts the signal within the computer system to make the set work, by scaling the display adapter's output to one that matches the video monitor, i.e., the HDTV. I think the failure here is on Nvidia/Microsoft, but more likely Nvidia. Microsoft's default VGA driver actually works a bit, but it is not in HDTV mode, so again it looks like the original S-Video.
If you install the drivers for the Nvidia cards, which do not contain an HDTV display .inf for the Pioneer SD-582HD5, they will not know how to set the displays to work with that HDTV. Why hasn't Nvidia worked with Microsoft and the HDTV manufacturers to have the drivers available? Why don't the HDTV manufacturers make correct monitor drivers available to computer users? How can you advertise HDTV on every box you sell without it working? This is marketing non-existent functionality, in my opinion.

Oops, I'm starting to get a bit steamed again. I will shut up and see what you all have to say before I forget to think...

Help Me Please!
 

CraziFuzzy

Distinguished
Apr 14, 2004
3
0
18,510
Okay guys, the key to getting a good HDTV display from a computer is in the signal timings. Here's why. The video card is used to sending its RGBHV (Red, Green, Blue, Horizontal Sync, Vertical Sync) signal from the VGA connection to a computer monitor, which will follow the sync signals and display virtually any resolution and refresh rate (within its limits). HDTVs, on the other hand, expect the incoming signal to be in an HDTV format (480i, 480p, 1080i, or 720p). Anything the TV can't immediately recognize as one of these gets sent to a reprocessor, which takes that input, digitizes it, and attempts to scale it to fit the screen. Different TVs do this with varying degrees of success. The best solution is to ensure the computer is always outputting a standard HDTV signal, so the picture doesn't get scaled and one pixel sent from the computer is displayed as exactly one pixel on the screen (a 1:1 pixel ratio). This requires third-party software that allows direct manipulation of the horizontal and vertical refresh rates of the video card. The most popular software for this is PowerStrip. Basically, this software allows you to add custom resolutions and fix a resolution to specific timings. My set (a Philips Cineos 55" LCoS) is natively 1280x720 at 59.97 Hz, but because of the overscan, I had to create a custom resolution of 1240x700 to get the display fully visible (though the timings had to stay the same as the default 1280x720). The timings I use are as follows:
Horizontal geometry
------------------------------------------------------
Scan rate 45.055 kHz
Active 1240 pixels 16.700 µs
Front porch 104 pixels 1.401 µs
Sync width 40 pixels 0.539 µs
Back porch 264 pixels 3.556 µs
Total 1648 pixels 22.195 µs

Vertical geometry
------------------------------------------------------
Refresh rate 60.073 Hz
Active 700 lines 15.537 ms
Front porch 13 lines 0.289 ms
Sync width 5 lines 0.111 ms
Back porch 32 lines 0.710 ms
Total 750 lines 16.646 ms
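As a sanity check, the pixel clock and vertical refresh can be recomputed from the totals in those tables; this is standard modeline arithmetic, sketched here in Python with the values quoted above:

```python
# Values taken from the horizontal/vertical geometry tables above.
h_total_px = 1648        # total pixels per scanline, including blanking
v_total_lines = 750      # total lines per frame, including blanking
h_scan_khz = 45.055      # horizontal scan rate

# Pixel clock: pixels per scanline times scanlines per second.
pixel_clock_mhz = h_total_px * h_scan_khz / 1000
# Vertical refresh: scanlines per second divided by lines per frame.
refresh_hz = h_scan_khz * 1000 / v_total_lines

print(f"pixel clock ~ {pixel_clock_mhz:.2f} MHz")  # ~74.25 MHz, the standard 720p clock
print(f"refresh     ~ {refresh_hz:.3f} Hz")        # ~60.073 Hz, matching the table
```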

This works for me, but every HDTV has different tolerances, and the timings may need to be tweaked to work with your particular display. The way to check whether you are truly getting a 1:1 ratio is to create a small 4x4 bitmap in Paint with a black and white checkerboard pattern, and set it as the desktop background in tile mode. If you can see the crisp checkerboard spread across the screen, then the TV isn't scaling. If the checkerboard is blurred to a slight grey instead, then the timings are not being recognized as HDTV. If this is set up correctly, you will be quite impressed with the clarity of the text; it should be every bit as readable as on a computer monitor.
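If you'd rather not draw the test pattern by hand in Paint, a sketch like the following writes the same 4x4 single-pixel checkerboard as a BMP using only the Python standard library (the filename is arbitrary):

```python
import struct

W = H = 4
rows = b""
for y in range(H - 1, -1, -1):            # BMP stores rows bottom-up
    for x in range(W):
        v = 255 if (x + y) % 2 else 0     # alternate single black/white pixels
        rows += bytes([v, v, v])          # 24-bit BGR; grey, so byte order is moot
# Each row is 4 px * 3 bytes = 12 bytes, already a multiple of 4 (no padding needed).
header = struct.pack("<2sIHHI", b"BM", 54 + len(rows), 0, 0, 54)   # BITMAPFILEHEADER
dib = struct.pack("<IiiHHIIiiII", 40, W, H, 1, 24, 0, len(rows),
                  2835, 2835, 0, 0)                                # BITMAPINFOHEADER
bmp = header + dib + rows
with open("checker.bmp", "wb") as f:
    f.write(bmp)
```

Tile it as the desktop background and inspect it up close, exactly as described above.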

DVI may not always work, as DVI is two-way, meaning the TV can tell the video card which resolutions it supports, and usually these are just the standard computer resolutions (640x480, 800x600, 1024x768). These will always result in scaling. Again, all TVs are different, and yours may behave differently.

Hope this helps,
CraziFuzzy
 

Kamma

Distinguished
Mar 14, 2006
2
0
18,510
Firstly, let me say that I am in well over my head on this one, and I would be very grateful for any advice on my problem.

Okay, I have been struggling with some similar problems since I tried hooking up my Magnavox 32MF605W 32" LCD to my computer using a DVI connection. I haven't ever had a problem getting a picture; however, the resolutions are fixed so that they only take up part of the screen. In other words, 800x600 is a small box on my screen, and the screen caps out at around 1360x638 (I think). Also, the only way I can set this resolution is by adjusting it in ATI's Catalyst Control Center. (I am running an old 9800 Pro.) There is no flicker when running in 720p mode, but I do see flicker in 1080i. I am not sure how to adjust the signal timings and was wondering if anyone familiar with ATI's systems can help me with this. Also, a huge annoyance: every time I reboot the computer, all of the adjustments I made in Catalyst Control Center are reset, and I have to make them all again.

I have other issues involving games, but I don't know if this is the place for them, and I would be quite content just getting the above issues fixed. I am also wondering if perhaps a newer video card might resolve some of these issues. It seems as though these LCD TVs should just be larger LCD monitors, but they don't appear to be as plug-and-play as we would wish.

Thanks.
 

CraziFuzzy

Distinguished
Apr 14, 2004
3
0
18,510
Kamma:

My recommendation would be to try PowerStrip and create a custom resolution from there. The resolution you'd be looking for, for a perfect 1:1 mapping, would be 1366x768. However, most video cards will not switch to a resolution over DVI if the monitor does not explicitly state that it accepts that resolution. This is why I chose to use the VGA input instead. It is much more flexible, and you should be able to find a resolution that will provide the picture quality and full-screen viewing you want.
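If the picture overshoots the screen edges, shrinking a native mode to step inside the overscan area is just a percentage calculation. A hypothetical helper, with the 3% default margin being a guess you would tune per set (the 1280x720 to 1240x700 adjustment described earlier follows the same idea):

```python
def underscan(width, height, margin_pct=3.0):
    """Shrink a native resolution by margin_pct on each axis,
    rounding down to even numbers (many cards require even dimensions)."""
    scale = 1 - margin_pct / 100
    return (int(width * scale) // 2 * 2,
            int(height * scale) // 2 * 2)

print(underscan(1366, 768))   # a starting point for a 1366x768 panel
```

You would then enter the result as a custom resolution in PowerStrip, keeping the timings of the native mode.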
 

Kamma

Distinguished
Mar 14, 2006
2
0
18,510
I love how computers magically fix themselves if ignored for long enough.

Long story short, I plugged in the dreaded VGA cable that disgusted me with its poor picture the first night I bought my TV and, lo and behold... the TV turns into a 32" LCD monitor with literally no over- or underscan, no flickering, and all the crispness that I never quite got working with DVI.

I thank you for your advice (even though after 2 hours of working with PowerStrip I only figured out new ways to make my computer lag), and I hope that in the future things are more plug-and-play friendly.

(Yeah, after all that, I just had to do what the manufacturer suggested and plug the VGA cable into the "PC" slot on my TV... but nooo, I wanted to be different. Sigh. Different is overrated.)

Thanks CraziFuzzy,

kamma
 

sithspaun

Distinguished
Apr 19, 2006
1
0
18,510
Okay, so I am a complete novice when it comes to these matters.
I just bought a DVI-HDMI cable to hook up to my 30" LG CRT HDTV. The resolution that works best seems to be 1152x648, but even then the sides are blurry and hard to read. Is there any way I can get this to work with my DVI-HDMI cable? I downloaded PowerStrip, but I have absolutely no idea where to start when creating custom resolutions or timings. Any help would be much appreciated.
 

Haemogod

Distinguished
Aug 13, 2006
1
0
18,510
Hi, I was using Google to find something, and this is the only place that had hits. Can anyone tell me what this is:

Compupack
P/N: 5522A001-001-LF
REV: E
MADE IN CHINA

Thanks.


EDIT: It came with a pixelview 7800 GT
 

DKramar

Distinguished
Oct 29, 2007
1
0
18,510
I have had GREAT success with my Samsung A330 connected through a custom built PC.

The TV: http://www.bestbuy.com/site/olspage.jsp?skuId=8742818&st=A330&lp=1&type=product&cp=1&id=1202648987833

The PC:
Q6600
1 GB RAM
MSI NVidia 8800GT

The Connection:
DVI to HDMI connector
16ft HDMI cable

Madden 07, Brothers in Arms: Hell's Highway, and Flight Simulator 2004 (I wish I had FSX) look spectacular. Vista Ultimate DreamScene is just fantastic and makes all the hassle of Vista worth it when I use my 720p scenes.

All games have to run at the native 1360x768, but they look great even from 2 feet away.

No issues with text on screen... I completely stopped using my ViewSonic 20.1" 1680x1050 LCD monitor.

Hope that helps!

 

dracoaffectus

Distinguished
Dec 5, 2008
2
0
18,510
I've recently decided to connect my older computer to my new LCD HDTV; the primary purpose is to watch shows on Hulu and movies downloaded from Netflix.

The PC:
ATI Radeon 9550 AGP 256mb graphics card
2.4 GHz P4 processor
Vista Business


The TV:
Samsung LCD HDTV 1080p(I forget the model number...)


Using a VGA connection, 1360x768 resolution, 60 Hz refresh rate



The picture displays beautifully, but there's a major lag issue when trying to watch videos. The videos don't play smoothly at all; in fact, they really stutter, seemingly showing only one frame every second (roughly).

I'm trying to narrow down the problem between the video card, processor, OS, or the connection type, and would appreciate any advice. Thanks.
 
You neglected to mention how much RAM you have in that system. I'd say 1 GB is probably the minimum you want, and probably also a graphics card upgrade to something newer like the HD 2400 Pro (AGP).

-Wolf sends
 

pojan

Distinguished
Apr 17, 2010
2
0
18,510
Yes, I'm replying to this, but just to give a straightforward example of how to do it.

1. Get a Samsung TV that's 720p (1360x768) or 1080p (1920x1080). Simple nowadays.
Very simple.

2. Get a graphics card that supports HD. Nowadays the cheap ones that will work great are probably the ATI 4800-series cards. If you have some extra cash, buy the ATI 5870. It's currently the highest-selling card, although it's $400; with it you'll be able to play most if not all DX11 games at max settings.
*This is the most important part: don't buy some cheap, crappy video card, because it will suck.

3. If you plan on playing Blu-ray through your computer: Windows 7 64-bit is nice, with 4 GB of decent RAM, a 2.7 GHz dual-core CPU, a decent motherboard, the video card above, and a 7200 RPM drive at minimum. An SSD would be best, or a 10,000-15,000 RPM drive would be great too, but 7200 RPM is the minimum for playback without lag. Get someone you trust to build you a nice computer for about $1,000. I know I can build a high-end gaming computer for that which can do everything a gamer needs, unless they are crazy hardcore and want above 80 fps in games.

4. Media Player Classic Home Cinema is a very nice program, with 32-bit and 64-bit versions, that I have found to be the best player for Blu-ray and regular video. It is very tweakable. It has not let me down on 1080p videos at all. It doesn't play some videos, but that could be because I'm missing some drivers. Alternatively, just use something like SMPlayer, which is nice but not the best for Blu-ray rips.

5. The cable from your video card to the TV: most people will say cables are cables, a $100 one being the same as a $25 one, and that's most likely true. The more expensive ones do have more shielding, though, and depending on where the cables run you may want more, but a $30 cable is usually enough for a 3-meter length. Don't get ripped off. Use a DVI-to-HDMI cable, or HDMI-to-HDMI with a DVI-HDMI dongle on your video card.

6. The LCD HDTV itself: don't buy a cheap brand. TV brands have quality levels too; a cheap brand will die faster, because the parts used are cheaper (that's why it's a cheaper TV). The best brands are Samsung and Sony; I've heard something about Toshiba too, but those two are the best. Currently you can get a Samsung or Sony 40-inch 120 Hz 1080p set for $1,000 or less, and those are very good televisions.


** I've had two Samsung LCD TVs now and have been using them as a monitor for the last 4 years. No problems; they are plug-and-play if you don't buy crappy parts.

Good luck, all.
 

JohnnyTechSocial

Distinguished
Apr 20, 2010
6
0
18,510
pojan couldn't have said it better. Spend the money and get yourself a good LCD that will last you a good number of years. I have a Samsung 1080p LCD TV and I'm able to connect it to my Mac via DVI to HDMI; I can watch videos from my camcorder, and I can install software updates that I download from the web.

If you're planning to connect it to your PC often, a solid graphics card is necessary. An ATI Radeon HD is the way to go; expect to spend around $200.

I could be here all day, but these are the basics.
 

sturticles

Distinguished
Sep 24, 2009
1
0
18,510
Hi. I'm using a PC with a Sapphire ATI 5850 GPU connected via HDMI to my 46" Sony Bravia X3100 HD LCD, and I have a problem:
it isn't giving me a full screen. The resolution states it's currently at 1080p, but there are black bars all around the active screen in the middle.
I'd say about an inch or so on the left and right, and less at the top and bottom. A 16:9 ratio, I assume.
What is my problem? I've updated all drivers and such...
 

fazers_on_stun

Distinguished
Aug 31, 2006
207
0
18,860


Just an FYI: you really should start your own thread instead of tacking onto a 4-year-old thread; you're more likely to get responses that way.

However, in answer to your question: not only do the TV and video card have to be at the same 1920x1080 resolution, but you also have to set the TV to "dot-by-dot" (i.e., no overscan) in order to fill it exactly. Look in the TV manual to see how to do this. On my Pioneer plasma TV it's called "dot-by-dot", but I imagine Sony uses some other terminology.
 

DV8_77

Distinguished
Jun 30, 2010
1
0
18,510



I started a new thread but got no response (http://www.tomshardware.co.uk/forum/291984-15-monitor-flickers-connected-hdtv-hdmi).
Basically, my problem is that when I connect my PC to the HDTV, the PC monitor starts flickering. Any ideas?

PC config
q6600
4GB ram
2x 250GTS SLI
Philips 22" 220VW (1680x1050)
HDMI cable over DVI to HDMI dongle

Sorry for tacking onto another thread, but I am desperate.