1080p TV can't use 1080p from computer

GraySenshi

Commendable
Apr 15, 2016
Basically I have a Philips flat TV (42PF9966/37). It has an HDMI input where I plugged in my laptop and/or desktop, and it crops the image if it's set to 1080p, even though it's a 1080p TV. The PC says 1024x768 is recommended. So I looked up the full specs and it said:

Computer formats
640 x 480, 60Hz
800 x 600, 60Hz
1024 x 768, 60Hz
Video formats
640 x 480i - 1Fh
640 x 480p - 2Fh
720 x 576i - 1Fh
720 x 576p - 2Fh
1280 x 720p - 3Fh
1920 x 1080i - 2Fh

What's the difference between computer and video formats? Pixels are pixels, right? And is there a way to get 1080p from my computer? I was going to set up a streaming computer for the TV and do some light gaming with a controller, but that will be hard if I have to use a 4:3 aspect ratio.
 

t53186

Distinguished
Aug 6, 2006
Looking at the specifications for that plasma TV, it appears 1080p is not a supported format when using the HDMI input. So, sorry to say, the TV is the limiting factor, not the PC.
 

Shaun o

Distinguished
With TVs it comes down to the specs of the TV display panel itself.

If the resolution table for your card and TV only goes up to 1920 x 1080i, then the TV itself is only capable of displaying an interlaced signal at that resolution.

Interlaced means the full frame is refreshed at an effective 30 Hz, instead of the 60 Hz of a progressive video signal.

Each interlaced field carries only half of the picture's lines, and the two fields are woven together (separated by blanking intervals) to build the complete image.
That is why, if you look closely at an interlaced signal, say 1920 x 1080i, the image looks slightly less defined and text seems blurry in some cases, versus a true progressive video signal of 1080p, where every line of the display image is drawn in each frame.

With interlaced, only half of the lines are drawn at a time, alternating between the odd and even lines, to reach a resolution of 1080i.
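
As a rough back-of-the-envelope check (a minimal sketch assuming the common 60 Hz timings, nothing specific to this TV), the arithmetic looks like this:

```python
# Compare 1080i and 1080p at a nominal 60 Hz rate (illustrative only).
lines = 1080

# 1080p: every frame carries all 1080 lines, 60 complete frames per second.
progressive_lines_per_sec = lines * 60

# 1080i: each field carries half the lines (odd or even), 60 fields per
# second, so a complete frame is only refreshed 30 times per second.
interlaced_lines_per_sec = (lines // 2) * 60
interlaced_full_frames = 60 // 2

print(f"1080p: {progressive_lines_per_sec} lines/s, 60 full frames/s")
print(f"1080i: {interlaced_lines_per_sec} lines/s, {interlaced_full_frames} full frames/s")
```

Half the lines per unit of time is exactly why fine detail and text tend to shimmer or blur on an interlaced signal.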

To confirm in Windows whether the TV can only display a maximum resolution of 1080i:
Right-click on the desktop and select Screen Resolution.
Click on Advanced Settings.
Then click List All Modes for the graphics display adapter you have.

If there is no 1080p resolution setting listed in there to select, then your TV is only capable of displaying an interlaced resolution of 1080i as a maximum, GraySenshi.
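
If you would rather check from a script, here is a minimal sketch (assumes Windows with Python installed; EnumDisplaySettingsW is the Win32 call that the List All Modes dialog draws from) that lists every mode the driver reports and flags the interlaced ones:

```python
import ctypes
from ctypes import wintypes

DM_INTERLACED = 0x00000002  # dmDisplayFlags bit set on interlaced modes

class DEVMODEW(ctypes.Structure):
    # Standard Win32 DEVMODEW, with the printer/display union flattened
    # to the display members (both union arms are 16 bytes, so sizes match).
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)

modes = set()
i = 0
# None asks about the primary display; i walks every supported mode.
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    scan = "i" if dm.dmDisplayFlags & DM_INTERLACED else "p"
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, scan, dm.dmDisplayFrequency))
    i += 1

for w, h, scan, hz in sorted(modes):
    print(f"{w} x {h}{scan} @ {hz} Hz")
```

If 1920 x 1080p never shows up in that list while the TV is attached, the driver is telling you the same thing the dialog does.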

Short of that, you will have to buy a new TV that does a 1080p resolution.
It can also depend on the type of video output to the TV you are using: if you are using a 15-pin D-Sub (VGA) cable from the laptop or desktop, that is likely the reason why you are not getting the full 1080p resolution to display.

You must use an HDMI cable from the laptop or desktop system to an HDMI port on the TV, or use a digital DVI port with a DVI-to-HDMI cable or adapter.


 
Solution

GraySenshi

Commendable
Apr 15, 2016
To rephrase: both computers can't get 1080p to the TV; 1024x768 is the max. The TV specs I found online list 1080i as the top video format and 1024x768 as the top computer format. And both computers themselves can support up to 5120x3200.

So you already answered why it's not working, and the difference between i and p... So that's why in PC format it does a 4:3 aspect ratio.
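
For completeness, the aspect ratio falls straight out of the mode numbers (a quick sketch, nothing specific to this setup):

```python
from math import gcd

def aspect(width, height):
    # Reduce width:height by their greatest common divisor.
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect(1024, 768))   # -> 4:3  (the TV's best "computer format")
print(aspect(1920, 1080))  # -> 16:9 (the 1080i "video format")
```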