63% of Consolers Think Firmware Makes TVs 3D


sagansrun

Distinguished
Jan 15, 2008
No, console gamers are NOT tech savvy. That's why they use consoles: they don't have to learn anything about their PC and its gaming APIs, video cards, memory, etc...
 

nforce4max

Distinguished
Sep 9, 2009
This is why console gamers are often called "newts"; they are hardly true gamers, as there is no work involved in building their machines, etc. A true gamer always builds his/her own rig from parts and knows how to maintain their hard-earned gear.
 

spectrewind

Distinguished
Mar 25, 2009
[citation][nom]haxfar[/nom]The reason that 120MHz is min. is that when the screen shows a 3d image it is 2 pictures. That means the screen can show a single image at 120MHz or two images at 60MHz.[/citation]

Clarification for everyone else:
You meant Hz, not MHz...
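To make the arithmetic behind that correction concrete, here is a minimal Python sketch (the active-shutter framing and the helper name are my own illustration, not from the quoted post):

# Frame-sequential (active shutter) 3D alternates left/right images,
# so each eye only sees half of the panel's refresh cycles.
def per_eye_rate(panel_refresh_hz):
    return panel_refresh_hz / 2.0

for hz in (60, 120, 240):
    print(f"{hz} Hz panel -> {per_eye_rate(hz):.0f} Hz per eye")
# 60 Hz panel  -> 30 Hz per eye (flicker, eyestrain)
# 120 Hz panel -> 60 Hz per eye (the usual minimum for shutter 3D)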
 

randomstar

Distinguished
May 2, 2007
[citation][nom]JMV2009[/nom]Most TV's claim they are 100/120/200/400 Hz, but their input only takes 50/60 Hz. TV's which take >100 Hz as input are sold as 3D TV's.[/citation]


That 50/60 is the alternating-current input power, not the video signal; there's no relation except that they are connected to the same device. And the AC power is then "rectified", smoothed, and filtered into DC power for just about everything anyway.
 

HavoCnMe

Distinguished
Jun 3, 2009
This is why consolers think they are better than PC gamers... but it's so sad that they don't really know what a firmware update is.
 

rrobstur

Distinguished
Mar 30, 2009
[citation][nom]roguekitsune[/nom]This is just my experience, but console users fall into two groups:
Group 1: Morons that can't tell you anything about a computer.
Group 2: Relatively knowledgeable people that cannot afford a high-performance desktop.
Unfortunately most console users fall under group 1. And yes, there is an extremely small percentage of people that fall into neither.[/citation]
You missed one more category:
3. Enthusiast hordes... Xbox, PS3, Wii, 6 computers all built by the owner, and a room full of junk computer parts.
 
Guest

Guest
good heavens no....

Watching/playing anything at 30 Hz would have the same effect as a flick book.
 

dstigue

Distinguished
Jul 14, 2006
The only real difference between 3D and non-3D is the IR emitter for timing the glasses and the frame rate: a 3D set accepts 120 fps vs. the 60 of a standard non-3D set. But with most older LCDs already capable of 120 and 240 Hz, why wouldn't 3D be possible? I know HDMI 1.4 is a new standard, but the cable hasn't changed and neither has the interface.

Do TVs use software/firmware to interpret the signal, or is there a dedicated signal processor? If it's dedicated, I understand the problem. But I don't understand the price jump from non-3D to 3D; the associated cost doesn't justify doubling the price. That's just plain ridiculous.

If Google TV does come out, there will be more software processing inside TVs, and things like this may become possible if they aren't already. Technically, the IR syncing could be an external add-on through USB. I mean, how hard is that? The Wii does it for its controllers; hell, use the Wii's setup with timing from the TV. I don't know, maybe I'm all wrong, but I can see why people believe it's possible.
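On the signal side of that question, a back-of-the-envelope comparison helps (a rough Python sketch; the 25% blanking overhead is my own approximation, not an exact HDMI timing):

# Rough pixel-rate comparison: plain 1080p24 vs. HDMI 1.4 "frame packing",
# which stacks the left and right 1080p frames into one tall output frame.
# The blanking overhead here is an assumption; real CEA timings differ.
def approx_pixel_rate(width, height, fps, blanking_overhead=1.25):
    return width * height * fps * blanking_overhead  # pixels per second

plain_24  = approx_pixel_rate(1920, 1080, 24)      # 2D film content
packed_3d = approx_pixel_rate(1920, 1080 * 2, 24)  # frame-packed 3D
plain_60  = approx_pixel_rate(1920, 1080, 60)      # ordinary 1080p60

print(f"1080p24 2D:          {plain_24 / 1e6:.0f} Mpix/s")
print(f"1080p24 frame-pack:  {packed_3d / 1e6:.0f} Mpix/s")
print(f"1080p60 2D:          {plain_60 / 1e6:.0f} Mpix/s")
# The 3D payload roughly doubles per frame but stays near what the link
# already carries for 1080p60; the sticking point is whether the TV's
# video processing recognizes and unpacks the format, not the cable itself.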
 

decrypted

Distinguished
Apr 16, 2010
[citation][nom]alidan[/nom]Now, if I'm right, movies display film at 24 frames a second. 60 Hz is 60 fps, so dividing that up, it's 30 fps per eye, which is higher than movie-theater 3D. This all hinges on whether I'm correct.[/citation]

Not exactly. Original movies were 24 fps (and still are today) and were shown at 24 fps, which is why old movies used to flicker, hence the name "flick". Today's movies are still 24 fps, but each frame is shown 3 or 4 times, giving you 72 Hz or 96 Hz, similar to what TVs do, which is why modern movies don't flicker anymore. Watching material at an actual 30 fps would cause a lot of eyestrain.
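A quick sketch of the cadence described here (Python, with an illustrative helper name of my own):

# Film stays at 24 new frames per second; projectors and TVs repeat each
# frame so the flash rate rises above the flicker threshold.
def flash_rate(film_fps=24, repeats_per_frame=3):
    return film_fps * repeats_per_frame

print(flash_rate(repeats_per_frame=2))  # 48 Hz, classic two-blade shutter
print(flash_rate(repeats_per_frame=3))  # 72 Hz, three-blade shutter
print(flash_rate(repeats_per_frame=4))  # 96 Hz
# Only the flicker changes; new picture information still arrives 24 times a second.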
 

ulysses35

Distinguished
Jul 2, 2008
Too many fanboys on here... they jump on the obvious without reading the full details.

Any TV can display 3D using those cheap and nasty cardboard glasses, and any DVD player can play a DVD that has 3D content, so yes, by that same definition a console can display 3D content on any TV, again using cheap cardboard 3D glasses.

Bored with the whole "PC is better than a console" thing... sure it is... much more expensive too.

 

kartu

Distinguished
Mar 3, 2009
I am a "technically inclined" person, with a lot of experience in IT, including soldering.

Could you enlighten me, please: why can't we use "plain TVs" that support high refresh rates as 3D TVs, with the player sending the sync signal?
 
Guest

Guest
Just out of interest, but has anyone mentioned that the survey could have been misleading? As the article points out, firmware and 3D display are linked, and the question may be ambiguous in that it could be interpreted as referring to the system attached to the TV.

Justin
 

rasagul

Distinguished
Jan 27, 2010
This is too funny. A local radio station did an April Fools' joke telling people all they had to do was reverse the wires and wrap tin foil around their sunglasses to get 3D. They were even walking callers through the steps like tech support.
 

f-14

Distinguished
Apr 2, 2010
I remember an adapter you could hook up to your TV back in the '70s or '80s that would split and slightly offset the red and blue in the RGB output, effectively recreating 1950s 3-D on your home TV. It seems TV manufacturers are just now getting around to that 1950s gimmick in a concerted effort to make up for sales lost in the digital conversion, which isn't going over so well and has just brought television stations to the brink of their demise (a plot by libtards to kill the boob tube?).

I see no reason why the technology from that itty-bitty adapter from the '70s and '80s couldn't be incorporated into a firmware update that does the same thing: reassign the placement of the red and blue pixels to create the offset. Seriously, if you think you need special hardware for this, you clearly cannot copy CDs or DVDs or play StarCraft II on a 2005 computer.

For all of you claiming to see 3-D on a 2-D panel, please share whatever you're smoking with me, because what I see is a screen; if it were truly three-dimensional, I could stand 90° to either side of it, or behind it, and still see the scene. The perfect example is Iron Man: Tony Stark's house and his blueprint visuals. That's 3-D: you can walk into it and see everything from the inside, left, right, front, back, up, down. Anything less than this is a gimmick and an outright LIE. 1950s red/blue color-split/shutter technology is a joke and a migraine in the making.
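The red/blue adapter trick is basically just channel mixing. Here is a minimal NumPy sketch of a red-cyan anaglyph, where the sideways shift stands in for a real left/right stereo pair (both the shift and the array shapes are my own illustrative assumptions):

import numpy as np

def anaglyph(left_rgb, right_rgb):
    """Take the red channel from the left view and green/blue from the right."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]
    return out

# Stand-in stereo pair: shift one frame a few pixels horizontally.
# A real 3D source would provide two genuinely different camera views.
left = (np.random.rand(1080, 1920, 3) * 255).astype(np.uint8)
right = np.roll(left, 8, axis=1)
print(anaglyph(left, right).shape)  # (1080, 1920, 3), viewable with red-cyan glasses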
 