I am looking to replace my current 5-year-old 42-inch Sony with a new RP HDTV, and I have fallen in love with the 50-inch SXRD XBR. In this month's issue, your magazine stated that Sony has a new lower-priced SXRD which accepts 1080p. This is very appealing to me in that the price point is very nice. With the knowledge that the old XBR upconverts to 1080p but has no 1080p input, and the new SXRD has the input but it seems to make little difference, I was trying to decide if I should try to get a clearance 2005 XBR or a new lower-priced SXRD. Could you please tell me what the differences are between the new lower-priced SXRD line and the higher-end 2005 SXRD XBR line?
More 1080p Questions
Asked by Aron:
Hi Geoff. Good article -- it's important for people to understand that, in principle, you don't lose info with 1080i. However, I think it's also important to emphasize that, in practice, deinterlacing is not trivial, and that therefore virtually all TVs (and progressive-scan DVD players) introduce some degree of error into the process (even if it's not the kind of wholesale error that you uncovered in your recent article on bob vs. weave).
Also, suppose you have a TV that accepts 1080p/60 but not 1080p/24. In that case, 2:3 pulldown is unavoidable for film-sourced material. But is it at least possible to avoid interlacing/deinterlacing? i.e., will any of the next-gen HD DVD or Blu-ray players that output 1080p/60 implement 2:3 pulldown using entire frames (rather than by creating interlaced fields, as is done in the current players)? I'm looking for a player that will do 1080p/24->1080p/60, rather than 1080p/24->1080i/60->1080p/60.
Answer: You are correct, it is possible to lose something in the interlacing or de-interlacing process. It's going to be very slight unless it's done horribly wrong. While it would surely be ideal to be able to output the source at its native rate and display it at its native rate, for the vast majority of displays this isn't possible (due to technology and $$$). What I hoped to point out is that the difference between 1080p and 1080i is incredibly slight, for the most part. I had been getting emails from people who seemed to think 1080p was twice as good as 1080i, and for movies this just isn't the case. In fact, on most TVs I doubt you'd see the loss of anything from correctly done de-interlacing, given all the other things a TV can do to screw up an image. Does seem worth investigating, though. Perhaps an article in the future.
I believe the next BD players from Pioneer and Panasonic will skip the interlace step and just apply the 3:2 pulldown to the 1080p/24 to get 1080p/60 directly. The Samsung, being the first player, is doing something rather wacky with that extra step.
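If it helps to see what "adding the 3:2" actually means, here's a rough Python sketch of the cadence. It's purely illustrative -- the frame labels and the one-line "weave" shortcut are mine, not any player's actual pipeline:

```python
# Illustrative only: how 24 film frames per second become 60 output frames
# with 3:2 pulldown, with and without an interlaced step in the middle.
film = ["A", "B", "C", "D"]            # 4 film frames = 1/6 second at 24 fps

# Direct path: repeat whole frames 3,2,3,2... -> 1080p/60
p60_direct = []
for i, frame in enumerate(film):
    p60_direct += [frame] * (3 if i % 2 == 0 else 2)
# ['A','A','A','B','B','C','C','C','D','D'] -> 10 frames = 1/6 second at 60 fps

# Interlaced path: split each frame into top/bottom fields, 3:2 the fields
# to get 1080i/60, then let the TV weave fields from the same film frame
# back into full frames.
fields = []
for i, frame in enumerate(film):
    top, bottom = frame + "-top", frame + "-bottom"
    fields += [top, bottom, top] if i % 2 == 0 else [bottom, top]
p60_weave = [f[0] for f in fields]      # each output frame still comes from one film frame
print(p60_weave == p60_direct)          # True -- done correctly, the 1080i detour costs nothing
```

The point of the toy weave is simply that the interlaced detour doesn't have to lose anything, as long as the cadence is detected and the fields are paired properly.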
Asked by TomH:
Great article, but how do I know if a TV de-interlaces 1080i correctly?
Answer: Unfortunately, there is no easy way for an end user to check this. You can check for your TV in this article or this article. With any luck, we'll do more of these on a regular basis. Also, we do that test to any 1080p display we get in here. Check back after CES, there may be some news about a way for you to do the test (but still not easily).
Asked by Mike:
Loved the article. What about those of us that purchased the 720p DLP TVs that were top of the line last year? How can we take advantage of the additional lines of resolution of the new video formats?
Answer: Just because your TV is 720p doesn't mean you won't be able to see the benefits of HD DVD and Blu-ray. This is a BIG question, and one I get a lot. If you can see a difference between HD and SD on your TV, then you will absolutely be able to see a difference between HD DVD/Blu-ray and DVD. If you like watching HD (and who doesn't?), then you will love these formats, as they are the best-looking HD you have ever seen. Vastly superior to the compressed crap we get on cable and satellite. A word of caution, though: the Samsung and Toshiba players have terrible 720p outputs, so you're better off sending your TV 1080i. Of course, that brings up the whole de-interlacing question again, so check which looks better on your TV.
Asked by Jake:
I have a Vizio P50 720p plasma... I notice that if I output 1080i from the Xbox360 and Sony upscaler DVD player, the image "looks" better/sharper. How is that explained relative to this article? Does 1080i technically & visually look better on 720p displays? Curious to know! :)
Answer: I have found similar things in my testing of many TVs, though I'm not entirely sure why this is the case, as 720p and 1080i have the same bandwidth. There are many variables (including the source not doing a good job with 720p), but if it looks better with 1080i, stick with 1080i.
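For what it's worth, the "same bandwidth" comment is just pixel-count arithmetic, which you can sanity-check yourself (these are raw pixel rates, ignoring compression, chroma, and blanking):

```python
# Raw pixel rates -- simple arithmetic, not real-world transport bitrates.
p720_60  = 1280 * 720 * 60         # 55,296,000 pixels/sec
i1080_60 = 1920 * 1080 * 60 // 2   # 60 fields/sec, each half a frame: 62,208,000 pixels/sec
p1080_60 = 1920 * 1080 * 60        # 124,416,000 pixels/sec -- the full 1080p/60 firehose
print(p720_60, i1080_60, p1080_60)
```

So 720p/60 and 1080i/60 land within roughly 12% of each other, which is why they're usually lumped together; what the source does when converting to each format matters more than the raw numbers.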
Asked by Dave:
So tell me this: on both my LCD and CRT computer monitors I'm sending an HD signal at a refresh of 72Hz. Now why doesn't the image look 'progressive' like it does on some of the plasma/LCD/DLP sets? The motion just doesn't seem as fluid even though I am running at the same or higher resolution!? This has been a huge question that has never been answered... or not that I have found :P
Answer: First, I want to be clear that resolution and interlacing are two completely different things. 1080i and 1080p are exactly the same resolution. This is really more a difference of frame rate. 1080i has 30 different frames per second, and these are interlaced (split in half, if you will) and flashed on the screen fast enough that your brain combines them into full frames. 1080p is (as far as we're talking) 60 different frames per second. So there are, as far as the format goes, twice as many frames as 1080i. What the last post discussed is that, as far as HD DVD and Blu-ray are concerned, those extra frames are created by the player, and are not in the source. So in this case there is basically no (OK Aron, very, very slight) difference between 1080i and 1080p, as long as the de-interlacing is done correctly.
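Here's a toy example of the "split in half" part, in Python. Line counts are shrunk from 1080 to 6 just to keep it readable, and it assumes film-sourced material where both fields come from the same original frame:

```python
# One full frame split into two fields, then woven back together.
frame = [f"line {n}" for n in range(6)]
top_field    = frame[0::2]   # lines 0, 2, 4 -- sent first
bottom_field = frame[1::2]   # lines 1, 3, 5 -- sent 1/60 of a second later
rebuilt = [None] * len(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field
print(rebuilt == frame)      # True: if both fields come from the same frame
                             # and the TV weaves them, nothing is lost
```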
As far as your question goes, there can be a number of variables here. I'll take one that most people overlook. No matter what you refresh at, 24fps material (film) is not going to look as smooth as 30fps material (video). Ever notice that your local news looks a certain way and movies look a different way? That's video vs. film. You wouldn't want film to look as smooth as video; it wouldn't look right. Philips, with their PixelPlus circuitry, makes film look like video. Some people like this; I for one don't (thankfully, you can turn it off). The advantage of the 72Hz refresh is that it takes the stutter out of the motion that happens when you double certain frames and triple others. The most noticeable example of this is a slow pan across a wide scene (say, a landscape). The camera will look like it's moving, then stutter, then move, then stutter. That's the 3:2. When you do a 3:3, it will be smoother, but never as "fluid" as if the same scene were shot on video. This is a good thing. Years of subconscious learning have trained our brains to equate film with fiction, and video with reality. Who wants reality at the movies? It's also why 1080p/24 HD cameras are so prevalent, as film (and most fiction TV) directors want to keep that look.
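The refresh-rate arithmetic behind that stutter is simple enough to spell out (just division, nothing specific to any one display):

```python
# Why 60Hz judders on film and 72Hz doesn't.
film_fps = 24
print(60 / film_fps)   # 2.5 -> frames must alternate 3 and 2 repeats (uneven hold times = judder)
print(72 / film_fps)   # 3.0 -> every frame held exactly 3 times (even, so pans look smoother)
```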
Keep the great questions coming.
Hi Geoff. I really appreciate your response to my question - so please forgive me, but I'm going to press you a bit more on the issue of deinterlacing. Basically, I'm confused by your statement that "the difference between 1080p and 1080i is incredibly slight, for the most part." And the reason I'm confused is that, even in your own reviews of display devices, you frequently comment on deinterlacing performance. For instance, note your repeated (and, I assume, quite accurate) observations about "jaggies" - or your praise for the deinterlacing performance of the Realta HQV. All of which says to me that the deinterlacing in the typical HDTV is, in practice, far from perfect. So if one can avoid deinterlacing, by using 1080p instead of 1080i, one should therefore obtain a visibly better picture. I realize you wanted to dispel the myth that "1080p is twice as good as 1080i." But isn't it going too far to the other extreme to say the difference is, essentially, marginal? Or am I missing something?
Justin, about the new Sony displays: I too have been following the progress on these XBR displays. They have two LCoS models out now that are 1080p with 1080p input, the KDS55A2000 and KDS60A2000. These aren't a 1-to-1 upgrade to the XBR series but are cheaper. These displays are missing a couple of features like picture-in-picture, and the DRC-MF engine is an older one than in the initial XBR series. These displays are both <$3000.00, I believe. I'm not sure how significant the DRC-MF engine is, but I've read it does make a difference when watching standard-def TV. Sony is supposed to release pricing on the KDSR60XBR2 and KDSR70XBR2 any day now. I'm guessing these will be in the $4500 range.
Thanks for the help. I actually got to see one of the new SXRD displays yesterday, and they are an upgrade on the Grand Wega, but the SXRD XBRs are noticeably better. So it seems like the A2000 line is kind of a compromise between the two. I really would like a 50-inch instead of something bigger, so it seems I may have to pursue the 2005 model XBR. That said, does it benefit me in the long run to have the 1080p input and the additional HDMI and component inputs the new SXRD offers? The 2005 XBR only has one HDMI input and is not capable of accepting a 1080p signal, even though that is its native resolution. I understand that it is exceptional at de-interlacing, but the larger number of high-end inputs may better fit my needs over the next 5 years.
[CONTD. FROM LAST POST] I should call to your attention that your article has generated headlines in, e.g., www.hdbeat.com like:
Everyone is talking about 720p TVs. Many HDTVs are 768p... so is it better to send these TVs a 1080i signal from an HD DVD player? Would the TVs downconvert/deinterlace directly to 768p, or downconvert/deinterlace to 720p and then upconvert to 768p? NOTE: Based upon your tests, my TV does deinterlace properly.
Geoffrey, thanks for all the truly useful information you provide. I have a Sony LCD 40-inch XBR1, which I believe has a native resolution of 768p. With respect to a program on HDTV which is being broadcast in 1080i, if all else were equal, would it look the same on a television (e.g., Sony XBR2) which has a native resolution of 1080p?
I'd like to retract a comment I made in a previous post, about the Sony XBR1 being 'overall a better TV' than the A2000. I'd never done the comparison myself, and should have instead said that the XBR1 "has been reported to be better than" the A2000. Mea culpa for not being more precise. But now I have seen both. I compared the two using the HD-DVD of Sahara. Neither set was calibrated, so it's not useful to talk about, say, color accuracy. But one thing that should be independent of calibration is their relative level of SSE (silk-screen effect). And to my eyes, the very prominent SSE of the XBR1 (which had always bothered me) was somewhat reduced on the A2000. I'm not saying that the A2000 is the better set - maybe it's not; I'm simply saying that there is at least one area in which the A2000 seems to be better.
Hi, I have a question, but I don't know if it's on the topic I keep reading about, mainly high-def displays. I recently bought a Denon AVR-4806 receiver as an open-box demo. The salesman told me that it was "about a month old," it wasn't abused, and not used much at all. I got it for such an unbelievable price that I just couldn't pass it up. What risks am I incurring by buying an open-box receiver, even one of such high quality as the Denon? Is there any way I can tell if it's not up to par, and should I get it bench tested or checked out just to be sure? Any feedback you have would be greatly appreciated. Thanks. Jerry.
I have a quick question about scaling a 720p or lower resolution image to 1080p: I have a Westinghouse 37w3 1080p set, and recently Madden 07 came out. In all the reviews and most impressions I've read on message boards and whatnot, the aliasing in the game doesn't seem to be that big of a deal. However, the image on my Westy is quite aliased - more than the pictures/videos I've compared it to on the web. I'm wondering if aesthetic flaws like aliasing might become more pronounced when going from whatever the native resolution of the game is to 1080p - similar to what would happen if I enlarged an image or played a computer game at a lower-than-native resolution that was still filling the whole screen. Thanks
First, in response to Aron: I'm not exactly clear what he means by the "silk screen" effect, but I can say that making adjustments in the Custom mode to the Backlight, Picture (contrast), Sharpness, etc. has a GREAT impact on the overall look, far beyond just "color accuracy" and the like. I've read that 1080p sets (e.g., again, the XBR2) may have more difficulty with a 480i SD signal through the HDMI input. Is there any truth in this?
Not all of us have $5K or $6K in our pockets for the "best" 1080p/1080i TV now. Neither are we prepared to spend another few thousand bucks 5 years down the road to "upgrade." We need a reliable, good-performing TV which is as future-proof as possible (i.e., will accept and properly process signals from a Blu-ray/HD DVD player). Are there examples of such sets (presumably some 720p sets, or something like the 1024 x 1080 resolution sets from Hitachi)?
Allen, thanks for your response. "Silk screen effect" (or more commonly, "SSE") supposedly results from the coating applied to screens on RPTVs to increase light output. It looks like a layer of very fine granular material floating above the picture, and is most evident when viewing uniformly colored areas (e.g., a large patch of blue sky). I've heard that adjustments do matter (e.g., see my posts of Aug 23, above), but unfortunately I can only compare sets in stores, so I have no way to assess what their relative PQs will be like post-calibration (e.g., pre-calibration the Sony might look better, but post-calibration it might be the Sammy or JVC). Have you had a chance to compare adjusted versions of the Sony XXA2000, Samsung HLSXX87/88, and/or the JVC XXFH/N-97? Also, in response to Al's question: you can get 55-56" 1080p RPTVs for well under $3000. [And if you want to go below $2K there are some great deals on 720p sets -- e.g., the JVC.]
I own a Sony Qualia KDS-70CQ006 70-inch rear-projection LCoS TV (SXRD). In the November 2006 issue of Home Theater Magazine, Gary Merson lists 61 HDTV sets that "Passed" or "Failed" a "Deinterlace" test and a "3:2 Film Cadence Test" (page 42). Do you have the same results for the Sony Qualia KDS-70CQ006 (a very expensive set sold in 2005 that was, and maybe still is, considered one of the best HDTV sets)? I was concerned that two out of the three SXRD HDTV sets sold by Sony in 2006 failed the Deinterlace test and all three failed the 3:2 Film Cadence Test. The Sony Qualia KDS-70CQ006 (SXRD) uses similar technology. I can tell you that the performance of this set with a Toshiba HD DVD player (first generation) was outstanding and clearly the best of any set tested. Thank you for reading my post. Is it possible to provide a copy of the results from the 54 different 2005 HDTV sets tested by Gary Merson earlier in 2006 (as referred to on page 39, first paragraph)?
Geoffrey, great discussion here and in HT mag. I have a better understanding of the 1080p/1080i issue. One aspect is still confusing, however: Gary Merson in the Nov HT also does a 3:2 film cadence test. I'm trying to understand when and if this aspect of a TV's performance is relevant. You have stated previously that the 3:2 pulldown can happen in various places. If it is done in a Blu-ray (etc.) player, then a TV could accept 1080p/60 if the player outputs it, or it could accept 1080i and deinterlace it. In these scenarios, I would think there would be no need for any further 3:2 processing -- is that correct? Similarly, with 1080i broadcast HD, hasn't the 3:2 pulldown already been done, so the TV only has to deinterlace? So is the 3:2 pulldown capability in a TV only for accepting 1080p/24 directly? If so, it would seem this isn't that important? Continued ...
Continued from last post... Case in point: Gary's results on the Sony KDL46XBR2. It gets a fail in all 3 categories, except that turning DRC off allows a pass in deinterlace and bandwidth. And if I am not mistaken, as long as you are not using a 24fps source, the 3:2 Film Cadence test would be irrelevant. Or am I missing something? BTW, it would be great if you reviewed some of the latest generation of 1080p LCDs -- perhaps addressing these issues. The Sony XBR2s have been referenced in HT Mag since August, yet with the exception of a very brief mention in "Coming Attractions," there has been no serious review of that generation. Finally, re: your debate with Aron on 1080i. I appreciate that the original intent of your 1080i/1080p articles was to point out that, due to the inherent nature of 24fps film, there is no real resolution difference between the two formats. However, given that so many 2006 TVs in Merson's article still fail the interlace test, how can you not advocate a 1080p source into a 1080p TV?
Hi, I have the same question as Ralph above... is the 3:2 pulldown capability in a TV only for accepting 1080p/24 directly? I'm very interested in this answer, as my set (Panasonic TH-50PX60U) passed the 1080i deinterlace test but failed the 3:2 film cadence test.
Another angle on 1080p: I have the Sony 60A2000, and am waiting for my service provider to deliver HD (4 days away!). In the meantime, I have been quite disappointed by the set's handling of 480i. I understand that the set must upconvert the 480i to 1080-something, and it has to do that either by internally converting analog to digital and then upconverting, or by upconverting the analog and then converting to digital. In either case, I'm not too happy. 480p from a DVD looks much better. I just got a Samsung DVD player that upconverts via Faroudja to 1080p, but got to thinking I should have gotten a DVR instead, as I don't imagine I'll be happy with my VCR either. I started looking for a DVR with Faroudja upconversion, but the best I could find was 1080i. A salesman said there would be no difference between 1080i and 1080p because the set would convert the i to p anyway. I know you've said that there is no difference between 1080i and p, but (cont'd...)
Geoffrey - Thanks for your relentless coverage of this escalating technology. In reading through your past articles on whether certain TVs properly de-interlace 1080i, I couldn't help but notice that you mostly concentrate on LCDs and plasmas. I purchased a Sony "Ruby" projector, and was wondering if it's doing the best that it can. Until I can purchase a good combo HD player, I want to make sure that I'm getting what I've paid for. Thanks. Matt
Just a question - every Best Buy, CC, or Sears store I've been in has been running coaxial cable to the back of their TVs so they can share the signal among all the sets. This, to me, makes comparing them in stores pointless, since that's not the signal I'd be using at home (component or HDMI). Am I missing something? Secondly, I've read many posts by former or current employees at these stores who say they often fudge TV settings to make certain TVs look better so that they get sold. Can we really trust any TV comparisons except by those who can afford to buy them or are lucky enough to be on a "reviewer" list for TV companies?
First off, all your articles are awesome. You're the best source of info on this topic that I've found thus far on the web, so thanks! But... I'm still a bit confused, as I DO NOT have a 1080p HDTV. I have a 5-year-old Sony 1080i HDTV which I love. Will my Sony 1080i direct-view HDTV be able to receive and correctly display the content from the latest Blu-ray players? And if so, what kind of a difference in picture quality can I expect?
Just wanted to ask: I have an LCD TV that accepts 1080i and not 1080p. Can I watch movies that are recorded in 1080p? Will they be downscaled to 1080i? What about movies that are recorded in 720p? Will they be upscaled to 1080i? If so, will they look better? Thanks