Thanks for the great, informative article! Now I understand this whole 1080p stuff much more clearly!
1080i v. 1080p
For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT-based TV, every modern TV is progressive scan (as in LCD, plasma, LCoS, DLP). They are incapable of displaying a 1080i signal as 1080i. So what we're talking about here mostly applies to people with 1080p native displays.
Movies and almost all TV shows are shot at 24 frames per second (either on film or on 24fps HD cameras). All TVs have a refresh rate of 60Hz. What this means is that the screen refreshes 60 times a second. In order to display something that is 24fps on something that is essentially 60fps, you need to make up, or create, new frames. This is done using a method called 3:2 pulldown (or, more accurately, 2:3 pulldown). The first frame of film is doubled, the second frame of film is tripled, the third frame of film is doubled, and so on, creating a 2,3,2,3,2,3,2 sequence. It basically looks like this: 1a,1b,2a,2b,2c,3a,3b,4a… Each number is the original film frame. This lovely piece of math allows 24fps film to be converted for display on 60Hz products (nearly every TV in the US, ever).
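To make that cadence concrete, here is a minimal Python sketch (the function and names are purely illustrative, not from any player's firmware) that maps film frames onto 60Hz refreshes exactly as the 2,3,2,3 pattern describes:

    # Repeat each 24fps film frame alternately 2 then 3 times to fill
    # 60 refreshes per second.
    def pulldown(frames, pattern=(2, 3)):
        out = []
        for i, frame in enumerate(frames):
            out.extend([frame] * pattern[i % len(pattern)])
        return out

    film = ["1", "2", "3", "4"]   # four film frames = 1/6 second at 24fps
    print(pulldown(film))
    # ['1', '1', '2', '2', '2', '3', '3', '4', '4', '4'] -> ten refreshes = 1/6 second at 60Hz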
This can be done in a number of places. With DVDs, it was all done in the player. With HD DVD, it is done in the player to output 1080i. With Blu-ray, there are a few options. The first player, the Samsung, added the 3:2 to the signal, interlaced it, and then output that (1080i) or de-interlaced the same signal and output that (1080p). In this case, the only difference between 1080i and 1080p is where the de-interlacing is done. If you send 1080i, the TV de-interlaces it to 1080p. If you send your TV the 1080p signal, the player is de-interlacing the signal. As long as your TV is de-interlacing the 1080i correctly, then there is no difference. Check out this article for more info on that.
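The reason the de-interlacing can, in principle, be lossless for film-sourced material is that both fields of each frame were cut from the same film frame. A toy Python sketch of that round trip (illustrative only, not any player's actual code):

    # Split one progressive frame into two fields, then "weave" them back.
    frame = [f"line {n}" for n in range(1080)]   # one 1080p frame

    top_field = frame[0::2]       # field 1: every other line
    bottom_field = frame[1::2]    # field 2: the lines in between

    rebuilt = [None] * 1080       # weave de-interlacing
    rebuilt[0::2] = top_field
    rebuilt[1::2] = bottom_field

    assert rebuilt == frame       # nothing was lost in the round trip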
The next Blu-ray players (from Pioneer and the like) will have an additional option. They will be able to output the 1080p/24 from the disc directly. At first you may think that if your TV doesn't accept 1080p, you'll miss out on being able to see the "unmolested" 1080p/24 from the disc. Well, even if your TV could accept the 1080p/24, your TV would still have to add the 3:2 pulldown itself (the TV is still 60Hz). So you're not seeing the 1080p/24 regardless.
The only exception to that rule is if you can change the refresh rate on the TV. Pioneer's plasmas can be set to refresh at 72Hz. These will take the 1080p/24 and do a simple 3:3 pulldown (repeating each frame 3 times).
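For comparison with the 2:3 example above, the 3:3 cadence is trivially even (again, just an illustrative Python sketch):

    # At 72Hz, each 24fps film frame is simply shown three times (24 x 3 = 72).
    film = ["1", "2", "3", "4"]
    print([f for f in film for _ in range(3)])
    # ['1', '1', '1', '2', '2', '2', '3', '3', '3', '4', '4', '4']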
Short Version
What this all means is this:
• When it comes to movies (as in HD DVD and Blu-ray) there will be no visible difference between the 1080i signal and the 1080p signal, as long as your TV correctly de-interlaces 1080i. So even if you could input 1080p, you wouldn't see a difference (because there is none).
• There is no additional or new information in a 1080p signal from movie based content.
• The only time you would see a difference is if you have native 1080p/60 content, which at this point would only come from a PC and maybe the PS3. 1080p/60 does have more information than 1080i/30, but unless you're a gamer you will probably never see native 1080p/60 content. It is incredibly unlikely that they will ever broadcast 1080p (too much bandwidth) or that 1080p/60 content will show up on discs (too much storage space and no one is using it to record/film).
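For a sense of the raw numbers behind that last bullet, here is the back-of-envelope arithmetic (mine, not a measurement):

    # 1080p/60 delivers every line 60 times a second; 1080i/60 delivers
    # half the lines per pass (two 540-line fields per frame).
    p60 = 1920 * 1080 * 60
    i60 = 1920 * 540 * 60
    print(p60, i60, p60 / i60)   # 124416000 62208000 2.0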
So all of you people who bought 1080p displays only to be told by the companies that you had bought 1080i TVs, relax. The TV will convert everything to 1080p. Now if you bought a TV that doesn't de-interlace 1080i correctly, well, that's a whole other story.
Hi Geoff. Good article -- it's important for people to understand that, in principle, you don't lose information with 1080i. However, I think it's also important to emphasize that, in practice, deinterlacing is not trivial, and that therefore virtually all TVs (and progressive-scan DVD players) introduce some degree of error into the process (even if it's not the kind of wholesale error that you uncovered in your recent article on bob vs. weave). Further, is it really the case that you don't lose anything if your TV deinterlaces perfectly? I understand that interlacing is even harder than deinterlacing, and in order for your statement to be true, the interlacer would also have to be perfect (which it may not be). So while interlacing and deinterlacing between source and display can be done perfectly in theory, in practice you're probably better off if you can go directly from a 1080p source to a 1080p display without introducing those two extra conversion steps.
Follow-up to my previous post: Suppose you have a TV that accepts 1080p/60, but not 1080p/24. In that case, 2:3 pulldown is unavoidable for film-sourced material. But is it at least possible to avoid interlacing/deinterlacing? That is, will any of the next-gen HD DVD or Blu-ray players that output 1080p/60 implement 2:3 pulldown using entire frames (rather than by creating interlaced fields, as is done in the current players)? In other words, I'm looking for a player that will do 1080p/24 -> 1080p/60, rather than 1080p/24 -> 1080i/60 -> 1080p/60. Or does this alternative approach to 2:3 pulldown without interlacing (again, with frames instead of fields) create its own problems?
Loved the article. What about those of us who purchased the 720p DLP TVs that were top of the line last year? How can we take advantage of the additional lines of resolution of the new video formats? Are there any tips for the Toshiba 56HMX85?
I have a Vizio P50 720p plasma... I notice that if I output 1080i from the Xbox 360 and my Sony upscaling DVD player, the image "looks" better/sharper. How is that explained relative to this article? Does 1080i technically and visually look better on 720p displays? Curious to know! :)
So tell me this: on both my LCD and CRT computer monitors I'm sending an HD signal at a refresh of 72Hz. Now why doesn't the image look 'progressive' like some of the plasma/LCD/DLP sets at, say, Future Shop? The motion just doesn't seem as fluid even though I am running at the same or higher resolution!? This has been a huge question that has never been answered... or at least I've never found the answer. :P
1) I believe all HD-Ready branded HDTVs are obliged to support both 50Hz and 60Hz signals. The US *could* follow Europe, and play movies (slightly fast) with a 2:1 pull-down.
2) You presume that a 1080p/30 (or 1080p/24) source will reach the TV unmodified. Having a TV accept a 1080p/60 signal means that one could use an external scaler with a different (more expensive) motion prediction implementation, which may well result in a smoother perceived picture - even though the extra information is "made up". Similarly, a television with 1080p/60 *can* use the deinterlacer/scaler in the decoding device, whereas a television which only accepts 1080i forces the owner to accept its implementation, which may be inferior.
3) Although films are usually stored at 1080p/24, not all content has to be. Content aimed at the next-generation disc formats could sensibly use 1080p/60 - this would allow the best reproduction on both 1080i and 720p/60 screens. Sony *do* have an HDTV camera that records 1080p/60.
My $.02.
In your article, you said: "For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT based TV..." So what does this mean for those of us that DO have 1080i TVs? I own a 34" Sony Wega (model KV-34HS420). It is a CRT TV that supports 1080i, so I'm assuming that my type of TV is what you are talking about in the quote above.
Thanks, Geoff. I'd not realized the ATSC hadn't forced 50Hz compatibility in US HDTVs while Europe was requiring both 60Hz and 50Hz. It would've been a way to ensure that programmes imported from Europe didn't need pull-up, and would have meant one could produce one television worldwide. Since they didn't, I can see 72Hz would appeal! Re. external scalers: true, I doubt most people will buy boxes just for that (although some will), but it doesn't mean your Blu-ray player (which has the compressed stream's motion vectors to play with) might not do better than the TV. Why isn't this a disadvantage of a 1080i-input screen? Agreed, 1080p/60 won't be broadcast/in cinemas, but new discs have spare room. 1080i on 720p and 720p on 1080i lose quality. Why not get the best from both? Space and bandwidth get cheaper; it's only 2x. This could do as much to increase quality on the wrong screen as bandwidth gains over broadcast HDTV or h.264. Once stored, it's worth displaying directly. 1080i doesn't hurt *yet*, but it might.
As a PC gamer I do want to use a large-screen TV for games. From the above it seems important that I get a true 1080p set that accepts 1920x1080 resolution from my PC video card (which is able to produce that). Any recommendations for me re DLP vs. LCoS? (I figure LCD and plasma are not suitable.) Any recommendations on how to get the best image from my computer onscreen? Thanks all. This is a very useful forum.
You want to look for not just a TV that will accept 1080p, but one that will accept a 1080p signal from a computer (over RGB, DVI, or HDMI). Not all will. They almost always say in the owner's manual (which you should be able to find online).
3:2 pulldown on a 1080p/24 source to 1080p/60 will introduce judder. That is something you can't get around if your display can only do 60Hz, BUT there are quite a few displays on the market today and in the future that do support multiples of 24, like 48Hz or 72Hz. Most of the front projectors I've been looking at support multiples of 24. Isn't saying "All TVs have a refresh rate of 60Hz" misleading?
After reading the article and comments I'm still confused. Should the consumer population be laying out more money for a set capable of accepting 1080p input or not? If source content is unlikely to be 1080p in the near future, what's the big deal?
A question, please, from your average consumer. I am about to purchase a projector for my HT. I understand that 1080p presents all pixels simultaneously, while 1080i scans and presents fields several times each second (the 3:2 cadence). Do you suggest buying the 1080p or the 1080i? I plan on watching the new HD DVDs (don't like Sony and won't buy their products ever) along with my present collection of DVDs. Will my old DVDs look better on the 1080p, or will I only receive benefit from HD DVD on the 1080p? The kids probably won't be using the projector for games, to keep bulb life extended. After reading the article by Geoffrey I still have to ask: if the 1080i is converting the signal up to the 1080p standard, will I really be able to recognize the difference? I plan on using a 120-inch 16x9 screen. Also, does the new 1080p give you less eye fatigue when viewing for extended periods, because it may be a purer picture? Thank you.
But a 720p display has only about 44% of the pixels of a 1080p display. That's why having a 1080p display (all digital displays are fixed-pixel) is ideal for a 1080i signal. Displaying 1080i on a 720p display is not as good as displaying it on a 1080p display. Hence everyone is running to get a 1080p display so that they can display a 1080i signal in its true originality.
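For reference, the pixel math behind that percentage works out as follows (simple arithmetic, not a spec):

    p1080 = 1920 * 1080   # 2,073,600 pixels
    p720 = 1280 * 720     #   921,600 pixels
    print(p720 / p1080)   # 0.444... -> a 720p panel has ~44% of 1080p's pixels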
So, as I'm sure many of us are wondering... which is better? 1080p DLP, or 720p (768p) plasma like Samsung's new HP-S5072? If we want plasma, should we wait for 1080p plasma (although it probably won't be "affordable" for another year or two)?
I purchased the October HT issue at Barnes & Noble and found it to be very informative. I am going to subscribe. My old Mitsubishi RPTV crapped out on us, so we will be buying a new HDTV very soon. I was close to pulling the trigger on a Samsung HL-S5686W 720p model, but this whole 720p-or-1080p dilemma has made me slow down and dive into the technology. Your stance of "You don't need 1080p" has made me flip back to seriously considering buying it (not to mention you have it as a favorably reviewed unit on the website); however, it is not listed as PASS or FAIL on your list of 1080i de-interlacing test results. Do you know if this model passes or fails? Thank you for your expertise, and I look forward to reading HT for many years to come.
Wow! Thanks for finally explaining 3:2 for me. Now if you can just explain why we need HDMI. Who listens through their display? Source to A/V receiver, OK. But many projectors don't even have audio capability. Is it purely for the so-far-unused copy protection? Thanks HT, for everything I know about, well, HT!
So here's a question. If the source you're watching is 1080p, and the TV is 1080p, then wouldn't that look more fluid than a 1080i TV? Especially at a size like 72" or something of that nature. Just throwing the frame rate out and factoring in the way an interlaced signal is drawn in comparison to a progressive signal.
Very helpful article -- but what about the TVs that accept 1080p/24? (The new Mitsubishi HC5000BL projector, for example.) In that case, doesn't the 1080p matter (and hence a 1080p/24 output from, say, a new Blu-ray player will look better on it)? Then it's never being converted from or to anything, and will have fewer imperfections than would result from the various conversions you mention. My concern here is, interestingly, the opposite of what you're talking about -- I'm not worried about the TV; I'm trying to figure out if it matters that the first-generation hi-def players output 1080i whereas the second generation will output 1080p. Obviously, if it doesn't matter, why pay more for the Toshiba HDX-A2 if the HDX-A1 (or the A1 for that matter) can output 1080i?
This article was fabulous! But can I confirm: my potential DVD player purchase (Oppo 970 or 971) will output 1080i through its component outputs? And if I pass it through my Marantz THX Reference SR 12 AV amp, it will also pass the 1080i to my Hitachi plasma (which is 1080i capable), even though this amp has no HDMI? I felt pretty sad when I was under the impression only HDMI passed 1080i signals. Thanks, P
OK, for those of us who will not change their viewing area for 5-10 years: I'm looking at a 40-46" Sony Bravia XBR2. The viewing distance will be ~10-12 feet. I would think the image would improve dramatically with 1080p over 720p from that distance. Is that correct, assuming an HD cable box and an HD DVD player capable of 1080p output?
Just bought a Hitachi 42HDS69 plasma. I saw it in the store, and it looked great, but it's at home where it counts. Does anyone know if this will de-interlace 1080i correctly? Three reasons I bought this: 3 HDMI connections, 1024x1080 resolution, and $1500 out the door (CircuitCity.com, with a 1-day $200 discount on all big screens). But it needs to perform well, otherwise it's not worth it. Any input on this Hitachi plasma?
I just purchased a JVC 56" 1080p LCoS rear-projection HDTV. The colors are great and some of the HD stations are perfect. I just finished watching the Super Bowl, and many of the moving players were very blurry or pixelated. Would I get the same result if I traded it in for the Sony? Is this as good as it gets? Does the refresh rate have anything to do with the poor picture quality? Watching hockey is sad as well because they move so fast. Please help me.
Some months ago I bought a Philips 42-inch 9731D. I bought it thinking it was 1080p, and resolution-wise it is, but it only accepts 1080i, and it was a big letdown for me to learn what this means. Now, I am happy to know that theoretically there is no loss in quality if it deinterlaces correctly, but how can I know it is doing that correctly? Does anyone know if this is a good LCD in that regard? Any comments? George?
Just to point out that you are contradicting yourself: "THERE IS NO DIFFERENCE BETWEEN 1080i AND 1080p." vs. "The only time you would see a difference is if you have native 1080p/60 content." Make up your mind which sentence represents your position and change the article so it brings clarity instead of adding more ambiguity to the 1080i/1080p discussion.
I have a PS3 on a Philips 32PF5320/28. If modern TVs take the 1080i and convert it to either 720p or 1080p, why won't my PS3 output 1080p to the TV? Am I actually getting 1080p? (The PS3 output is set to automatic and only goes to 480p, 720p, and 1080i.)
I'm preparing to purchase all new AV gear. I've been researching HDTVs and read with great interest Gary Merson's article dated October 2006, "Are you getting. . . round 2". I don't see a way to contact him with my question, so I'll try you. I'm looking for a 45-50 inch LCD display, seating distance 9-10 feet; cable will provide most input, with the occasional DVD from the video store. Based on the pass/fail table for the deinterlace test and 3:2 film cadence test, I see no LCD HDTV larger than 37" that passes both tests. What's the practical effect of this? Do I have to choose a set that passes one test vs. another? Thoughts?
I believe there is a problem converting 1080i to 1080p. Progressive frames are created at the same instant. Interlaced frames are created with the odd lines in one half-frame and the even lines in another. The two half-frames are produced 1/60th of a second apart. When the two half-frames are combined to produce one progressive frame, any motion in the frame is blurred. I have a Canon HV-10 HD camcorder. It produces 1080i video. Any motion in the video is blurred if played on my computer (progressive scan) but looks good when down-converted to 480i and played on an old-fashioned TV (except for the aspect ratio and resolution, of course). I have seen no specs on HD camcorders that produce 1080p video, and no HD TVs that display 1080i natively. This means moving objects filmed on HD camcorders will always be blurred when played on an HD TV. What's the point of taking video if moving objects are blurred? You might as well take stills.
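To see why weaving two fields captured 1/60th of a second apart tears moving edges, here is a toy model (purely illustrative, not camcorder code):

    # An "object" (#) moves two columns between the two field captures, so
    # the woven frame's alternating lines disagree: a comb-shaped artifact.
    def scanline(x, width=12):
        return "." * x + "#" + "." * (width - x - 1)

    frame = []
    for line in range(6):
        x = 3 if line % 2 == 0 else 5   # even lines from field 1, odd from field 2
        frame.append(scanline(x))
    print("\n".join(frame))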
OK. I understand most of this stuff, but I am relatively new to this HD bandwagon and have a very silly question! In an ad for a leading HD TV (32-inch LCD), it said that the screen resolution is 1366 x 768, BUT the ad also says that its display capability is 1080p. How is this possible if the screen resolution is only 768 pixels in height? Thanks!
What would you say to an article that seems to understand all the details you mention but states that 1080P is worth it? http://hometheaterhifi.com/volume_14_1/feature-article-1080p-3-2007-part...
Hi there, I've just bought a Sony AV amp which has an HDMI switch box, but it says it is 1080i. Will this also transmit a 1080p picture? I've been told that the switch box would work well as it is amplified, but I don't want to lose any picture quality. I'm connecting a PS3 (for Blu-ray) and a Sky HD box to my 1080p Samsung LCD, which does have 2 HDMI ports, but using the amp would save me having to buy two 5m HDMI leads. What would you recommend I use? Thanks for any advice.
All new projectors and HDTVs are progressive devices, so the picture you watch, if 1080 (1920 pixels wide by 1080 pixels high), is a 1080p picture (not 1080i). All 1080i means in the specs is that they will accept a signal coming in that is interlaced, but it is converted to progressive 1080p inside the HDTV or projector as it is displayed. No new HDTVs or projectors (LCD, DLP, plasma, LCoS) are interlaced devices; they are all progressive. The 1080i/1080p appearing in the specs refers only to the input signal they will accept and handle properly. A 720p display (1280 pixels wide by 720 pixels high) that accepts a 1080i signal will "interpolate" the pixels from 1920x1080 to 1280x720, so it will lack some detail, but will still look as great as a 1280x720 picture can. "Interlaced" came from old CRT TVs, which were 525 lines but scanned as two 262.5-line fields at 60Hz, with the second field scanned "interlaced" in between the lines of the first. This was done to reduce visible 30Hz flicker.
Sorry, couldn't finish my last sentence. The interlacing was only done to REDUCE THE FLICKER seen at the 30Hz frame rate on old CRT TVs, and this allowed them to increase the effective "field" rate to 60Hz while still maintaining the same smaller video bandwidth. If not for that, TVs from day one could have been "progressive scan", where each line is progressively drawn one after the other, rather than interlaced in between previous lines.
I offer this piece of advice to everyone: go to a store and LOOK at the TVs, and decide based on empirical evidence from your own eyes which TV looks best to you. Then look at the price tag and figure out where you want to compromise. Making an HDTV purchasing decision based on technical specs alone is not a good idea. Check out the specs after you've looked at the picture.
I have a PS3 that is currently outputting 1080p, but I only have a Vizio 42" plasma that maxes out at 1080i. I'm trying to understand why the TV is reading the signal as 1080p even though there's no setting for that on my TV. The info button on the TV says 1080p, so is it really receiving that? If not, what is the best setting for games and movies? Thank you, Adrian.
This main post may have been true a year ago. Would the author say the same today about this LCD TV: the Philips 42PFL9732, Full HD 1080p, 100Hz? Sorry, I don't know anything about all this and am trying to learn from you. In other words, what does he say about the TVs available on the market today versus those in 2006?
But you failed to mention that TVs often sold as 1080i-capable are actually only 1366x768 resolution. Most are that way these days, in fact. The image is downconverted to 1366x768 resolution. What you are saying could mislead a person into thinking that a 1080i-capable TV (which has only 1366x768 resolution) is the same as a 1080p TV. This is certainly not the case.
My plasma is a Panasonic TH-58PX60U built in Feb 2007, but it is a 1080i set. The writer, Geoffrey, mentions 1080i TVs are no longer being made as of the date he posted, Aug 2006. I hope there is no difference in picture quality, as he says.
First there was VHS, then DVD, then progressive-scan DVD for HD TVs, then Blu-ray; maybe I missed a few steps. I can't understand much of what you are saying. All I know is that in the movies I watch they put way too much makeup on now and look fake. The only really great scenes I see are superheroes with costumes or cartoons, but I'm not much into watching movies where I can see the pores of people's faces; it never got this bad till Blu-ray. I have an HD TV and am completely happy with it, and would never get a Blu-ray player; 1080i, as it says, or whatever it is, is fine for me. People who need more are missing something in their lives, or just have way too much money and should maybe be spending it on people who don't have so much in other countries, and start thinking of more wonderful things to do with their time than figuring out how much better they can get their movies to look. Maybe this is a negative statement to most, but they're the people trapped in this world; in the end you will never, ever get the best.
How much better? I have a 4-year-old Mitsubishi WS-55413 that has resolutions of 480i, 480p, and 1080i. It doesn't do 720p for some reason. Anyway, I have been considering purchasing a new 1080p HDTV to get the full benefit of my new Panasonic DMP-BD30 Blu-ray player. Even though my Mits is over 4 years old, it still has a great picture, especially with Comcast HD channels. My question is: how much better is 1080p compared to 1080i? Is it really noticeable? Thanks. Eddie
Just wanted to mention that this line: "Movies and almost all TV shows are shot at 24 frames-per-second (either on film or on 24fps HD cameras)." is incorrect. Movies are shot at 24, yes, but all NTSC SD broadcasts (the VAST majority of TV in North America) are shot and broadcast at 29.97fps. The majority of HDTV shot in North America is also 29.97fps. This has been the case since the introduction of colour. PAL is shot and broadcast at 25fps. There's also SECAM, but I have absolutely no clue what the specifications for that format are. In North America/Europe at least, TV is VERY rarely shot at 24fps.
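For the curious, that odd-looking 29.97 figure is the nominal 30fps slowed by a factor of 1000/1001, a change made when colour was introduced; a one-line check:

    print(30 * 1000 / 1001)   # 29.97002997...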
Wow! When I began reading this thread my eyes were opened and I saw the light. But the further I read, the dimmer the light got! Now I need to ask a question about my new 46" Sharp Aquos LCD HDTV (model LC-46D43U). It says it supports 1080i (HDTV), 480i (SDTV), 480p (EDTV), 720p (HDTV), and 768p. But I have no idea whether it will also -- as you said above -- correctly display (i.e., deinterlace) 1080i so that it displays as good a picture as 1080p. (I only use it to watch movies and TV.)
Let me make this simple. If you are using a 1080p source, it will look better on a 1080p display than it will on a 1080i display. Case closed! Stop confusing people with your interlacing/de-interlacing mumbo jumbo. I have a Blu-ray player and currently watch it on a 1080i rear-projection TV. However, I will now be replacing my set with a full HD 1080p TV because it looks better. This is not rocket science, people. Also, for those of you who are tired of seeing blurred HD sports when images are moving quickly on your HD TV, it is because plasma is far better at motion rendering than LCD or rear projection. So, if you want the best picture when purchasing a new home theatre, get a Blu-ray player (HD DVD is toast now) and buy a full HD 1080p plasma. The Pioneer PDP-5010FD is the best-reviewed set on the market at the moment and you cannot go wrong with that.
Great post, and thanks for all the additional comments as well. That being said: on my Toshiba 62MX195, which is a 1080p set, when I am watching a cable show (e.g. HD Discovery), the TV displays that the signal coming in is 1080i. So does that mean my TV is displaying it at true 1080p resolution?
First off, all shows are filmed at 24fps, hence the 2:3 pulldown; 29.97fps is the corresponding result. If you go look at a film camera in a studio, it does 24fps, and a digital camcorder does 29.97fps because it is digital, but how many movies are shot with a camcorder? Second, the only time that motion looks blurred on interlaced video is when you pause it; otherwise it looks like regular film. 1080p is a marketing gimmick. I work for the 4th largest cable company in Canada; HD TV will never be broadcast in 1080p, no bandwidth. Look at your computer: how much power does it take to send a 1080p signal? You are sending 6 megabytes a second. That is a lot, and a 90-minute movie is 32GB uncompressed, and if it is compressed then you aren't getting your full resolution. Right now most TV stations are lucky to broadcast 4 hours of HD content, and most of the HD content isn't 1080i. The "HD revolution" is a ploy to sell TVs to everyone who has TVs and to free up carrier waves for wireless internet. If anyone
Sorry about that, Java messed up and I hadn't noticed. Anyway, all digital TV is still around the 480i or 480p range and still looks better on a CRT than an HD TV. So unless all you plan on watching is Blu-ray movies, don't rush to get an HD TV, because all I get all day is service calls to houses from people complaining about how bad 480i looks on their 50" plasma compared to their 34" CRT (which is almost as big, considering the 50" is widescreen and the 34" is 4:3, and 95% of TV is still 4:3, so unless you like short, fat people or stretched pictures, don't rush into it). Wait till almost all shows are HD (true HD). It's just like when color TV came out: for years only a couple of shows a day were in color, the rest in B&W, and most people just bought color TVs when their old B&W set broke. Just my thoughts as a person who gets 2-3 service calls a day about poor HD quality, when it's just lack of HD content.
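For anyone weighing the bandwidth claims a couple of comments up, the uncompressed numbers are easy to work out (my back-of-envelope arithmetic, assuming 8-bit 4:2:0 video; real broadcasts and discs are compressed far below this):

    rate = 1920 * 1080 * 60 * 1.5      # bytes/second at 1.5 bytes per pixel
    movie = rate * 90 * 60             # a 90-minute film
    print(rate / 1e6)    # ~186.6 MB/s raw
    print(movie / 1e9)   # ~1008 GB raw -- hence heavy compression everywhere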
Hello, I just bought a Philips 42PFL7962D (TV). I'll use this in combination with the LG LH-RH760IA (DVD). The TV uses 1080p while the DVD player outputs 1080i. HDMI is connected. It's not possible to change the resolution on the TV itself; I can only do that on the DVD player. The best image I get is when I use 576i/p. When I go to 720 or higher (1080i), the screen gets less clear, since the signal is not well converted. Do I really need to buy a scaler, or can this problem be fixed with a cable, etc.? http://www.p4c.philips.com/files/4/42pfl7962d_12/42pfl7962d_12_pss_nld.pdf