Do 1080p TVs Downconvert 4K Video Signals?
Is it possible that the native signal was instead 1080p, and the 4K TVs were the ones doing the resolution conversion?
Q I recently went to my local electronics store to check out the TVs. Every set, whether 4K or 1080p, was displaying 4K programming, and I have to say that even the 1080p sets looked great when showing 4K. Here’s my question: Do 1080p TVs downscale 4K video, or does the 4K server perform that function before passing the signal on to the TV? —Bill Carman / via e-mail
A All 4K content must first be downconverted to 1080p by the server or an external video processor before an HDTV can display it. If you were to pass a native 4K signal directly to an HDTV's HDMI input, it would fail to sync, and all you would see is a blank screen.
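Since UHD's 3840 x 2160 grid is exactly twice 1080p's 1920 x 1080 in each dimension, the simplest form of that downconversion maps each 2 x 2 block of 4K pixels to a single 1080p pixel. Here is a minimal sketch of that idea, a hypothetical box-filter average; real video processors in servers and scalers use far more sophisticated filtering:

```python
def downscale_2x2(frame):
    """Downconvert a frame by averaging each 2x2 block of pixel values.

    `frame` is a list of rows of brightness values with even width and
    height (e.g., 2160 rows of 3840 values for a UHD frame); the result
    has half the resolution in each dimension (1920 x 1080 for UHD input).
    """
    height, width = len(frame), len(frame[0])
    return [
        [
            (frame[y][x] + frame[y][x + 1] +
             frame[y + 1][x] + frame[y + 1][x + 1]) // 4
            for x in range(0, width, 2)
        ]
        for y in range(0, height, 2)
    ]

# A tiny 4x4 stand-in for a 3840x2160 frame: each 2x2 block collapses
# to one pixel, so the output is 2x2.
uhd_patch = [
    [10, 10, 20, 20],
    [10, 10, 20, 20],
    [30, 30, 40, 40],
    [30, 30, 40, 40],
]
hd_patch = downscale_2x2(uhd_patch)
```

Note that the 2:1 ratio is also why 4K material can look so clean on a 1080p set: every output pixel is built from four source pixels, which acts as a mild noise filter.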
That’s not to say that only true Ultra HD TVs can display 4K programs. Sharp’s new line of Quattron+ sets can accept 4K signals at up to 30 fps via an HDMI 1.4 connection. But while Q+ sets can accept 4K input, they technically are not Ultra HDTVs: Sharp splits subpixels in the display to deliver better-than-high-def detail, though resolution still falls short of UHD’s 3840 x 2160 pixels.