Most TVs today have two modes with similar names: 1080i and 1080p. Both have the same screen resolution, so what’s the difference between the two? Which one should you choose?
1080i video is “interlaced.” A 1080i signal is broadcast at 30 frames per second, but each frame is split into two fields that the TV draws in two passes, so the screen refreshes 60 times per second. The first pass draws a 1,920-by-540 field made up of the even scan lines, and the second pass draws a 1,920-by-540 field made up of the odd scan lines. The process of weaving those two fields together is called interlacing. It contributes to a sense of motion and reduces perceived flicker.
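As a rough illustration (not any broadcaster’s actual pipeline), here is a minimal Python sketch of how a full frame can be split into even and odd fields and then woven back together:

```python
def split_fields(frame):
    """Split a full frame (a list of scan lines) into even and odd fields."""
    even_field = frame[0::2]  # lines 0, 2, 4, ... (the "even" field)
    odd_field = frame[1::2]   # lines 1, 3, 5, ... (the "odd" field)
    return even_field, odd_field

def weave_fields(even_field, odd_field):
    """Interlace the two fields back into a single full-height frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

# A toy 1,080-line "frame": each entry stands in for a full 1,920-pixel scan line.
frame = [f"line {n}" for n in range(1080)]
even, odd = split_fields(frame)
print(len(even), len(odd))               # 540 540 -> two 1,920-by-540 fields
print(weave_fields(even, odd) == frame)  # True: weaving restores the original frame
```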
1080p video is called “progressive scan.” In this format, 1,920-by-1,080-pixel high-definition movies are progressively drawn line after line, so they’re not interlaced. On paper, that may not seem like a huge deal. But in the real world, what you end up seeing looks sharper and more defined than 1080i, particularly during scenes with a lot of fast motion.
Sometimes 1080p is termed “full HD” or “true HD,” to distinguish it from 1080i or 720p video. Blu-ray discs store 1080p video at 24 frames per second; the player or TV then uses a method known as 3:2 pulldown to fit those 24 frames into the 30-frame (60-field) cadence the screen expects.
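To see why it is called 3:2 pulldown, here is a small Python sketch of the cadence: alternating film frames are held for three fields and then two fields, so 24 film frames fill 60 display fields each second. This only illustrates the pattern, not a real video pipeline:

```python
def pulldown_3_2(film_frames):
    """Map 24 fps film frames onto 60 Hz display fields using a 3:2 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2  # hold frames for 3 fields, then 2, alternating
        fields.extend([frame] * repeats)
    return fields

# One second of film: 24 frames.
film = [f"frame {i}" for i in range(24)]
fields = pulldown_3_2(film)
print(len(fields))  # 60 fields per second, i.e. 30 interlaced frames
```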
Both formats look similar on smaller TVs. As a general rule, you need a larger TV to notice the difference between 1080i and 1080p. Depending on your eyesight, you may be able to pick up the difference on a 32-inch LCD if you look for it, but most consumers don’t see a marked difference until at least a 42-inch screen, if not larger.
1080p isn’t even the best anymore. Technology never stands still, of course. High definition meant 1080p (1,920 by 1,080) resolution for years, but it’s quickly being overtaken by ultra high definition (UHD) television, commonly called 4K.
A UHD or 4K display is one with at least 8 million active pixels. For televisions, that resolution has standardized to 3,840 by 2,160. Digital cinema 4K (the resolution in 4K movie theaters) is slightly higher at 4,096 by 2,160. However you define it, it’s four times the number of pixels on a 1080p display, and over 23 times the resolution of standard definition television.
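The arithmetic behind those comparisons is straightforward; here is a quick sketch, assuming a 720-by-480 frame as the standard-definition baseline:

```python
# Pixel counts for the resolutions discussed above.
uhd = 3840 * 2160       # 8,294,400 pixels: "at least 8 million"
full_hd = 1920 * 1080   # 2,073,600 pixels
sd = 720 * 480          # 345,600 pixels (NTSC-style standard definition, assumed here)

print(uhd / full_hd)    # 4.0  -> four times the pixels of a 1080p display
print(uhd / sd)         # 24.0 -> consistent with the "over 23 times" figure above
```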
If you don’t have a 4K source video, a 4K TV can still make your movies and shows look better. All 4K televisions use some kind of upconverter to display 1080p and lower resolution video. These upconverters do more than just break each pixel into four identical pixels; they employ edge smoothing and noise reduction algorithms to produce, ideally, a sharper picture. When it works well, you get video that looks natural on a 4K screen (though it doesn’t add any actual new details, just sharper lines and more even color and light). When it doesn’t, the picture can look a bit blotchy, like a painting.
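As a loose illustration of the difference between naive pixel duplication and a smoother upscale, here is a minimal Python sketch that doubles the size of a tiny grayscale image both ways. Real TV upconverters are far more sophisticated; this only shows the basic idea of blending neighboring pixels instead of copying them:

```python
def duplicate_2x(image):
    """Naive upscale: turn each pixel into a 2x2 block of identical pixels."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]
        out.append(doubled)
        out.append(list(doubled))
    return out

def smooth_2x(image):
    """Simple smoothed upscale: blend neighboring pixels for the in-between values."""
    h, w = len(image), len(image[0])
    out = [[0] * (2 * w) for _ in range(2 * h)]
    for y in range(2 * h):
        for x in range(2 * w):
            # Map the output pixel back to a (possibly fractional) source position
            # and blend the two nearest source pixels on each axis.
            sy, sx = min(y / 2, h - 1), min(x / 2, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bottom = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            out[y][x] = round(top * (1 - fy) + bottom * fy)
    return out

# A tiny 2x2 "image" with a hard edge between dark (0) and bright (255) pixels.
tiny = [[0, 255],
        [0, 255]]
print(duplicate_2x(tiny))  # the edge stays a hard step, just bigger
print(smooth_2x(tiny))     # in-between pixels are blended, softening the edge
```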
While regular viewers struggle to see the difference between 1080p and 720p on smaller televisions, it’s much more obvious on 50-inch and larger TVs. 4K is another significant jump in clarity and detail, especially as people grow used to the incredibly tiny pixels on today’s high-resolution mobile screens.
Source: PCMag.