
What is HD? 720p vs 1080i vs 1080p

Sony X93C TV

Wondering what all of the different HD TV options are and how they stack up? We have the answers

Although Ultra HD 4K TVs, with their 3,840×2,160 resolution, are rapidly becoming standard, HD content is going to remain the most common source for a long time to come. Unfortunately, it’s not that easy to decipher exactly what HD is, as it’s not a single standard: you’re likely to see 720p, 1080i and 1080p all quoted. For example, the Now TV box uses a 720p stream, while Netflix can deliver 1080p content (and 4K if you’ve got the right type of TV). So, what are the real differences?

720p vs 1080i vs 1080p resolution

The number used in each standard tells us how many pixels tall an image is. So a 720p image is 720 pixels high, while 1080i and 1080p images are both 1,080 pixels high. This convention dates back to the old analogue TV system, where standards were described by the number of lines an image was high: PAL TV, for example, was broadcast at 576 lines.

What the standard doesn’t tell you up front is the width of the image. All HD standards assume that the picture has an aspect ratio of 16:9; in other words, for every 16 horizontal pixels, you have nine vertical ones. To work out the width, you divide the number of vertical pixels by nine and multiply by 16. This means that a 720p image is 1,280 pixels wide (a resolution of 1,280×720 and a total of 921,600 pixels), while 1080i and 1080p images are both 1,920 pixels wide (a resolution of 1,920×1,080, a total of 2,073,600 pixels). This might make 1080i and 1080p sound the same, but there’s a crucial difference between them.
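
If you want to check the arithmetic, the short Python sketch below (the hd_dimensions function name is just for illustration) derives the width and total pixel count from the vertical resolution, assuming a 16:9 picture:

```python
def hd_dimensions(height, aspect_w=16, aspect_h=9):
    """Work out the width and total pixel count for a given vertical resolution."""
    width = height * aspect_w // aspect_h   # e.g. 720 / 9 * 16 = 1,280
    return width, width * height

for height in (720, 1080):
    width, total = hd_dimensions(height)
    print(f"{height}: {width}x{height} = {total:,} pixels")

# 720: 1280x720 = 921,600 pixels
# 1080: 1920x1080 = 2,073,600 pixels
```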

TV frame size comparison

1080i vs 1080p – interlaced vs progressive

The difference between 1080i and 1080p is down to the letter, where ‘i’ means interlaced and ‘p’ means progressive. With a progressive picture, you get all 1,920×1,080 pixels at once. With an interlaced image, the picture is split in two: the first field has the full horizontal resolution but only carries the even lines; the second carries the odd lines. Effectively, you halve the bandwidth by sending half of the information at a time (1,036,800 pixels vs 2,073,600 pixels). Interlacing is largely used for TV broadcasts for this very reason.
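
To make that split concrete, here’s a minimal Python sketch, using NumPy purely for illustration (it’s not how any real broadcast encoder works), that divides a dummy 1080p frame into the two half-height fields an interlaced signal would carry:

```python
import numpy as np

# A dummy greyscale 1080p frame: 1,080 rows of 1,920 pixels
frame = np.zeros((1080, 1920), dtype=np.uint8)

even_field = frame[0::2]   # lines 0, 2, 4, ... -> 540 x 1,920
odd_field  = frame[1::2]   # lines 1, 3, 5, ... -> 540 x 1,920

print(even_field.shape)    # (540, 1920): 1,036,800 pixels per field
print(frame.size)          # 2,073,600 pixels in the full frame
```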

Interlaced video

This used to be fine with CRT TVs, which were designed to work with interlaced video, but modern screens have to convert an interlaced image into a progressive one first. To do this, your TV has to put the image back together, and how it does so depends on the original footage. If the original was shot at 25fps and interlaced later, creating 50 half-frames per second, then your TV just has to wait until it has both half-frames and put them back together. This is easy, as TVs in the UK can display up to 50 frames per second (50fps), while TV shows are only recorded at 25fps.
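
For that easy case, putting the fields back together amounts to ‘weaving’ them into their original line positions. A minimal sketch, assuming the even_field and odd_field arrays from the snippet above come from the same source frame:

```python
import numpy as np

def weave(even_field, odd_field):
    """Recombine two fields from the same source frame into one progressive frame."""
    height, width = even_field.shape[0] + odd_field.shape[0], even_field.shape[1]
    frame = np.empty((height, width), dtype=even_field.dtype)
    frame[0::2] = even_field   # even lines go back to rows 0, 2, 4, ...
    frame[1::2] = odd_field    # odd lines go back to rows 1, 3, 5, ...
    return frame

# With the fields from the previous sketch, weave() reproduces the original frame exactly.
```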

However, if the footage was shot using an interlaced video format, there’s information missing and it’s not as simple as stitching two half-frames back into one: in a fast-moving scene, the second half-frame could be completely out of sync with the first. In this case, your TV needs some clever processing to create a single image, filling in the missing information intelligently.
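
Real TVs use motion-adaptive processing that’s far cleverer than this, but the simplest way to fill in the missing lines from a single field is to interpolate them from the lines above and below. This hypothetical sketch shows the idea:

```python
import numpy as np

def interpolate_field(field):
    """Build a full-height frame from a single field by averaging neighbouring lines."""
    height, width = field.shape
    frame = np.empty((height * 2, width), dtype=field.dtype)
    frame[0::2] = field
    # Fill each missing line with the average of the field lines either side of it,
    # repeating the last line at the bottom edge.
    below = np.vstack([field[1:], field[-1:]])
    frame[1::2] = ((field.astype(np.uint16) + below) // 2).astype(field.dtype)
    return frame
```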

That’s the theory, at least, as not all TVs are very good at combining the fields, which can lead to artefacts. This goes double if the source material was originally shot at a higher frame rate, as differences between the two fields can mean that the image doesn’t line up properly. Fixing this requires some rather intense processing, with varying degrees of success. My advice is to see if your output device has a 1080p mode, and then use that to see if it gives you better results.

What’s best, interlaced or progressive?

Without a doubt, progressive video is best, as each frame is delivered in its entirety, so you don’t get artefacts or need to deinterlace the material. While interlacing was a good way to save bandwidth, modern compression techniques largely do away with the need for it: 4K TV, which has four times the resolution of 1080p, will be broadcast with a progressive signal and can run at 25fps or even 50fps for smoother motion.

What kind of TV do I need?

Given the range of content that’s available for 1080p televisions, it makes sense to buy one of these at a minimum. The one exception is a small (22in or less) TV for a kitchen or bathroom, where the size of the display means you probably won’t see the difference in quality.

But what about 4K TVs? Prices are tumbling and the content is certainly improving, with the likes of the Panasonic DMP-UB900 and Samsung UBD-K8500 Ultra HD Blu-ray players now available, although the majority of content will still be HD for a few years to come. One worry people have about 4K is whether these TVs will make HD content look bad, as it has to be upscaled to fit the new TV’s resolution. In fact, a 4K TV has exactly four times as many pixels as a 1080p TV: given the same screen size, a 4K TV has four pixels where a 1080p TV has one. So, if you watch 1080p content on a 4K TV, it simply uses four pixels to represent each pixel of the 1080p image, so the quality is at worst the same and at best, thanks to better image processing in new TVs, even better. Read our guide to the best TVs (4K and 1080p) for more information.
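
That one-to-four pixel mapping is easy to see in code. The sketch below is a simplified, hypothetical nearest-neighbour doubling; real TVs use far smarter scalers, but the pixel counts work out the same way:

```python
import numpy as np

def upscale_2x(frame):
    """Map each source pixel onto a 2x2 block of the higher-resolution panel."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

hd_frame = np.zeros((1080, 1920), dtype=np.uint8)
uhd_frame = upscale_2x(hd_frame)
print(uhd_frame.shape)   # (2160, 3840): every 1080p pixel now covers four 4K pixels
```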
