
A history of television: TV inventor, digital broadcast, HD and 4K

We take a look at the history of television, from its invention to digital broadcasts, HD and 4K

Is it a bit much to say that television changed the world? No, it most certainly is not. These days we can watch news reports live from anywhere on earth; we have seen a man land on the moon and watched presidents being assassinated. TV has unquestionably changed the world.

And in these days of Netflix streaming 4K video and OLED TVs as thin as paper, it seems worth taking a look at the journey TV has been on, how it has improved over the years, and what might be next for the medium.

Exactly who invented TV is a hotly contested matter. Ask an American and they'll mention Edison; ask a Brit and we might say John Logie Baird. The truth is that a lot of people had input into what ultimately became a global phenomenon.

The very start

The first TV-related patent was granted in 1884 to Paul Gottlieb Nipkow, a 23-year-old student in Germany. Nipkow invented an "image rasterizer", a spinning scanning disc, although he was never able to build a working one. His approach was used in most of the electro-mechanical TV systems that were trialled before all-electronic versions were developed.

Paul Nipkow

Things began to heat up in 1925 when, on 25th March, John Logie Baird demonstrated television pictures of moving silhouettes at Selfridges. In May, AT&T's labs in the US ran a similar test with halftone silhouettes, and then in June Charles Francis Jenkins did much the same, this time sending the signal five miles from a radio transmitter.

John Logie Baird apparatus

These events, in themselves, had little to do with the final invention of television. They were not TV images as we understand them today; they were crude and had very little detail. In January 1926, though, Baird managed to get a TV image working at 12.5 frames per second, and at the end of that month he presented his findings to members of The Royal Institution. The result was still generated through a partly mechanical process, and the resolution was somewhere around 30 lines.

Baird went on to broadcast signals from remote locations, and his system was eventually selected as one of two that would compete to become the standard for TV in the UK. It was complex and required that footage be shot on film and then scanned. Ultimately, the BBC would end up using the rival Marconi-EMI system, which had a resolution of 405 lines.

Over in America, Philo Farnsworth was busy perfecting his electronic system using something called an image dissector, an early version of the camera tubes that would end up capturing live video right up until the 1990s. Indeed, in April 1933 Farnsworth applied to patent a device that was essentially a cathode ray tube camera sensor.

Farnsworth image dissector tube

The UK remained an important place for TV, thanks to the BBC, and it was home to the first regular TV broadcast service in the world. The EMI team continued to make improvements to technologies invented by others, too, and in 1937 performed an outside broadcast of Armistice Day.

Colour TV

In the black and white era, the US and UK had very similar systems. The US had a higher frame rate, even then, at 30fps, while the UK used 25fps. This was a direct result of the mains frequencies, 60Hz in the US and 50Hz in the UK, to which the interlaced field rate was locked, and it would have been hard to get around with the technology of the time.
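As a rough illustration of that relationship (a sketch of the arithmetic only, not anything broadcasters actually ran), here's how the frame rates fall out of the mains frequencies:

```python
# Illustrative sketch: analogue TV locked its interlaced field rate to the
# local mains frequency to avoid visible hum and flicker artefacts.
MAINS_HZ = {"US": 60, "UK": 50}

for region, mains in MAINS_HZ.items():
    field_rate = mains            # one field per mains cycle
    frame_rate = field_rate / 2   # two interlaced fields make one full frame
    print(f"{region}: {mains}Hz mains -> {field_rate} fields/s -> {frame_rate:.0f}fps")
# US: 60Hz mains -> 60 fields/s -> 30fps
# UK: 50Hz mains -> 50 fields/s -> 25fps
```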

When colour TV arrived, things changed. The US created a clever system that sent the colour information separately from the black and white signal, which made the move to colour relatively painless. Obviously, new sets were required to view colour, but because the colour portion of the signal was additional, it didn't stop existing black and white sets from handling the black and white part of the signal.

The downside of this system, known as NTSC after the National Television System Committee that developed it, was that the colour information had to be reduced in quality to fit into the allowed space. The US had 6MHz available for each channel, whereas when the UK got PAL later, it would have a more capacious 8MHz of bandwidth into which colour signals could be placed.

As a broadcast system, NTSC was clever, but it had big problems with colour accuracy. Notably, transmission and atmospheric conditions could affect how the image looked, earning it the joke acronym Never The Same Colour. PAL, on the other hand, came along later and fixed a lot of these problems. It also had the advantage of offering 625 lines to NTSC's 525.

BBC Colour test card

PAL and NTSC were later replaced by digital transmission standards. The Americans went for ATSC while in the UK we opted for DVB-T.

Improvements in image quality

During the analogue era there were lots of changes that gradually improved the quality of TV pictures. The first was the arrival of outside broadcast video cameras. Before these arrived, anything shot outside a studio would usually be recorded on film and then converted before broadcast. See the likes of Monty Python for examples of how different on-location segments look from those shot in a studio.

Old TV camera

Eventually, tube cameras were reduced in size and power consumption, and could be taken out into the field. This had a huge impact on consistency, and it sped up things like TV news enormously.

Tube cameras had some problems of their own, though: point one at a bright light and you would see trails that persisted for several seconds after the light source had gone. It was also possible to burn a camera tube permanently if you pointed it at a bright light for too long. And tubes were large, heavy and needed a lot of power to operate.

Eventually, though, the CCD arrived on the scene and changed everything. The move to cameras that used these little chips instead of bulky tubes meant that camera sizes shrank considerably. This was also the point at which TV broadcasts started to look better than ever before: while there hadn't been a resolution increase or other massive leap, picture quality continued to improve until digital cameras came on the scene and changed everything again.

Digital television

With digital came a new era, and it's still striking just how much things have changed since analogue was retired. For one thing, digital has reduced the costs of TV production considerably. These days, domestic video equipment can be as capable as professional kit: you can get 1080p video cameras that are good enough to be used for TV production and that cost just a few thousand pounds.

Panasonic Z1000

But for most people, the biggest advantage of digital was the new choice of channels. This was possible because, where analogue TV signals can't be bundled together in one block of frequencies and must be well separated from each other, digital signals can be multiplexed. Factor in the reduction in data needed to transmit compressed digital signals, and you can fit several channels into the space once consumed by a single analogue channel.
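To give a feel for the arithmetic, here's a rough sketch; the multiplex capacity and per-channel bitrates below are illustrative assumptions, not official Freeview figures:

```python
# Rough, illustrative multiplex arithmetic. The capacity and bitrates are
# assumptions chosen for the example, not broadcaster specifications.
MUX_CAPACITY_MBIT = 24.0   # assumed usable payload of one DVB-T multiplex,
                           # occupying the space of a single analogue channel

assumed_sd_bitrates_mbit = {
    "news (low motion)": 2.0,
    "general entertainment": 3.0,
    "sport (high motion)": 4.5,
}

for genre, mbit in assumed_sd_bitrates_mbit.items():
    channels = int(MUX_CAPACITY_MBIT // mbit)
    print(f"{genre}: roughly {channels} SD channels where one analogue channel used to fit")
```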

And, of course, the move to digital also brought some quality improvements: proper widescreen, and a much sharper image with better colour reproduction than PAL could offer. Digital TV was a great upgrade, even if it lacked HD when it launched.

Of course, nothing is perfect, and to save money some broadcasters have pushed the MPEG and DVB-T systems too far. The result is Freeview channels with too much compression applied to them: when scenes move quickly, you can get unpleasant macroblocking, with picture quality degraded to something you might see on a 360p YouTube video.

In TV terms, though, the digital revolution was important because it increased the choice of channels and allowed the technology to move forward: first with high-quality widescreen broadcasts, which were something of a stopgap between the end of analogue and the start of HD, and then with the eventual move to HD itself.

In the UK, the move to digital happened quite rapidly. On 1 October 1998 Sky launched its new Sky Digital platform, delivering a whole new range of channels into the homes of those with a compatible dish and receiver. On 15 November the same year, DVB-T broadcasts started over the air too, with a pay-TV service called OnDigital launching alongside broadcasts from the BBC, ITV, Channel 4 and Channel 5.

HD Television

It took a long time for the UK to get HD. While we were launching our digital TV services, the Americans were already up and running with their HD systems. Because of a lack of bandwidth and the number of existing services, it took until 2010 for HD to launch on Freeview in the UK, and then with a very modest number of channels. Initially, the BBC had just BBC HD, but this was later joined by a simulcast of BBC One in high definition; in 2013, the BBC shut BBC HD and replaced it with BBC Two HD. Sky, of course, has been a pioneer in HD from the start. Sport has driven this, along with movies, and there are now 60 HD channels on Sky's platform.

Sky HD box

As frustrating as it was to have such a long wait for HD to arrive in the UK, it did allow us time to perfect the service. For example, Freeview HD uses MPEG-4 and DVB-T2 to transmit video. These are far more efficient than the American system of ATSC and MPEG-2, so we are able to house far more HD channels over the air, and mix them with the older MPEG-2 standard definition broadcasts. As a guide, a 6MHz US TV frequency block allows for about 20Mbit/s of data, while in DVB-T2 it’s possible to get 40Mbit/s in our 8MHz channels.
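Put those quoted figures side by side and the difference is obvious. In the sketch below the per-channel HD bitrates are assumptions for illustration, not broadcaster specifications:

```python
# Back-of-the-envelope comparison using the capacities quoted above.
# The per-channel HD bitrates are assumed values for illustration only.
systems = {
    "ATSC, 6MHz, MPEG-2":   {"mux_mbit": 20, "assumed_hd_mbit": 12},
    "DVB-T2, 8MHz, MPEG-4": {"mux_mbit": 40, "assumed_hd_mbit": 8},
}

for name, s in systems.items():
    hd_channels = s["mux_mbit"] // s["assumed_hd_mbit"]
    print(f"{name}: about {hd_channels} HD channel(s) per frequency block")
```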

It’s interesting to note that two new systems are in the works too: 1024-QAM would increase the bandwidth of a multiplex to 50Mbit/s, and 2048-QAM would allow another 25 per cent increase over that. It’s uncertain whether these systems will make it to Freeview, as they would need new decoder hardware, but it’s likely that by the time a decision is made about 4K on Freeview, it will be possible to use them to increase capacity further. This, combined with the new H.265 video compression used for 4K delivery, could make ultra high definition a possibility over the air.
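As a very rough sense-check of what that extra headroom might mean (the 4K bitrate here is purely an assumption, since no over-the-air UHD service exists to quote):

```python
# Quick arithmetic on the proposed modulation upgrades mentioned above.
current_dvb_t2_mbit = 40.0
qam_1024_mbit = 50.0                  # figure quoted in the text
qam_2048_mbit = qam_1024_mbit * 1.25  # "another 25 per cent" on top of that

# Assumed H.265 bitrate for one 4K/UHD stream, purely for illustration.
ASSUMED_UHD_MBIT = 20.0

for label, capacity in [("DVB-T2 today", current_dvb_t2_mbit),
                        ("1024-QAM", qam_1024_mbit),
                        ("2048-QAM", qam_2048_mbit)]:
    streams = int(capacity // ASSUMED_UHD_MBIT)
    print(f"{label}: {capacity:.1f}Mbit/s, room for roughly {streams} UHD stream(s)")
```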

4K and the future

Although, as mentioned above, some sort of broadcast 4K service could be theoretically possible, the truth of the matter is that it’s probably not going to be worth it. By the time we get to a point where that’s a viable option, it’s likely that most homes will have internet connections fast enough to deliver 4K either in real time or via a download.

The 4K revolution

Broadcast TV has always had, and will always have, one major problem: it’s very expensive. It’s also probably not the best use of our limited radio frequency spectrum, when we could instead use that space to provide nationwide ultra-fast internet access over 4G or, one day, 5G.

TV is in rude health now. The quality of both the images and the programming is higher than it has ever been, but the way we get TV is ailing. Live broadcasts really only suit news and sport these days. People don’t want to be told when to watch Game of Thrones; they just want to see it before some idiot on Twitter ruins the end of episode two for them.

As with the music and film industries, the TV companies have resisted a change in their business model. After all, the art of TV scheduling has been around a long time, and it’s obvious that those who make their living doing it don’t want to give it up. But what Netflix, and even illegal downloads, are teaching the entertainment industry is that people don’t care about channels or schedules; they just want their TV when they want to watch it.

So perhaps in 20 years’ time broadcast TV will have gone away, and all of our 8K super-ultra-HD will be delivered via our mobile phones. Only time will tell, but we can feel things changing for TV, and faster than ever before.
