There were plenty who didn’t see the need for 4K resolution – but it’s happening. 8K is probably no less inevitable, as Ian McMurray describes.
It may well have passed you by, as the results were viewable by a very small audience in Japan – but the 2016 Olympics was broadcast in 8K resolution. The first-ever 8K outside broadcast vehicle – built by Ikegami – was deployed, with eight 8K cameras onboard. Back in Japan, the results were shown to the public on a 200” screen. This was a test, in preparation for the 2020 Olympics, which will be held in Japan and which local broadcaster NHK has said will be broadcast in 8K – although the company plans to start regular 8K transmissions next year.
But that’s the world of broadcast. It’s significant that it will be the Olympics that will see the first major deployment of 8K infrastructure – significant because, so far, the industry is agreed that what has really driven the adoption of 4K resolution is sports. Yes, there is an increasing amount of 4K content available from the likes of Netflix and Amazon – but regular broadcast? Not so much – except in the case of sports.
In the UK, for example, pay TV providers Sky and BT Sport have been broadcasting Premier League games in 4K since 2015. It’s also a regular feature of American sports broadcasting. But, if I understand correctly, in Australia, the major channels are still thinking about it – and there are even channels that have not yet committed to broadcasting in HD.
But: what of the pro AV world? (Although there’s little doubt that in many ways pro AV and broadcasting are converging).
Size is important
The first thing to say is that many commentators agree that the difference between 4K and 8K is only visible on screens above 85” diagonal. That’s not to say, of course, that customers deploying smaller screens won’t go for 8K: there are plenty of organisations out there for whom image is everything, and who like to be on the leading edge.
For all practical purposes, however, the 8K market is likely to be limited to larger screens – and the question becomes: what proportion of installations are looking to deploy 85”+ screens? (There’s also the issue of the practicability of transporting, delivering and mounting such large screens – and the huge strides made by video walls, especially those based on LED technology, which are modular and now feature bezels so small as to effectively present a single, continuous image.)
However: let’s say that there will indeed be a market for 8K. What are its implications? It’s all about the continuing proliferation of pixels. 8K delivers 16 times the pixel count of HD – and four times that of 4K. That’s a staggering 33 million pixels – per frame. Add to that the complexity created by technologies such as high dynamic range (HDR) and wide colour gamut (WCG) – and the processing task becomes a ferocious one. Add to that the increasing use of augmented reality and virtual reality… Uncompressed, we’re probably looking at a requirement of around 50Gbps.
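The arithmetic behind that figure is easy to check. Here is a quick sketch – assuming 60 frames per second and 24-bit colour, which the uncompressed figure appears to presume – of the data rate an 8K stream generates:

```python
# Back-of-the-envelope check on the uncompressed 8K figure.
# Assumptions (not stated in the article): 60fps, 24 bits per pixel (8 per channel, RGB).
width, height = 7680, 4320            # 8K UHD resolution
pixels_per_frame = width * height     # ~33.2 million pixels per frame
fps = 60
bits_per_pixel = 24

bits_per_second = pixels_per_frame * fps * bits_per_pixel
print(f"{pixels_per_frame / 1e6:.1f} Mpixels per frame")
print(f"{bits_per_second / 1e9:.1f} Gbps uncompressed")
```

That works out to roughly 47.8Gbps – hence the ‘around 50Gbps’ figure.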
Here’s what interests me, though. Moving huge numbers of pixels around the network has seen the rise of codec technology. The rationale is, of course, that actually transmitting that amount of picture data would bring a network to its knees – so how do we reduce the amount of data we move? We’ve had JPEG2000, H.264 and now, HEVC/H.265 – all algorithms designed to minimise the amount of data transmitted while maximising image quality. One goal is to provide the viewer with a ‘visually lossless’ experience: the image should look as if it had never been compressed.
The second is to minimise latency in order to minimise ‘glass-to-glass’ time – the time it takes an image to move from the camera lens to the viewer’s screen. That latency, in large part, is a function of the time to compress an image at the sending end and decompress it at the receiving end. The more processing that has to be done – which depends on the number of pixels and the available bandwidth – the longer the delay.
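To make the glass-to-glass idea concrete, here is a minimal sketch with purely illustrative numbers – none of them measurements of any real codec or network. Its point is that a codec which buffers one full frame at each end has already added two frame-times of delay before the network is even considered:

```python
# Hypothetical glass-to-glass latency budget. All figures below are
# illustrative assumptions, not measurements of real equipment.
def glass_to_glass_ms(encode_ms: float, network_ms: float, decode_ms: float) -> float:
    """Total camera-lens-to-screen delay for one compressed hop."""
    return encode_ms + network_ms + decode_ms

# A codec that buffers one full frame at 60fps adds ~16.7ms at each end.
frame_ms = 1000 / 60
total = glass_to_glass_ms(encode_ms=frame_ms, network_ms=5.0, decode_ms=frame_ms)
print(f"glass-to-glass: {total:.1f} ms")  # ≈ 38.3 ms
```

Even with a generous 5ms of network transit, the compression stages dominate – which is exactly why more pixels mean more delay.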
Of course, compression and latency are anathema to the pro AV world, which has long prided itself on delivering pristine images in no time at all. Talk to integrators, and that often comes up as a reason why they view the progressive IT-ification of the industry with a distrust bordering on dislike. (When I was at Texas Instruments and we were creating DLP technology, the joke was always that our development engineers could spot a visual artifact at five miles…)
Minimising latency and image quality loss while maximising network throughput is a big challenge, and one that has got harder with 4K – and will get even more so with 8K. It will take codecs well beyond what we have now. Questions arise about whether software codecs – even running in the best commercial hardware available – will be man enough. Does the future belong to hardware codecs?
To me, it seems inescapable that, even with the very best efforts of codec developers, processing 16 times as many pixels as we mostly do today will inevitably mean some degradation in image quality (thus pretty much destroying the whole point of 8K resolution) and extended latency. For most pro AV applications, latency is less of an issue – if an image arrives at its destination in a second, rather than half a second, it’s no big deal. For some, though – mission-critical applications in command/control centres (who are perhaps most likely to be early adopters of 8K systems) – latency is a bigger concern.
Here, though, is where, in my mind, things get really interesting. Codecs, first and foremost, acknowledge that there isn’t enough network bandwidth to move all the bits we need to move. But supposing there were no limitation on network bandwidth?
Today’s Ethernet networks predominantly operate at 1Gbps. However: 10Gbps speeds are becoming increasingly commonplace; there are occasional deployments of 40Gbps switches; and the industry is gearing itself up for the widespread deployment of 100Gbps networks.
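A rough capacity check puts those link speeds in perspective. Assuming an uncompressed 8K/60 stream at 24-bit colour – roughly 48Gbps, an assumption rather than a figure from any standard – this sketch counts how many such streams each common Ethernet speed could carry:

```python
# Illustrative capacity check: how many uncompressed 8K streams fit on
# each common Ethernet link speed? Assumes 8K/60 at 24-bit colour.
stream_gbps = 7680 * 4320 * 60 * 24 / 1e9   # ~47.8 Gbps per stream

for link_gbps in (1, 10, 40, 100):
    streams = int(link_gbps // stream_gbps)
    print(f"{link_gbps:>3} Gbps link: {streams} uncompressed 8K stream(s)")
```

Only a 100Gbps network carries even a couple of uncompressed streams – everything slower needs a codec.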
Of course, all that’s fine – but what about when the network goes outside the building into the wild, wild west that is the Internet? Today, the fastest download speeds available max out at around 100Mbps (we can dream, right?). That’s still pretty phenomenal compared with the 28.8k dial-up download speeds that we used to ‘enjoy’ back in the day – but it’s a very far cry from 100Gbps.
My point, though, is that if network speeds weren’t an issue – we wouldn’t need codecs. We could deliver images in their original form, with no latency – just like we used to do…
It’s not, of course, just about network speeds and capacity. There are other issues. Take DisplayPort, for example. According to VESA, DisplayPort 1.3 will be able to support 8K video at 60fps and 24-bit colour using a 2:1 compression ratio, or 30-bit colour using a 2.5:1 compression ratio. The new HDMI 2.1 spec includes support for 8K. 8K is on the HDBaseT roadmap. Those kinds of things all need to be worked out.
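Those VESA figures can be sanity-checked. DisplayPort 1.3’s HBR3 mode carries 32.4Gbps raw, of which 25.92Gbps is usable payload after 8b/10b line coding; this sketch (assuming 8K at 60fps) confirms that both quoted compression ratios bring the stream under that ceiling:

```python
# Sanity check on the quoted DisplayPort 1.3 figures.
# HBR3: 4 lanes x 8.1 Gbps = 32.4 Gbps raw; 8b/10b coding leaves 25.92 Gbps payload.
link_payload_gbps = 32.4 * 8 / 10

def needed_gbps(bits_per_pixel: int, ratio: float) -> float:
    """Bandwidth an 8K/60 stream needs after compression at the given ratio."""
    return 7680 * 4320 * 60 * bits_per_pixel / ratio / 1e9

for bpp, ratio in ((24, 2.0), (30, 2.5)):
    rate = needed_gbps(bpp, ratio)
    print(f"{bpp}-bit at {ratio}:1 -> {rate:.1f} Gbps, fits: {rate <= link_payload_gbps}")
```

Both cases land at about 23.9Gbps – just inside the link’s payload capacity, which is presumably how VESA arrived at those ratios.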
In all this, of course, it doesn’t really matter whether anyone actually needs or wants 8K resolution. In theory, it should be a niche market. But that’s not the way the world works. It’s newer, it’s better, it’s a way of getting us all to upgrade the screens we only bought a couple of years ago. It’s going to happen, whether we like it or not. Turning it into a reality that the pro AV community can not only live with, but embrace, promises to be a significant challenge – but, as it has always done, it’s a challenge the industry will certainly rise to.