Eurovision Services' Franck Reynaud discusses the state of the art in live event production and the adoption of new technologies
Franck Reynaud from Eurovision Services recently spoke to French publication MediaKwest about the evolution of video formats and the challenges of production and distribution of UHD-HDR for traditional linear channels. Since the original interview was in French, we have translated it into English to make it available to a wider audience. We have also included questions which didn’t make it into the final interview in French for reasons of space, but which provide some extra insight into the subject.
Eurovision tested a 100 frames per second (fps) format with Sony at the 2018 European Athletics Championships in Berlin. Could this format be used for live events today?
No, it cannot yet, which is a pity. Although the UHD standard allows for quite a wide range of frame rates, the 50 fps format has been widely adopted by equipment vendors. The reason for this is simple: the electronic payload (or gross bandwidth) of a UHD signal is 12 Gbps in the quad-link 3G-SDI format, so it will just about fit into a 10 Gbps SMPTE ST 2110 IP connection. In live production, such rates are already a challenge to compress in order to produce slow-motion sequences or archives. By increasing the frame rate to 100 fps, you automatically double the bandwidth required. Most of the players in the broadcast industry have chosen not to keep pace with this in order to keep their equipment affordable.
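The arithmetic behind these figures can be sketched with a back-of-the-envelope calculation. This is not the exact SDI payload maths: it assumes 4:2:2 chroma subsampling at 10 bits per sample and counts only active video, which is why the result comes in below the roughly 12 Gbps gross rate of the quad-link 3G-SDI interface, whose figure also includes blanking.

```python
# Back-of-the-envelope active-video bandwidth for a UHD signal.
# Assumes 4:2:2 chroma subsampling (2 samples per pixel on average)
# and 10 bits per sample; blanking intervals are ignored, which is
# why the figure is lower than the ~12 Gbps gross quad-link 3G-SDI rate.

def active_video_gbps(width, height, fps, bits_per_sample=10, samples_per_pixel=2):
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

uhd_50 = active_video_gbps(3840, 2160, 50)    # ~8.3 Gbps: fits a 10 Gbps IP link
uhd_100 = active_video_gbps(3840, 2160, 100)  # ~16.6 Gbps: doubled, no longer fits
print(f"UHD 50 fps:  {uhd_50:.1f} Gbps")
print(f"UHD 100 fps: {uhd_100:.1f} Gbps")
```

Doubling the frame rate doubles the result, which is the point made above: the 100 fps signal no longer fits in a single 10 Gbps connection.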
Furthermore, the faster a camera sensor has to work (to deliver more frames), the more it heats up and the more unstable it becomes from a colorimetric point of view. It is difficult to develop the technology needed to counteract this phenomenon, so only a few cameras can work natively at 100 fps.
We shouldn’t forget either that the more you increase the frame rate, the more light you need at the location. The infrastructure for many indoor sports simply could not cope with such demands. For the moment, the target audience remains restricted to major sports events.
Are all the links in the production chain for live sports (capture, slow motion, graphics, mixing, external sources, contribution and distribution) ready for UHD-HDR?
UHD-HDR cameras are now almost commonplace on the market; each major brand offers a range of equipment. We are also seeing vendors bring out recording and conversion equipment for UHD-HDR workflows, but it is still quite expensive. On the other hand, broadcasters' appetite for UHD-HDR is still not there, so service providers are less willing to invest in it. This naturally puts a brake on the development of the format. We face the same situation, with the same financial constraints, for contribution and transmission equipment.
There are also different image capture formats for HDR (HLG, S-Log3) depending on the manufacturer or consortium, and the formats for distribution are still evolving. None of this makes things any easier for broadcasters or, by extension, for the final customer to get access to this format.
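As an illustration of one of the capture formats mentioned above, the HLG (Hybrid Log-Gamma) opto-electrical transfer function is defined in ITU-R BT.2100. A minimal sketch in Python, with scene-linear input normalised to [0, 1] and constants taken from the specification:

```python
import math

# HLG OETF as specified in ITU-R BT.2100: maps normalised scene-linear
# light E in [0, 1] to a non-linear signal value E' in [0, 1].
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # square-root segment for dark tones
    return A * math.log(12 * e - B) + C   # logarithmic segment for highlights

# The crossover point E = 1/12 maps to a signal value of 0.5,
# and peak scene light E = 1 maps to (almost exactly) 1.0.
```

The logarithmic upper segment is what lets HLG carry highlight detail far beyond what an SDR gamma curve can represent, while the lower segment stays close enough to conventional gamma to remain watchable on SDR displays.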
In terms of image clarity, wouldn't the move from 1080i to 1080p be more important than the move from 1080p to UHD, especially for sport, which needs a higher frame rate than entertainment?
I’m not so sure. There is clearly a benefit for the fluidity of slow motion when moving from 1080i to 1080p. However, progressive scanning introduces an element of visual staccato in fast panning movements (similar to cinema), which some viewers find even more distracting than the motion blur that interlaced scanning creates for rapid movements. Increasing the spatial definition (UHD) would make no difference here, so for me a move to a higher frame rate, or the introduction of new camera generations equipped with global shutters, would make more sense.
Depending on the sport (ice hockey or football, for example), some people in the industry seem more interested in high frame rates than in the extra pixels of UHD. Do you share this view?
Yes, I do. If you think about the speed at which the puck travels in an ice hockey match, you soon see the value of HFR for showing where it actually is on screen. Using the same bandwidth as a UHD signal at 50 fps, you could carry a 1080p signal at 200 fps. That gives you food for thought when you consider that a UHD close-up brings hardly any added value to the viewer.
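The trade-off described here is simple pixel-rate arithmetic: a UHD frame has four times the pixels of a 1080p frame, so at a fixed raw bandwidth that spatial resolution can be exchanged for frame rate. A quick sketch:

```python
# At a fixed raw pixel rate, spatial resolution trades off against frame
# rate: UHD (3840x2160) has exactly 4x the pixels of 1080p (1920x1080).
uhd_pixel_rate = 3840 * 2160 * 50   # pixels per second for UHD at 50 fps

hd_pixels_per_frame = 1920 * 1080
equivalent_fps = uhd_pixel_rate // hd_pixels_per_frame
print(equivalent_fps)  # 200
```

In other words, the same pixel budget that delivers UHD at 50 fps delivers 1080p at 200 fps, which is the figure quoted above.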
Does the arrival of UHD-HDR mean that cameras need to be repositioned in the venue and shots framed differently?
I don’t think so, but this is just my own personal opinion. Directors in sports programming are always looking for more immersive ways of showing the action, but the viewer will always need a descriptive foundation for the action. A wide-angle camera at a football match is still vital for understanding the action. But thanks to its higher definition and extended dynamic range, UHD-HDR reveals details that viewers couldn’t see before.
Could UHD-HDR change the storytelling at live sports events? Do you have an example?
Consider the last football World Cup in Russia, where there was an HD camera with a standard lens and a UHD camera with a wide-angle lens covering the same wide shot. The UHD camera showed a much wider angle than the HD camera and was used to give UHD viewers a more “static” description of the action that was less taxing on their eyes. Tests in sports broadcasting have shown that UHD images do not offer any real benefit for fast action filmed close up.
At sports events the weather can change, and this can have an impact on camera settings from one shot to the next. What techniques are used for camera shading between SDR and HDR?
This is the art of the shader, who has to focus on adapting the colorimetry and luminosity of the subject in real time depending on the production conditions. At the same time, one or more master shaders adapt the SDR to HDR, or vice versa, depending on the dominant camera type.
Compared with filming in HD-SDR, what are the technical and operational challenges of UHD-HDR production?
To understand the issue, it’s important to separate UHD (Ultra High Definition) from HDR (High Dynamic Range). Most UHD productions use SDR (Standard Dynamic Range). In this case, there are no major issues to consider, since the cameras just deliver a higher resolution (in other words a more detailed image). It is easy to offer backwards compatibility with HD-SDR.
HDR, on the other hand, involves an extended dynamic range and an extended colour space. You therefore need production monitoring that can replicate these colour spaces that were hitherto invisible to SDR cameras and at the same time reproduce much better degrees of luminosity. Recording and replicating such images requires new compression codecs that can support HDR and keep capacity requirements reasonable for recording.
Sometimes, specific cameras, such as hyper-motion cameras, which do not offer native HDR, also need to be integrated into the production workflow, so conversion systems are required.
You also need to maintain compatibility with the widespread HD-SDR standard and ensure that the two signals are delivered in parallel, with the correct dynamic range and in their own respective colour space.
Beyond production, is it now common to have a single workflow to distribute HDR and SDR, or do you need to duplicate technical and operational resources?
As your question hints, it would certainly not make sense financially to duplicate operational resources. Given the way the technology is evolving, it would be ideal if we could have an entire production chain with native HDR that offers backwards compatibility with SDR.
We are getting there slowly but surely, in the same way we did with HD in a world that was dominated by SD at the time. But at the moment a hybrid solution with a mixture of HDR and SDR systems is unavoidable.
Does the current state of technology force you into a compromise where HDR quality comes at a certain detriment to SDR?
I don’t think so. The technology is available. Obviously, it can still be improved and made more ergonomic but it is fit for purpose. However, we still have human beings using this technology and they need time to adapt to this new HDR standard. Understanding the full technical complexity of such a standard is not an easy task. We need to be patient and humble in how we learn to adapt to this new technology.
What about the bandwidth problem for contribution and distribution of UHD, which is holding up its adoption, especially for traditional linear channels on digital terrestrial television?
UHD was born at a time when the world’s satellite and fibre networks were not equipped to handle its gross bandwidth. H.265 (HEVC) encoding brings a UHD broadcast contribution down to around 64 Mbps without any noticeable loss in quality. This is higher than a standard HD contribution in H.264 at 22 Mbps, but it is still reasonable. Networks are growing and evolving, but so is the demand for bandwidth, so we are always looking for the next developments in compression technology.
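Taking the contribution rates quoted above together with a raw 4:2:2 10-bit active-video payload (an assumption; the interview does not state the codecs' input format, and if the HD feed is 1080i rather than 1080p the raw HD rate would be half this), the implied compression ratios can be estimated:

```python
def raw_bps(width, height, fps, bits_per_sample=10, samples_per_pixel=2):
    # Active-video rate assuming 4:2:2 chroma subsampling at 10 bits/sample.
    return width * height * fps * bits_per_sample * samples_per_pixel

# Contribution rates quoted in the interview
uhd_hevc = 64e6   # UHD contribution in H.265 (HEVC)
hd_h264 = 22e6    # HD contribution in H.264

uhd_ratio = raw_bps(3840, 2160, 50) / uhd_hevc  # ~130:1
hd_ratio = raw_bps(1920, 1080, 50) / hd_h264    # ~94:1 (assuming 1080p50)
print(f"UHD HEVC: {uhd_ratio:.0f}:1, HD H.264: {hd_ratio:.0f}:1")
```

On these assumptions, HEVC is being asked to squeeze roughly a third harder than H.264 does for HD, which is consistent with the point that each new format leans on the next generation of compression technology.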
When do you foresee HDR becoming widespread at live sports events, for production and distribution?
Major live sports events are often the catalyst for investment in and widespread adoption of new formats. Let’s hope that the Olympic Games in Paris in 2024 will see the advent of UHD-HDR and that some sports will subsequently be able to benefit from it over the long term.