Remote and Smart Media Production Incorporating User-generated Content

Due to steadily rising costs, broadcasters are looking for new, low-cost and time-saving production methods that incorporate participatory media and user-generated media archives into the production. Such methods are now grouped under the term Smart Production. One sub-category of smart production is Remote Production. Conventional productions need large teams and long preparation times: audio and video equipment is physically moved to outside broadcast sites, where it is set up, configured and tuned for the specific production activity. Another time-consuming part is the set-up and operation of a control room for sound and vision engineers, editors and the directing team. To reduce complexity and cost, more and more productions therefore take place remotely. In a remote production, the control room is at a fixed location, usually in the broadcaster's facility, and the equipment at the venue is controlled remotely from this room. However, remote links that deliver the required performance often need dedicated connections, which are only feasible at acceptable cost if productions recur regularly from the same locations. 5G technology can provide a capable alternative for more flexible and ad-hoc set-ups while still satisfying the low-latency and high-quality requirements. Virtual encoding and compression engines have the potential to replace dedicated encoder hardware, and cognitive network optimization algorithms combined with QoS-monitoring techniques can improve significantly over current Internet best-effort practice to ensure the required performance.
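As a minimal sketch of the last point, the snippet below shows the kind of decision a QoS-monitoring loop might feed into a virtual encoder: pick a contribution bitrate from measured throughput and packet loss. The bitrate ladder, headroom factor and loss threshold are all illustrative assumptions, not values from the 5G-MEDIA project.

```python
# Illustrative sketch only (not 5G-MEDIA project code): choosing an encoder
# bitrate from QoS measurements, as a cognitive optimizer might do continuously.

# Hypothetical bitrate ladder in kbit/s, from proxy quality up to contribution quality.
BITRATE_LADDER = [2_000, 8_000, 20_000, 50_000]

def select_bitrate(throughput_kbps: float, loss_rate: float,
                   headroom: float = 0.8, max_loss: float = 0.01) -> int:
    """Pick the highest ladder rung that fits within a safety headroom.

    Steps down one rung when packet loss exceeds max_loss, since loss
    usually signals congestion before throughput estimates react.
    """
    budget = throughput_kbps * headroom
    candidates = [b for b in BITRATE_LADDER if b <= budget]
    choice = candidates[-1] if candidates else BITRATE_LADDER[0]
    if loss_rate > max_loss and choice != BITRATE_LADDER[0]:
        choice = BITRATE_LADDER[BITRATE_LADDER.index(choice) - 1]
    return choice

print(select_bitrate(30_000, 0.001))  # ample capacity, low loss -> 20000
print(select_bitrate(30_000, 0.05))   # same capacity, congestion -> 8000
```

The design choice here is deliberately conservative: loss is treated as an earlier congestion signal than throughput, which matters for live contribution where a stall is worse than a lower bitrate.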

Another smart production area is the transmission of high-quality content back into the studio at broadcast-quality performance levels. Today, interviews on the street are often recorded and only processed and edited in the studio at a later stage. Highly relevant live content, e.g. for breaking news, often suffers from poor or unreliable quality. 5G infrastructure should improve on this by ensuring that audiovisual material from remote and mobile reporters is delivered reliably and at high quality, without relying on dedicated lines and equipment.

Scenario 1

On the venue side, a small production team places the equipment and interconnects it via a 5G network. Virtual encoding engines are deployed for the processing of the video and audio streams. A virtual control unit is responsible for forwarding the control signals received from the studio to the cameras. In the studio, the preview stream is available and the video engineers can adjust technical parameters such as focus, aperture and colour grading. The director can create the final broadcast stream, which is then distributed through the established broadcast distribution channels. Another option is to create the directed picture off-site with the help of a Media Process Engine deployed as a virtual application on the 5G infrastructure. In that case, a control signal indicating which video stream should be active and which audio channel should be used is sent to the off-site engine, where a virtual encoding engine creates the final broadcast stream. Finally, the off-site generated broadcast stream can be distributed.
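The control path in this scenario can be sketched as a small message that the studio sends to the virtual control unit at the venue: which video stream and audio channel to put on air, plus camera parameter tweaks. All names and the JSON-over-wire format below are assumptions for illustration, not the project's actual protocol.

```python
# Illustrative sketch (hypothetical names, not a 5G-MEDIA API): a control
# message from the studio gallery to the venue-side virtual control unit.
import json
from dataclasses import dataclass, field

@dataclass
class ProgramControl:
    active_video: str                                   # e.g. "cam-2"
    active_audio: str                                   # e.g. "mix-1"
    camera_params: dict = field(default_factory=dict)   # focus, aperture, grading...

    def to_wire(self) -> bytes:
        """Serialise for transport over the 5G link (JSON for readability)."""
        return json.dumps({
            "active_video": self.active_video,
            "active_audio": self.active_audio,
            "camera_params": self.camera_params,
        }).encode()

    @classmethod
    def from_wire(cls, payload: bytes) -> "ProgramControl":
        data = json.loads(payload)
        return cls(data["active_video"], data["active_audio"],
                   data.get("camera_params", {}))

# Studio side: cut to camera 2 and nudge its aperture.
msg = ProgramControl("cam-2", "mix-1", {"cam-2": {"aperture": 4.0}})
# Venue side: the virtual control unit decodes and applies the command.
received = ProgramControl.from_wire(msg.to_wire())
print(received.active_video)  # cam-2
```

Keeping the on-air selection and the camera adjustments in one message is just a simplification; in practice these would likely travel on separate control channels with different latency requirements.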

Scenario 2

At a news event, contributors (professional journalists as well as regular users) may access media production applications deployed on a cloud-based 5G infrastructure via a smartphone. A virtual compression engine adjusts the required bandwidth, and a media-aware cognitive network optimizer deployed within the 5G infrastructure dynamically adjusts the network forwarding graph according to the specific requirements and conditions. This ensures that the best possible quality is delivered to the broadcaster, so the live contribution can be used in a timely manner in live reporting at broadcast quality.
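What "media-aware" adjustment of the forwarding graph could mean is sketched below as a toy path-selection rule: for live news contribution, latency and loss matter more than raw capacity, so candidate paths are scored accordingly. The weighting, path names and measurements are all assumptions for illustration.

```python
# Illustrative toy model (not the 5G-MEDIA optimizer): media-aware scoring of
# candidate forwarding paths between a contributor and the broadcaster.

def path_score(latency_ms: float, loss_rate: float, capacity_kbps: float,
               needed_kbps: float) -> float:
    """Lower is better; paths that cannot carry the stream are ruled out."""
    if capacity_kbps < needed_kbps:
        return float("inf")
    return latency_ms + 2_000 * loss_rate  # assumed weighting for live news

def choose_path(paths: dict, needed_kbps: float) -> str:
    """Return the name of the best-scoring path with sufficient capacity."""
    return min(paths, key=lambda name: path_score(*paths[name], needed_kbps))

# Hypothetical measurements: (latency ms, loss rate, capacity kbit/s) per route.
paths = {
    "direct":   (40.0, 0.020, 12_000),
    "edge-mec": (25.0, 0.002, 10_000),
    "backup":   (90.0, 0.001, 50_000),
}
print(choose_path(paths, 8_000))  # edge-mec: lowest combined latency/loss score
```

A real optimizer would of course work on a full forwarding graph and re-evaluate continuously as conditions change; the point here is only that the selection criterion is driven by the media service's requirements rather than by generic best-effort routing.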

Use Case 2 overview of H2020 5G-MEDIA Project by Gordana Polanec-Kutija