NAB Show


Capture Anywhere, Produce Anywhere: 8K Immersion, Deep-Sea UHD and Multi-Vendor Live Workflows

Tuesday, April 21 | 11 a.m. – noon | N256

Broadcast Engineering and IT (BEIT) Conference

From immersive screens to extreme environments, broadcast imaging is expanding in both capability and context—while production infrastructures become more software-defined and multi-vendor. This session highlights three breakthroughs: a compact 8K×8K square-format camera enabling IMAX reframing, high-resolution VR capture, and real-time ROI switching for broadcast; a UHD deep-sea camera system engineered for full broadcast fidelity at 7,000 meters using robust optics, pressure-tolerant design, and fiber telemetry; and the EBU DMF Media eXchange Layer (MXL) SDK, an open, high-performance interoperability layer connecting containerized live production functions across vendors with ultra-low latency.

Subsessions

  • Compact 8K×8K Camera: Imaging the Immersive Era

    Tuesday, April 21 | 11 – 11:20 a.m. | N256

    Kodai Kikuchi

    The development of image sensors for video production has traditionally focused on improving resolution in horizontal formats such as the 16:9 aspect ratio. However, the growing demand for wide field-of-view content for large cinema screens, dome theaters, and head-mounted displays requires vertical imaging coverage that exceeds the capabilities of conventional sensors. In the broadcasting industry, it is important to explore ways for conventional framing and immersive content production to coexist effectively. To address this challenge, we developed a compact 8K×8K image sensor with a square 1:1 aspect ratio. This sensor incorporates backside-illuminated stacked pixels and a proprietary high-speed readout circuit, achieving an optimal balance between compact size (5/6-type) and high throughput (59 megapixels, 60 fps, 14-bit). Compared to recent large-format high-resolution sensors, our sensor enables a more compact camera design that is better suited to high-mobility immersive shooting scenarios. Utilizing this technology, we developed an 8K×8K camera system and conducted multiple shooting experiments and field trials. This report outlines the camera system and introduces three novel applications: 1) IMAX Production: Wide-angle footage captured with a fisheye lens is reframed to generate full-size IMAX content (1.43:1 aspect ratio), incorporating pan, tilt, and zoom effects from any desired viewpoint during post-processing. 2) Immersive Content Production: Single or multiple fisheye images are converted into 180- or 360-degree VR (virtual reality) videos at 15K resolution, enabling highly detailed immersive experiences for virtual tours, flying theaters, and educational applications. The camera's compact form enabled footage to be shot from drones and electric bicycles. 3) ROI Switching for Broadcast: Wide-angle studio footage is segmented into multiple ROIs (regions of interest) and output in real time, allowing dynamic video switching with a single camera setup in conventional broadcasting workflows. These advancements demonstrate that compact, square-format, high-resolution sensors open up new possibilities for producing broadcast content through greater flexibility in multi-platform deployment and improved workflow efficiency.
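    The pan/tilt/zoom reframing the abstract describes (extracting a rectilinear view from square fisheye footage in post, or as a live ROI) can be sketched with standard projection math. The function below is an illustrative sketch only, not the authors' pipeline: it assumes an equidistant fisheye model (radius proportional to incidence angle) covering 180 degrees, and uses nearest-neighbour sampling for brevity.

```python
import numpy as np

def reframe_fisheye(img, yaw, pitch, fov_deg, out_w, out_h):
    """Extract a rectilinear ROI (virtual pan/tilt/zoom view) from a
    square equidistant-fisheye frame. Nearest-neighbour sampling."""
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    # Equidistant model: radius = f_fish * theta; a 180-degree fisheye
    # maps theta = pi/2 to the image edge (assumption).
    f_fish = (w / 2.0) / (np.pi / 2.0)

    # Ray through each pixel of the virtual pinhole camera.
    f_out = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                         np.arange(out_h) - out_h / 2.0)
    rays = np.stack([xs, ys, np.full_like(xs, f_out)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays: tilt (pitch about x), then pan (yaw about y).
    cy_, sy_ = np.cos(yaw), np.sin(yaw)
    cp_, sp_ = np.cos(pitch), np.sin(pitch)
    R_yaw = np.array([[cy_, 0, sy_], [0, 1, 0], [-sy_, 0, cy_]])
    R_pitch = np.array([[1, 0, 0], [0, cp_, -sp_], [0, sp_, cp_]])
    rays = rays @ (R_yaw @ R_pitch).T

    # Project the rotated rays back into fisheye image coordinates.
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = f_fish * theta
    u = np.clip(cx + r * np.cos(phi), 0, w - 1).astype(int)
    v = np.clip(cy + r * np.sin(phi), 0, h - 1).astype(int)
    return img[v, u]
```

    A real-time ROI switcher would run several such mappings (one per region of interest) on the same 8K×8K frame, so a single camera feeds multiple simultaneous "angles".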

  • Broadcasting from the Deep: Engineering a UHD Imaging System for Extreme Deep Sea Environments

    Tuesday, April 21 | 11:20 – 11:40 a.m. | N256

    Aaron Steiner

    In the pursuit of extending broadcast-quality imaging into environments previously considered unreachable, the MxD SeaCam UHD Imaging System represents a new milestone in extreme-environment video engineering. Developed by DeepSea Power & Light and the Monterey Bay Aquarium Research Institute (MBARI), the MxD SeaCam adapts a Sony HDC-P50 broadcast camera and Canon CJ15ex4.3B UHD Super Wide lens into a titanium-housed, pressure-tolerant platform rated to 7,000 meters of seawater. The system maintains full 12G-SDI and HDR broadcast fidelity through a single-fiber Coarse Wavelength Division Multiplexing (CWDM) telemetry link, carrying video, control, and environmental monitoring over 10,000 meters of optical fiber. Custom multi-element optical correctors and a borosilicate dome port preserve diffraction-limited UHD performance while minimizing distortion, chromatic aberration, and field curvature caused by the air-to-water interface. Precision servo controls provide variable-rate zoom and position-based focus that remain responsive under 10,000 PSI of pressure. This presentation examines the technical architecture and design validation of the MxD SeaCam as a case study in broadcast engineering innovation—from optical performance modeling to mechanical and thermal simulation, and from pressure housing validation to system integration. By leveraging familiar broadcast standards and control protocols, the system demonstrates how studio-grade UHD imaging can operate in the most hostile conditions on Earth without compromise in color accuracy, dynamic range, or operator control. Attendees will gain insight into design methodologies that extend the reach of broadcast technology into scientific, industrial, and documentary production applications—illustrating that the same engineering discipline used to produce live television can now “broadcast from the deep.”
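    The two pressure figures quoted above (7,000 meters of seawater and roughly 10,000 PSI) are consistent, as a back-of-envelope hydrostatic calculation shows; the seawater density below is a typical mean value, not a figure from the abstract.

```python
# Gauge pressure under a water column: P = rho * g * h.
RHO_SEAWATER = 1025.0   # kg/m^3, typical mean seawater density (assumption)
G = 9.81                # m/s^2, standard gravity
PA_PER_PSI = 6894.757   # pascals per PSI

def pressure_at_depth_psi(depth_m: float) -> float:
    """Hydrostatic gauge pressure, in PSI, at the given seawater depth."""
    return RHO_SEAWATER * G * depth_m / PA_PER_PSI

# At the housing's 7,000 m rating, this gives roughly 10,200 PSI,
# matching the "10,000 PSI" figure cited for the servo controls.
p_psi = pressure_at_depth_psi(7000.0)
```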

  • The EBU DMF Media eXchange Layer (MXL) SDK for Multi-Vendor Live Video Production

    Tuesday, April 21 | 11:40 a.m. – noon | N256

    Thomas Edwards

    Top-tier live event production demands best-in-class tools from multiple vendors. To maximize workflow flexibility and server utilization, containerized software on multi-core/multi-GPU servers has become essential. But the industry needs a common mechanism to connect these containerized media processing functions from a customer-curated mix of vendors while maintaining uncompressed video and audio quality. The European Broadcasting Union (EBU) Dynamic Media Facility (DMF) is a layered model for software-defined production infrastructures, allowing users to make independent technology choices. DMF operates on generic IT infrastructure (compute, networking, storage, and timing) and supports dynamic software media function deployment across on-premises, remote, or cloud environments. The DMF Media eXchange Layer (MXL) SDK enables high-performance interchange between software media functions. It provides a common, vendor-neutral platform with clear observability. MXL uses the lowest-latency asynchronous data transfer methods to avoid delays in the live signal chain: shared memory for same-host functions, and remote direct memory access (RDMA) OS-bypass networking for cross-host communication. The Linux Foundation hosts the open source MXL Project in collaboration with the EBU and the North American Broadcasters Association (NABA). The SDK code is available on a public GitHub repository under the open-source Apache-2.0 license. This presentation will describe the essential technology elements of MXL, provide an update on the current status of the open source project, and show how multiple vendors have already demonstrated MXL SDK interoperability at the IBC 2025 trade show.
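    The same-host principle the abstract describes (functions exchanging frames through shared memory rather than serializing them over a socket) can be illustrated with a minimal sketch. This is not the MXL SDK API; it uses Python's standard shared-memory facility, and the segment name and frame geometry are arbitrary choices for the example.

```python
import numpy as np
from multiprocessing import shared_memory

FRAME_SHAPE = (1080, 1920, 3)   # illustrative UHD-ish frame geometry (assumption)
FRAME_DTYPE = np.uint8

def writer_create(name: str) -> shared_memory.SharedMemory:
    """Producer side: allocate a shared segment sized for one frame."""
    nbytes = int(np.prod(FRAME_SHAPE)) * np.dtype(FRAME_DTYPE).itemsize
    return shared_memory.SharedMemory(name=name, create=True, size=nbytes)

def write_frame(shm: shared_memory.SharedMemory, frame: np.ndarray) -> None:
    """Copy a frame into the shared segment; no serialization involved."""
    dst = np.ndarray(FRAME_SHAPE, dtype=FRAME_DTYPE, buffer=shm.buf)
    dst[:] = frame

def read_frame(name: str) -> np.ndarray:
    """Consumer side: attach to the segment and view the frame in place."""
    shm = shared_memory.SharedMemory(name=name)
    view = np.ndarray(FRAME_SHAPE, dtype=FRAME_DTYPE, buffer=shm.buf)
    frame = view.copy()     # copy out so the segment can be released
    shm.close()
    return frame
```

    A production-grade exchange layer adds what this sketch omits: a ring of buffers instead of one, reader/writer synchronization, timing metadata, and RDMA transport for the cross-host case.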
