Technical Sessions - 12
Date & Time
Thursday, November 18, 2021, 1:45 PM - 2:45 PM
Hojatollah Yeganeh, Michael D. Smith

Understanding Banding - Perceptual Modelling and Machine Learning Approaches For Banding Detection

Speaker: Hojatollah Yeganeh

  • Banding is an annoying visual artifact that frequently appears at various stages along the chain of video acquisition, production, distribution and display. With the thriving popularity of ultra-high definition, high dynamic range, wide color gamut content, and the increasing user expectations that follow, banding has attracted growing attention for its strong negative impact on viewer experience in visual content that could otherwise have nearly perfect quality. Here we present two promising types of technologies for banding detection. The first is knowledge-driven, built upon computational models that account for the characteristics of the human visual system, the content acquisition, production, distribution and display processes, and the interplay between them. The second is data-driven, based on machine learning methods that train deep neural networks on large-scale datasets.
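The abstract does not specify the detectors themselves, but the knowledge-driven idea can be illustrated with a minimal, hypothetical heuristic: banding shows up as long flat runs of pixels separated by steps of only one or two code values. The function name, thresholds, and synthetic test frame below are all illustrative assumptions, not the speakers' method.

```python
import numpy as np

def detect_banding_rows(frame, max_step=2, flat_run=16):
    """Flag banding-like edges: a step of at most `max_step` code values
    preceded by a run of at least `flat_run` identical pixels.
    (Illustrative heuristic only; thresholds are assumptions.)"""
    diff = np.diff(frame.astype(np.int32), axis=1)
    hits = []
    for y in range(frame.shape[0]):
        run = 0
        for x in range(diff.shape[1]):
            if diff[y, x] == 0:
                run += 1                       # extend the flat run
            elif abs(diff[y, x]) <= max_step and run >= flat_run:
                hits.append((y, x))            # small step after long flat run
                run = 0
            else:
                run = 0                        # large step: a real edge, reset
    return hits

# Synthetic quantized gradient: eight flat bands, each 32 pixels wide and
# one code value brighter than the last -- a classic banding pattern.
band = np.repeat(np.arange(8, dtype=np.uint8), 32)[None, :]
frame = np.repeat(band, 4, axis=0)
print(len(detect_banding_rows(frame)))  # each row contributes band-edge hits
```

A data-driven detector would instead learn such patterns from labeled examples, but this sketch shows the kind of structural cue a perceptual model can encode explicitly.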

Hue-Preserving Color Transforms for LED Wall Virtual Production Workflows

Speaker: Michael D. Smith

  • Virtual Production using LED wall display technology is gaining popularity in the entertainment industry to produce motion pictures, episodic television, live broadcast and esports content. This new paradigm typically uses one or more large LED displays, containing millions of individual Light Emitting Diodes (LEDs), to display a virtual background and/or foreground objects from a virtual scene that is simultaneously captured by the digital photographic camera on set, resulting in so-called "in-camera visual effects". Real-time video game rendering engines update the image shown on the LED wall to compensate for changes in camera location, camera pose and focal length. Modern digital camera workflows typically include color transforms that were not designed to accurately render the large areas of saturated colors that are possible when capturing the image shown on an LED wall. Examples of hue shifts that can occur in typical workflows include blue to cyan, red to pink, red to orange and green to yellow. We found these hue shifts can occur dynamically while racking focus to and from the LED wall, and also statically when the LED wall is kept out of focus, a common technique used to minimize moiré artifacts. This paper explores a simple modification of these common color transforms that preserves the hue of the scene while also creating a look similar to that of the existing color transforms.
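The contrast between the two behaviors can be sketched in a few lines. Applying a tone curve to each channel independently compresses the channel ratios of saturated colors (the hue-shift problem described above), whereas applying the curve to the per-pixel maximum and scaling the remaining channels by the same factor leaves the R:G:B ratios, and hence the hue, unchanged. The power-function curve below is a simple stand-in for a real camera/display transform, and this is a generic illustration of the idea, not the paper's specific method.

```python
import numpy as np

def tone_curve(x):
    # Hypothetical stand-in for a real display rendering transform.
    return np.power(np.clip(x, 0.0, 1.0), 1 / 2.2)

def per_channel(rgb):
    # Conventional approach: curve applied to R, G and B independently.
    return tone_curve(rgb)

def hue_preserving(rgb):
    # Hue-preserving variant: apply the curve to the per-pixel maximum
    # and scale every channel by the same ratio, so R:G:B is unchanged.
    m = rgb.max(axis=-1, keepdims=True)
    scale = np.where(m > 0, tone_curve(m) / np.where(m > 0, m, 1.0), 0.0)
    return rgb * scale

saturated_red = np.array([0.9, 0.1, 0.05])
print(per_channel(saturated_red))    # ratios compressed: red drifts toward pink
print(hue_preserving(saturated_red)) # same R:G:B ratios as the input
```

Because only the overall scale of each pixel changes, the hue-preserving path cannot push a saturated red toward pink or a saturated blue toward cyan; matching the look of the existing transform then becomes a question of how the scale factor is shaped.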