At some point, every content creator is confronted with the problem of inconsistent renderings of their images. Content looks one way in the color suite, and a thousand different ways out in the wild, spanning phones, computers, tablets, TVs, and other screens in every conceivable viewing environment. In those moments, it feels like we've got no control over viewers' experience of our work. Is it pointless to pretend otherwise? Is there anything we can do to remedy this problem? While we lack control over a number of critical factors in the visual experience of our work, there are several other factors we can control, or at least influence.

In today's article, we're going to tackle this issue head-on, starting with understanding what causes images to translate poorly in the first place. Armed with this, we'll discuss practical strategies we can employ for more consistent results across various screens and viewing environments. We'll also touch on best practices for when you don't have access to a proper grading suite, as well as the impact of HDR in this area.

Before we go any further, we need to define our goal. It would be great if we could make every image look identical across every possible device and environment, but, for reasons we're going to discuss, this is simply not possible. So what should our goal be? As a colorist, my aim is always to give the greatest number of viewers the most faithful reproduction possible of the creator's visual intent. Without understanding the causes of our problems, we're unlikely to solve them. So let's start by looking at the key factors that cause images to translate poorly from the grading suite to the end viewer. Let's dive in!

Why images translate poorly

One of the key concepts we'll be returning to throughout this article is that all content is mastered to a specific target gamut and gamma. In simplest terms, gamut describes a finite range of possible colors, and gamma describes a finite range of luminance and contrast. These properties can describe the capabilities of a display, a capture device, or even an abstract intermediate container such as ACES. Common display gamut/gamma pairs include Rec.709 Gamma 2.4, sRGB Gamma 2.2, and P3 Gamma 2.6. Common capture gamut/gamma pairs include ARRI Wide Gamut LogC, Sony S-Gamut S-Log3, and REDWideGamutRGB Log3G10.

This concept leads to the first cause of poor image translation across devices: a mismatch between the mastering gamut/gamma pair of a piece of content and the native gamut/gamma pair of the display it's being viewed on. For example, if I grade a film on a P3 Gamma 2.6 theatrical projector, it will not appear correctly on a Rec. 709 Gamma 2.4 display. The combinations here are endless, but suffice it to say that any mismatch between mastering gamut/gamma and display gamut/gamma is a problem.

The second cause of poor translation is the viewing environment. This one may sound obvious, but that's also what makes it so easy to forget: a piece of content viewed outside on a sunny day will look much dimmer than on that same device in a darkened room. This concept applies not only to the overall amount of ambient light, but also to the color of ambient light. The same piece of content viewed on the same device in a daylit room versus a tungsten-lit room will appear respectively warmer or cooler, because of the adaptive nature of our vision: our eyes are forever white-balancing to our environment.

So what are the ideal viewing conditions? In a perfect world, we'd have a fixed amount of artificial light, no brighter than our display, whose color temperature matches the white point of that display. In other words: a color grading suite with controlled artificial light, where the lighting is dimmer than the mastering display and matches its white point in color. The further from this ideal we get, the greater difference we'll perceive from what was seen in the color suite.
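The gamma half of that mismatch is easy to see with a little arithmetic. The sketch below (my own illustration, not from the article) uses a simple power-law display model: the same encoded code value decodes to a different luminance on a Gamma 2.6 projector than on a Gamma 2.4 television, and a one-line re-encode compensates. Real displays are more involved than a pure power law, and this deliberately ignores gamut, black level, and white point.

```python
# Why a gamma mismatch shifts perceived brightness: a pure power-law
# display decodes an encoded code value v as luminance L = v ** gamma.
# (Simplified model -- ignores gamut, black level, and white point.)

def decoded_luminance(code_value: float, display_gamma: float) -> float:
    """Relative luminance a pure power-law display produces from a code value."""
    return code_value ** display_gamma

v = 0.5                                   # a mid-tone code value, normalized 0..1
mastered = decoded_luminance(v, 2.6)      # what the colorist saw on the projector
shown = decoded_luminance(v, 2.4)         # what a Gamma 2.4 display produces

# The Gamma 2.4 display renders this value brighter than intended,
# so the whole image appears lifted and lower in contrast.
print(f"intended: {mastered:.4f}  shown: {shown:.4f}")

def reencode(code_value: float, src_gamma: float, dst_gamma: float) -> float:
    """Re-encode a signal mastered at src_gamma for a dst_gamma display."""
    return code_value ** (src_gamma / dst_gamma)

# After re-encoding, the Gamma 2.4 display reproduces the mastered luminance.
v_fixed = reencode(v, 2.6, 2.4)
assert abs(decoded_luminance(v_fixed, 2.4) - mastered) < 1e-9
```

This is exactly the kind of 1D correction a proper gamut/gamma conversion performs on the tone axis; the gamut side additionally requires a 3×3 matrix (or LUT) between color primaries.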