Apple introduced a TV calibration feature with the new Apple TV 4K in April. The color balance option uses the front-facing sensors on a Face ID-equipped iPhone to optimize the output of Apple's streaming box (including the 2017 model). According to Apple, viewers will see “much more accurate colors and improved contrast” after calibration. But the analysts at HDTVTest believe the feature doesn't always deliver on its promises. And they know a thing or two about visual quality, having poked holes in The Mandalorian's true HDR claims.
In a new video, HDTVTest's Vincent Teoh armed himself with an older Apple TV 4K box and an iPhone 12 Pro to apply Apple's calibration across a trio of 55-inch TV sets: an LG OLED, a Samsung QLED and a Sony Bravia LED LCD. The AV buff also ran side-by-side comparisons on a Sony LCD mastering monitor with reference-class color accuracy.
Across all three TVs, Apple's color-balanced result appeared bluer than the original output, Teoh said. On the Sony LED LCD display, which was set to its most accurate out-of-the-box custom preset, Apple's calibration gravitated towards a bluer white point than the D65 standard used within the broadcast industry. In fact, it caused the color accuracy to deteriorate, with an increase in delta errors, Teoh noted.
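For readers unfamiliar with delta errors: ΔE expresses the perceptual distance between a measured color and its reference, with values above roughly 3 generally considered visible. A minimal sketch of the classic CIE76 calculation, using hypothetical measured chromaticity values (not Teoh's actual readings), shows how a blue-shifted white point raises ΔE against D65:

```python
import math

# Reference white D65 in CIE xy chromaticity coordinates
D65_XY = (0.3127, 0.3290)

def xy_to_xyz(x, y, Y=1.0):
    """Convert xy chromaticity (plus luminance Y) to XYZ tristimulus values."""
    return x * Y / y, Y, (1 - x - y) * Y / y

def xyz_to_lab(X, Y, Z, white=xy_to_xyz(*D65_XY)):
    """Convert XYZ to CIELAB relative to a reference white."""
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    Xn, Yn, Zn = white
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in Lab space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measured white point, slightly bluer than D65
measured = xy_to_xyz(0.305, 0.320)
reference = xy_to_xyz(*D65_XY)
dE = delta_e76(xyz_to_lab(*measured), xyz_to_lab(*reference))
print(f"Delta E vs D65: {dE:.1f}")  # prints "Delta E vs D65: 5.0"
```

Professional calibration software typically uses the more elaborate CIEDE2000 formula, but the principle is the same: the further the white point drifts from D65, the larger the error.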
On the Samsung QLED TV in Filmmaker Mode, Apple's balance also resulted in a bluer image, with Teoh's objective measurements confirming the blue shift. And on the LG OLED TV, set to its Technicolor Expert picture mode, the calibration yielded lower grayscale errors, contributing to improved color accuracy with reduced delta error figures. But it was still nowhere near the results of a properly conducted calibration using specialized tools and software, according to Teoh.
Another shortcoming is the feature's inability to profile between different display technologies. This doesn't bode well for the broader range of display tech on the market, according to Teoh: think LED LCDs with a traditional or PFS phosphor backlight, QLED TVs with quantum dot enhancement film and WRGB OLED TVs. Certain 2021 OLEDs, meanwhile, also have a new green-emitting layer.
As Teoh states in the clip, all of these display techs have different spectral power distributions (SPDs). So, to achieve an accurate result in terms of luminance and color measurement, a colorimeter needs to be profiled against a spectroradiometer. However, the analyst notes that an iPhone likely won't be profiled to a spectroradiometer, which may explain why color balance produced different results between OLED and LED LCD displays.
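Concretely, profiling means measuring the same set of patches on a given display with both instruments, then deriving a correction that maps the colorimeter's readings onto the spectroradiometer's reference values. A toy sketch of the standard matrix-correction approach, with entirely made-up readings (not real instrument data):

```python
import numpy as np

# Hypothetical raw XYZ readings from a colorimeter for R, G, B patches
colorimeter = np.array([
    [0.42, 0.22, 0.02],   # red patch
    [0.36, 0.70, 0.10],   # green patch
    [0.19, 0.08, 0.95],   # blue patch
])

# Reference XYZ for the same patches from a spectroradiometer
reference = np.array([
    [0.41, 0.21, 0.02],
    [0.35, 0.72, 0.11],
    [0.18, 0.07, 0.97],
])

# Least-squares solve for a 3x3 matrix M such that colorimeter @ M ≈ reference.
# This matrix is only valid for displays sharing this panel's SPD, which is
# why one profile can't cover OLED and LED LCD at once.
M, *_ = np.linalg.lstsq(colorimeter, reference, rcond=None)

def corrected(raw_xyz):
    """Apply the per-display-type correction to a raw colorimeter reading."""
    return raw_xyz @ M
```

Because the matrix bakes in one panel's spectral characteristics, reusing it on a display with a different SPD reintroduces exactly the kind of error Teoh measured.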
Giving Apple the benefit of the doubt, Teoh claims it may be possible for Apple to "identify the TV through the EDID" and apply the necessary EDR offset based on the known spectral response of the display tech. But that depends on whether the TV manufacturer is providing the correct info and whether Apple is following that process.
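For context, a TV already reports who made it through its EDID: bytes 8–9 pack a three-letter PNP vendor code as three 5-bit values. A minimal decoder (the sample bytes below are illustrative, using LG's "GSM" vendor code):

```python
def edid_manufacturer(edid: bytes) -> str:
    """Decode the 3-letter PNP manufacturer ID from EDID bytes 8-9.

    The two bytes form a big-endian 16-bit word holding three 5-bit
    letters, each offset so that 1 maps to 'A'.
    """
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))

# Illustrative EDID fragment: the standard 8-byte header followed by
# 0x1E 0x6D, which encodes "GSM" (LG's PNP vendor ID)
sample = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x1E, 0x6D])
print(edid_manufacturer(sample))  # prints "GSM"
```

Knowing the manufacturer and model is the easy part; Teoh's point is that Apple would still need a reliable mapping from that ID to the panel's actual spectral response.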
Teoh also admits that Apple's color balance could work well for less accurate TV presets. Yet even on those presets, Apple's calibration apparently introduced posterization into the picture. This is when low color bit depth results in defined “stepping” from one color gradation to the next instead of a smooth, continuous gradient. Finally, Teoh emphasized that Apple's calibration feature only works for the Apple TV's output rather than all of the TV's sources.
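Posterization is easy to reproduce in code: quantizing a smooth gradient to fewer bits collapses its many levels into a handful of visible bands. A small illustration:

```python
# Posterization demo: quantizing a smooth 8-bit ramp to 3 bits of
# precision collapses 256 levels into just 8 visible "steps".
levels = 2 ** 3               # 3-bit depth
ramp = list(range(256))       # smooth 8-bit gradient, 0..255
step = 256 // levels
posterized = [(v // step) * step for v in ramp]
print(sorted(set(posterized)))  # prints [0, 32, 64, 96, 128, 160, 192, 224]
```

On screen, each of those eight surviving values renders as a flat band with an abrupt edge against its neighbor, which is exactly the stepping Teoh observed in gradients.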
The analyst concluded that the feature can't match a proper calibration, as that requires overriding a range of TV settings, from the video black level to superfluous edge enhancement and noise reduction to motion interpolation. In a nutshell, simply selecting the most accurate out-of-box picture preset is "more important than running the color balance procedure," Teoh said.