
Anyone else having an issue with Resolve 3D LUT Creation? #448

Closed
decayedxistence opened this issue Oct 22, 2024 · 7 comments

@decayedxistence

decayedxistence commented Oct 22, 2024

With the latest DisplayCAL and installer, I'm having trouble creating a proper 3D LUT, and I'm unsure if it's me or a bug somewhere. I don't remember it being this complicated or having this many issues.

MacBook > UltraStudio Monitor 3G > BenQ monitor

Video levels in Master Project Settings in Resolve
Video 3D LUT preset in DisplayCAL (tried both BT.1886 absolute with black output offset at 0, and gamma 2.4 relative)

I run the calibration and import the LUT, and everything is washed out on my monitor, which makes me think it's a levels mismatch.
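As an aside for anyone reading along: the washed-out look is the classic levels-mismatch signature. Here's a minimal sketch of the scaling involved (illustrative code, not anything DisplayCAL or Resolve actually runs):

```python
# Minimal sketch of the 8-bit full-to-video (legal) range scaling; the
# function name is illustrative, not a DisplayCAL or Resolve API call.

def full_to_video(code, bits=8):
    """Map a full-range code value (0..2^bits-1) into video range (16..235 at 8 bits)."""
    lo, hi = 16 << (bits - 8), 235 << (bits - 8)
    full_max = (1 << bits) - 1
    return lo + code * (hi - lo) / full_max

# If video-range data reaches a display that expects full range
# (a levels mismatch), black is lifted and white is dimmed:
print(full_to_video(0))    # 16.0 -> black displays as dark grey ("washed out")
print(full_to_video(255))  # 235.0 -> white is dimmed
```

The same arithmetic applied once too often (or not at all) in the chain is what produces lifted greys or crushed blacks.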

If I try an old Calibration from a few months ago, it looks fine.

I can provide more info if needed and questions arise.

My external monitor is admittedly not great, but it has excellent contrast. I'm also wondering if the problem is that I have no way to change the RGB input data levels on the monitor, as that setting is always greyed out. Does that feature not work on Mac? Is there a way to manually set the monitor levels on Mac?

It should be video levels in Resolve, video on the monitor, auto on the first page of DisplayCAL, and full on the calibration page, correct?

I just tried a few more variations in calibrating, and I'm winding up with pixelated, blocky blacks.

Versions (please complete the following information):

  • OS: macOS Sonoma
  • Python Version: 3.13
  • ArgyllCMS Version: 3.3
  • DisplayCAL Version: 3.9.14
@Adam-Color

Hello, I had this exact same issue, so I wanted to comment here to see if you found a solution different from mine. Here's my bug report; let me know if you have experienced something different:

#465

Basically, the settings are incorrectly reset to defaults during profiling, so the automatically created LUT always has a mismatched gamma curve, producing exactly the artifacts you are describing. To work around this, create the LUT with the correct settings AFTER profiling; don't use the "automatically create 3D LUT after profiling" option.

@decayedxistence
Author

decayedxistence commented Dec 25, 2024

So, the solution I found, for my use case, was to set the levels to full in Resolve, auto on the main DisplayCAL page, and TV range on the profiling end.

Since I can't change the computer monitor levels, I purposely mismatched the levels as stated above.

After verification, I get some really incredible results: no errors anywhere in the grayscale or blacks, and a great RGB lineup. I think I ran around a 300-patch sequence.

@Adam-Color

> So, the solution I found, for my use case, was to set the levels to full in Resolve, auto on the main DisplayCAL page, and TV range on the profiling end.
>
> Since I can't change the computer monitor levels, I purposely mismatched the levels as stated above.
>
> After verification, I get some really incredible results: no errors anywhere in the grayscale or blacks, and a great RGB lineup. I think I ran around a 300-patch sequence.

Got it, thanks. It seems like you perhaps temporarily ran into the new issue I found (that's what can cause the distorted blacks) while trying to fix this. Your fix actually did work for me, but I was still able to get an even better calibration by creating the 3D LUT only after the profile is done, as I explained previously, so I would be curious what results you would get if you did this.

I have confirmed in the source code that DisplayCAL produces the wrong gamma curve when "create 3D LUT after profiling" is checked, so there's a chance you might be trying to tighten a screw with a mallet here. Plus, if you ever need to work with video levels (such as for H.264 delivery), your calibration won't work.

Setting full levels in DisplayCAL doesn't actually do anything; it's only setting video levels that causes a conversion to happen. This means that auto levels is effectively always full levels, which is what I would recommend. I would advise against mismatching levels unless you are 100% certain that the gamma bug is not altering your calibration LUT and causing you to think there's a video/full issue when there isn't one. I only say this because I made the same error, and it took me months of trial and error (and looking through the source code) to figure it out.

@decayedxistence
Author

decayedxistence commented Dec 25, 2024

I'm curious as well if the results would be even better with your advice.

Just so I'm understanding correctly: you're saying to check video levels in Resolve, leave levels as auto on the main DisplayCAL page, and leave the levels as full on the profiling page.

Then UNcheck, "create 3D LUT after profiling", run a calibration, THEN create the 3D LUT?

Or should I keep my settings as I had them (since that produced great results before) and just create the 3D LUT AFTER calibration instead of checking the box to automatically create it? I've been using BT.1886 instead of changing the setting to 2.4 relative since my monitor has high contrast.

@Adam-Color

Adam-Color commented Dec 26, 2024

> I'm curious as well if the results would be even better with your advice.
>
> Just so I'm understanding correctly: you're saying to check video levels in Resolve, leave levels as auto on the main DisplayCAL page, and leave the levels as full on the profiling page.
>
> Then UNcheck "create 3D LUT after profiling", run a calibration, THEN create the 3D LUT?
>
> Or should I keep my settings as I had them (since that produced great results before) and just create the 3D LUT AFTER calibration instead of checking the box to automatically create it? I've been using BT.1886 instead of changing the setting to 2.4 relative since my monitor has high contrast.

Your display needs to be capable of 0-nit blacks (generally speaking, OLED or full-array local dimming, not standard LCD) to comply with the BT.1886 tone response curve; otherwise you should be using a pure gamma 2.4 curve with 100% black output offset. You should set Resolve to video data levels, as that is the standard unless you are working with a VFX team or something that specifically needs a "full" pipeline all the way through.
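For context: BT.1886 is fully determined by the display's measured white and black luminance, and with a true 0-nit black it reduces exactly to pure gamma 2.4, which is why this choice matters on an LCD. A small sketch with illustrative numbers:

```python
# ITU-R BT.1886 EOTF sketch (illustrative values, not measurements).

def bt1886(v, lw=100.0, lb=0.0, gamma=2.4):
    """Map a normalized signal v (0..1) to luminance in cd/m2 per BT.1886."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# Display with a true 0-nit black: identical to pure gamma 2.4
print(bt1886(0.5, lw=100, lb=0.0))   # == 100 * 0.5 ** 2.4

# LCD with a real black level: black sits at Lb, so shadows are lifted
print(bt1886(0.0, lw=100, lb=0.1))   # ~0.1 (the black level itself)
```

The `a` and `b` terms are the standard's offset and gain derived from the measured white (`lw`) and black (`lb`) levels; with `lb = 0` the offset vanishes and the curve collapses to `lw * v ** 2.4`.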

DisplayCAL, however, should have everything set to full levels. This tells it to do no level conversion in the LUT, so, counter-intuitively, this gives you the LUT needed for video levels.

Also, I re-read your original comment, and it seems like something happened with your monitor configuration. You should be able to change RGB values on any OS, as that change is handled by a chip in the monitor, not by your computer. Best practice here would be to find a "reset configuration" (or similar) option in the monitor's menu and reset everything; hopefully that will unlock these settings.

Also, regarding monitors: BenQ has modes for different content, such as CAD/CAM, Web, and Rec.709. You want to avoid these and use the standard or default option. Calibration is a subtractive process, so using these modes in conjunction with a calibration LUT will end up shrinking your monitor's color space more than necessary.

Make sure vcgt is not selected, and that black point compensation is off during profiling. On the calibration page, make sure black level and tone curve are set to "as measured" (using these controls caused issues for me).

Make sure you are using the right correction LUT for your monitor and colorimeter. You are correct that 3D LUT creation should be done not only separately, but also after profile creation. This way, when DisplayCAL resets your 3D LUT configuration during profiling (the bug I'm trying to fix), it doesn't affect the LUT you end up creating. (I can't tell you how many hours this took me to figure out; I only noticed by reading the terminal output line by line.)
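One quick way to sanity-check the LUT you end up with is to look at its first data row, which is the LUT's output for input black. A rough sketch (this is not a DisplayCAL feature, and the filename at the end is hypothetical):

```python
# Diagnostic sketch: read the first data row of a .cube 3D LUT,
# i.e. the LUT's output for input black.

def cube_first_entry(path):
    """Return the first RGB triplet in a .cube file."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and keyword lines (TITLE, LUT_3D_SIZE, DOMAIN_*)
            if not line or line.startswith("#") or line[0].isalpha():
                continue
            r, g, b = (float(x) for x in line.split()[:3])
            return r, g, b

# A LUT whose first entry is ~0.0627 (16/255) is baking in a full-to-video
# conversion; one whose first entry is 0.0 passes levels straight through.
# The filename below is hypothetical:
# print(cube_first_entry("Resolve_3DLUT.cube"))
```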

It's important to note that, assuming you are experiencing the same bug I did, you will have to set the 3D LUT settings twice: both before and after profiling. If this isn't the case, please let me know.

For verification, make sure you set the simulation profile to ITU-R Rec.709 and match your 3D LUT settings in the options that appear. Do not use a device link profile.

Let me know if you have any questions.

@decayedxistence
Author

decayedxistence commented Dec 26, 2024

> > I'm curious as well if the results would be even better with your advice.
> >
> > Just so I'm understanding correctly: you're saying to check video levels in Resolve, leave levels as auto on the main DisplayCAL page, and leave the levels as full on the profiling page.
> >
> > Then UNcheck "create 3D LUT after profiling", run a calibration, THEN create the 3D LUT?
> >
> > Or should I keep my settings as I had them (since that produced great results before) and just create the 3D LUT AFTER calibration instead of checking the box to automatically create it? I've been using BT.1886 instead of changing the setting to 2.4 relative since my monitor has high contrast.
>
> Your display needs to be capable of 0-nit blacks (generally speaking, OLED or full-array local dimming, not standard LCD) to comply with the BT.1886 tone response curve; otherwise you should be using a pure gamma 2.4 curve with 100% black output offset. You should set Resolve to video data levels, as that is the standard unless you are working with a VFX team or something that specifically needs a "full" pipeline all the way through.
>
> DisplayCAL, however, should have everything set to full levels. This tells it to do no level conversion in the LUT, so, counter-intuitively, this gives you the LUT needed for video levels.
>
> Also, I re-read your original comment, and it seems like something happened with your monitor configuration. You should be able to change RGB values on any OS, as that change is handled by a chip in the monitor, not by your computer. Best practice here would be to find a "reset configuration" (or similar) option in the monitor's menu and reset everything; hopefully that will unlock these settings.
>
> Also, regarding monitors: BenQ has modes for different content, such as CAD/CAM, Web, and Rec.709. You want to avoid these and use the standard or default option. Calibration is a subtractive process, so using these modes in conjunction with a calibration LUT will end up shrinking your monitor's color space more than necessary.
>
> Make sure vcgt is not selected, and that black point compensation is off during profiling. On the calibration page, make sure black level and tone curve are set to "as measured" (using these controls caused issues for me).
>
> Make sure you are using the right correction LUT for your monitor and colorimeter. You are correct that 3D LUT creation should be done not only separately, but also after profile creation. This way, when DisplayCAL resets your 3D LUT configuration during profiling (the bug I'm trying to fix), it doesn't affect the LUT you end up creating.
>
> It's important to note that, assuming you are experiencing the same bug I did, you will have to set the 3D LUT settings twice: both before and after profiling. If this isn't the case, please let me know.
>
> For verification, make sure you set the simulation profile to ITU-R Rec.709 and match your 3D LUT settings in the options that appear. Do not use a device link profile.
>
> Let me know if you have any questions.

My BenQ has 3000:1 static contrast, and when I've tried profiling with gamma 2.4 / 100% black output offset, it never calibrates well and has tons of errors. Now, that could be because it measured for a pure 2.4 but DisplayCAL reset the tone curve back to BT.1886 (the bug you mentioned), so the blacks become super lifted (images look washed out). When I used the method mentioned in a previous post, it wasn't washed out and my verification report was good, so I stuck with that.
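For what it's worth, rough numbers make this symptom plausible: on a 3000:1 panel, a BT.1886 curve lifts deep shadows to roughly twice what pure gamma 2.4 would put there, so a curve mismatch is very visible in the blacks. A sketch with assumed values (100 cd/m2 white), not measurements:

```python
# Back-of-the-envelope numbers for a 3000:1 panel (assumed white of
# 100 cd/m2, so black is about 0.033 cd/m2): how far BT.1886 lifts a
# deep shadow compared to pure gamma 2.4. Illustrative values only.

gamma = 2.4
lw = 100.0                     # assumed white luminance, cd/m2
lb = lw / 3000.0               # black level implied by 3000:1 static contrast

a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))

v = 0.1                        # a deep-shadow signal level
bt1886_lum = a * (v + b) ** gamma
pure24_lum = lw * v ** gamma   # pure 2.4, no black offset

print(bt1886_lum)              # roughly 0.78 cd/m2
print(pure24_lum)              # roughly 0.40 cd/m2 -- about half as bright
```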

The monitor is reset to factory settings, with only RGB adjustments for the 1D LUT. I'm on macOS and still can't find any way to adjust video/full levels on the monitor itself. From the research I did, it basically says I can't, and that the Mac automatically outputs full levels.

Also, I'm unsure if this is correct or not, but since the 3D LUT isn't getting uploaded to the monitor itself but rather loaded as a 3D LUT inside Resolve, vcgt SHOULD be checked so that it's included.

The correction profile is indeed correct.

The verification profile settings are correct.

I'll run another calibration tonight, as I didn't have time last night, and will see how creating the 3D LUT after profiling (instead of automatically) works. One step at a time as a process of elimination before changing everything at once, ya know.

Thanks for the help and will keep you posted!

@decayedxistence
Author

I can't even get DisplayCAL to run currently; I keep getting "broken pipe" messages.
