feat: bitdepth option for avif/heif #4036
Conversation
Thank you for the PR. The prebuilt binaries only include support for a bitdepth of 8, so you won't be able to add a unit test for this path, but you can check valid values are accepted. It looks like lots of the tests are failing as the new …
Oh, right! I forgot the default build only supports 8-bit output.
- add validation unit tests
- add ushort unit test
- add test for valid bitdepth, make test formatting consistent
- consistent test wording
- add setting to constructor
- remove ushort test
- fix unit tests
- bitdepth is not a boolean
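For reference, the kind of validation test these commits describe might look roughly like the following (a sketch only; the actual test layout and helpers in the repository will differ):

```js
const assert = require('assert');
const sharp = require('sharp');

// The default of 8 is always accepted.
assert.doesNotThrow(() => sharp().avif({ bitdepth: 8 }));

// 10 and 12 are valid values, but (per the guard discussed below) they
// are only usable when libvips was built with high bitdepth HEIF support,
// so acceptance of this branch depends on the build in use.
// for (const bitdepth of [10, 12]) {
//   assert.doesNotThrow(() => sharp().avif({ bitdepth }));
// }

// Anything outside 8/10/12 should be rejected during parameter validation.
for (const bitdepth of [0, 9, 16, 'high', true]) {
  assert.throws(() => sharp().avif({ bitdepth }));
}
```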
I got the tests working locally after adding …
Thanks for the updates, all the tests are passing now, this is good to merge.
(I'll also add something to the docs to highlight that the prebuilt binaries only support a bitdepth of 8.)
I added a guard against this property when using the prebuilt binaries via commit aa1bbcb as the failure condition could otherwise lead to a segfault.
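In spirit, such a guard does something like the following (a minimal sketch of the idea only, not the code from commit aa1bbcb; `usingPrebuiltBinaries` is a hypothetical flag standing in for however sharp detects the prebuilt libvips):

```js
// Reject unsupported bitdepths in JavaScript, before the request can
// reach a libheif build without 10/12-bit support and trigger a segfault.
function validateBitdepth (bitdepth, usingPrebuiltBinaries) {
  if (![8, 10, 12].includes(bitdepth)) {
    throw new Error(`Expected 8, 10 or 12 for bitdepth but received ${bitdepth}`);
  }
  if (bitdepth !== 8 && usingPrebuiltBinaries) {
    throw new Error('The prebuilt binaries only support a bitdepth of 8');
  }
  return bitdepth;
}
```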
Yeah, that makes sense.
Just a question, is there a harm if we pass …
Following #4031, this PR exposes a `bitdepth` setting to allow outputting 10-bit or 12-bit AVIF/HEIF files. The default remains 8, so this doesn't change the current behavior. Fixes #4031
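A usage sketch of the new setting (assuming a globally-installed libvips whose libheif was built with 10/12-bit support; with the prebuilt binaries only the default of 8 is available):

```js
import sharp from 'sharp';

// 10-bit AVIF output; only works when the underlying libheif build
// supports high bitdepths, otherwise bitdepth must stay at 8.
await sharp('input.png')
  .avif({ bitdepth: 10 })
  .toFile('output.avif');
```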