
Commit 737298b

lavc/vaapi_decode: add missing flag when picking best pixel format
vaapi_decode_find_best_format currently does not set the VA_SURFACE_ATTRIB_SETTABLE flag on the pixel format attribute that it returns. Without this flag, the attribute will be ignored by vaCreateSurfaces, meaning that the driver's default logic for picking a pixel format will kick in.

So far, this hasn't produced visible problems, but when trying to decode 4:4:4 content, at least on Intel, the driver will pick the 444P planar format, even though the decoder can only return the AYUV packed format.

The hwcontext_vaapi code that sets surface attributes when picking formats does not have this bug. Applications may use their own logic for finding the best format, and so may not hit this bug; e.g. mpv is unaffected.
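For context, a minimal sketch of how such an attribute reaches the driver (this is not part of the patch; the helper name, surface size, and surface count are illustrative assumptions). Unless .flags contains VA_SURFACE_ATTRIB_SETTABLE, the entry passed to vaCreateSurfaces is treated as informational only and the driver's default format selection applies:

/* Sketch only, not from this commit: request AYUV surfaces with an
 * explicitly settable pixel format attribute. The 1920x1080 size is
 * a placeholder. */
#include <va/va.h>

static VAStatus request_ayuv_surfaces(VADisplay display,
                                      VASurfaceID *surfaces,
                                      unsigned int num_surfaces)
{
    VASurfaceAttrib attr = {
        .type          = VASurfaceAttribPixelFormat,
        /* Without VA_SURFACE_ATTRIB_SETTABLE the driver ignores this
         * attribute and falls back to its own default format choice
         * (e.g. 444P instead of AYUV for 4:4:4 content). */
        .flags         = VA_SURFACE_ATTRIB_SETTABLE,
        .value.type    = VAGenericValueTypeInteger,
        .value.value.i = VA_FOURCC_AYUV,
    };

    return vaCreateSurfaces(display, VA_RT_FORMAT_YUV444,
                            1920, 1080,
                            surfaces, num_surfaces,
                            &attr, 1);
}

The patch below follows the same pattern inside vaapi_decode_find_best_format, so the fourcc chosen there is actually honoured when the surfaces are created.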
1 parent 9e029dc · commit 737298b

1 file changed (+2, -0)


libavcodec/vaapi_decode.c

@@ -358,6 +358,8 @@ static int vaapi_decode_find_best_format(AVCodecContext *avctx,
 
     ctx->pixel_format_attribute = (VASurfaceAttrib) {
         .type = VASurfaceAttribPixelFormat,
+        .flags = VA_SURFACE_ATTRIB_SETTABLE,
+        .value.type = VAGenericValueTypeInteger,
         .value.value.i = best_fourcc,
     };
 