diff --git a/index.html b/index.html
index fdcd059..4ff29e6 100644
--- a/index.html
+++ b/index.html
@@ -551,6 +551,71 @@
+Some platforms or User Agents may provide built-in support for human
+face lighting correction, in particular for camera video streams.
+Web applications may want to control, or at least be aware of, whether
+human face lighting correction is applied at the source level.
+This may for instance allow the web application to update its UI or to
+not apply human face lighting correction on its own.
+For that reason, we extend {{MediaStreamTrack}} with the following
+properties.
+The WebIDL changes are the following:
+partial dictionary MediaTrackSupportedConstraints {
+  boolean lightingCorrection = true;
+};
+
+partial dictionary MediaTrackCapabilities {
+  sequence<boolean> lightingCorrection;
+};
+
+partial dictionary MediaTrackConstraintSet {
+  ConstrainBoolean lightingCorrection;
+};
+
+partial dictionary MediaTrackSettings {
+  boolean lightingCorrection;
+};
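As an illustration of how a page might consume these dictionaries (this sketch is not part of the patch and assumes a User Agent implementing the members above), the current behaviour of the source can be read back through getSettings() so the application can decide whether to run its own correction:

// Illustrative sketch only (not part of the patch); assumes the
// lightingCorrection members above are implemented by the UA and camera.
async function shouldApplyPageLightingCorrection() {
  // Is the constraint understood at all by this User Agent?
  const supported =
      navigator.mediaDevices.getSupportedConstraints().lightingCorrection === true;

  const stream = await navigator.mediaDevices.getUserMedia({video: true});
  const [videoTrack] = stream.getVideoTracks();

  // If the source already corrects face lighting, the page can skip its own
  // correction and, for example, hide the corresponding UI toggle.
  const settings = videoTrack.getSettings();
  return !(supported && settings.lightingCorrection === true);
}

Reading the setting rather than the capability reflects what the source is actually doing right now, which is what the UI decision above depends on.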
When the "lightingCorrection" setting is set to true
by
+ the ApplyConstraints algorithm, the UA will attempt to correct
+ human face and background lighting balance so that human faces are
+ not underexposed.
+
When the "lightingCorrection" setting is set to false
by
+ the ApplyConstraints algorithm, the UA will not correct human
+ face and background lighting balance.
+
+<video></video>
+<script type="module">
+// Open camera.
+const stream = await navigator.mediaDevices.getUserMedia({video: true});
+const [videoTrack] = stream.getVideoTracks();
+
+// Try to enable lighting correction.
+const videoCapabilities = videoTrack.getCapabilities();
+if ((videoCapabilities.lightingCorrection || []).includes(true)) {
+  await videoTrack.applyConstraints({lightingCorrection: {exact: true}});
+} else {
+  // Lighting correction is not supported by the platform or by the camera.
+  // Consider falling back to some other method.
+}
+
+// Show the corrected stream to the user.
+const videoElement = document.querySelector("video");
+videoElement.srcObject = stream;
+</script>
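To complement the example above, here is a minimal sketch (illustrative only, not part of the patch) of turning the correction back off on the same track, provided the capability also advertises false:

// Illustrative sketch (not part of the patch): disable lighting correction
// again on the videoTrack from the example above.
async function disableLightingCorrection(videoTrack) {
  const capabilities = videoTrack.getCapabilities();
  if ((capabilities.lightingCorrection || []).includes(false)) {
    await videoTrack.applyConstraints({lightingCorrection: {exact: false}});
  }
  // Per the text above, the UA then stops correcting human face and
  // background lighting balance; report whether that is now the case.
  return videoTrack.getSettings().lightingCorrection === false;
}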