
Ideas on making src element only generate frames when a fresh buffer is created #84

Open
aiden-jeffrey opened this issue May 21, 2024 · 3 comments


@aiden-jeffrey
Contributor

aiden-jeffrey commented May 21, 2024

I'm looking to add vsync-type functionality to the cefsrc element that would only push buffers when a fresh one is painted in the RenderHandler.OnPaint method. Ultimately I want to be able to record WebGL applications that may have a variable frame rate into a constant-frame-rate video. In other words, I want one frame in my mp4 file per requestAnimationFrame in JS land.

Currently it's clear that (barring some initial paints) there is one OnPaint call per animation frame.

I can sort of get there by controlling the duration and PTS in gst_cef_src_create, but I was wondering if you had a better idea. Is the answer something to do with making the element non-live for this vsync use case?

@MathieuDuponchelle
Collaborator

Would a meta to tag "original" video frames, as opposed to copied / made-up ones, be enough for you? You could then discard frames as you see fit. Another option, of course, would be a property that causes the source to output buffers marked as gap buffers; in the demuxer these would then be transformed into straight gap events, with the audio buffers still demuxed as normal.

@aiden-jeffrey
Contributor Author

Mmm, yes I was thinking that my approach might mess up audio. Out of interest, from an architecture perspective, why was the cefmux element required? I.e. why doesn't the cefsrc just expose an audio pad as well? Is it standard to stream audio packets on a video/raw pad and then demux?

@MathieuDuponchelle
Collaborator

MathieuDuponchelle commented May 23, 2024

I.e. why doesn't the cefsrc just expose an audio pad as well? Is it standard to stream audio packets on a video/raw pad and then demux?

No, it is not standard, but a workaround for the fact that GstBaseSrc is designed to expose a single always source pad.

The alternative solution is a wrapper bin, with one source per output stream and a shared context (in this case the CEF browser). But at the time this was implemented, CEF had no support for audio capture, and it was easier to retrofit a demuxer onto the initial implementation :)
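For illustration, the resulting architecture looks something like the pipeline below (a sketch only: element names come from the gstcefsrc project, but the exact pad names and caps should be verified with `gst-inspect-1.0 cefsrc` / `gst-inspect-1.0 cefdemux`; the audio data rides along on the single source pad and cefdemux splits it out):

```shell
# cefsrc exposes one pad (GstBaseSrc limitation); cefdemux recovers
# separate video and audio streams from it.
gst-launch-1.0 cefsrc url="https://example.com" ! cefdemux name=d \
  d. ! video/x-raw ! queue ! videoconvert ! autovideosink \
  d. ! audio/x-raw ! queue ! audioconvert ! autoaudiosink
```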
