Supports async / await pattern for FFmpeg operations.
- Supports the async / await pattern for FFmpeg operations
- Supports Ctrl + C for graceful shutdown
This package supports invoking FFmpeg asynchronously with the async / await pattern by wrapping ffmpeg.run_async() of ffmpeg-python and the subprocess.Popen it returns.
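For reference, a minimal sketch of the synchronous ffmpeg-python call that gets wrapped (illustrative only, not asyncffmpeg code; the file names are placeholders):

```python
import ffmpeg

stream = ffmpeg.input("input.mp4")
stream_spec = ffmpeg.output(stream, "output.mp4", c="copy")
# run_async() starts FFmpeg and returns a subprocess.Popen.
popen = ffmpeg.run_async(stream_spec, pipe_stdin=True)
# Writing "q" to stdin asks FFmpeg to stop gracefully.
popen.communicate(input=b"q")
```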
The async / await syntax makes asynchronous code:
- Simple
- Readable
You can stop the FFmpeg process gracefully with Ctrl + C, which works the same as sending the q key to a running FFmpeg.
This behavior is verified by pytest.
Install with pip:

```
pip install asyncffmpeg
```
The asyncffmpeg.FFmpegCoroutine class has an asynchronous method: execute().
To run FFmpeg concurrently, it requires multiprocessing rather than multithreading, since an FFmpeg process is a CPU-bound operation.
The asynccpu package helps keep the implementation simple.

Example:
```python
import asyncio

import ffmpeg
from asynccpu import ProcessTaskPoolExecutor
from asyncffmpeg import FFmpegCoroutineFactory, StreamSpec


async def create_stream_spec_copy() -> StreamSpec:
    stream = ffmpeg.input("input.mp4")
    return ffmpeg.output(stream, "output1.mp4", c="copy")


async def create_stream_spec_filter() -> StreamSpec:
    stream = ffmpeg.input("input.mp4")
    stream = ffmpeg.filter(stream, "scale", 768, -1)
    return ffmpeg.output(stream, "output2.mp4")


async def main() -> None:
    ffmpeg_coroutine = FFmpegCoroutineFactory.create()
    with ProcessTaskPoolExecutor(max_workers=3, cancel_tasks_when_shutdown=True) as executor:
        # Each task runs FFmpegCoroutine.execute() with a stream spec factory
        # in its own worker process.
        awaitables = (
            executor.create_process_task(ffmpeg_coroutine.execute, create_stream_spec)
            for create_stream_spec in [create_stream_spec_copy, create_stream_spec_filter]
        )
        await asyncio.gather(*awaitables)


if __name__ == "__main__":
    asyncio.run(main())
```
Unfortunately, the high-level APIs of asyncio don't support CPU-bound operations, since they are built on ThreadPoolExecutor rather than ProcessPoolExecutor.
To run CPU-bound operations concurrently with asyncio, we have to use the low-level APIs, which require finer control over the event loop behavior.
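As an illustration of that low-level approach (this sketch is not part of asyncffmpeg; encode() is a hypothetical stand-in for CPU-bound work):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor


def encode(source: str) -> str:
    # Placeholder for CPU-bound work such as driving FFmpeg synchronously.
    return f"encoded {source}"


async def main() -> None:
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # run_in_executor() lets us choose the executor; the high-level helpers
        # always dispatch to a thread pool.
        results = await asyncio.gather(
            loop.run_in_executor(pool, encode, "input1.mp4"),
            loop.run_in_executor(pool, encode, "input2.mp4"),
        )
    print(results)


if __name__ == "__main__":
    asyncio.run(main())
```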
The coroutine argument must be a coroutine function, not a raw coroutine object, since a raw coroutine object is not picklable.
This restriction comes from the Python multiprocessing package (multiprocessing — Process-based parallelism):

> Note: When an object is put on a queue, the object is pickled and a background thread later flushes the pickled data to an underlying pipe.

See: Answer: Python multiprocessing PicklingError: Can't pickle <type 'function'> - Stack Overflow
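A quick demonstration of the picklability difference (illustrative code, not part of asyncffmpeg):

```python
import pickle


async def make_spec() -> str:  # a coroutine function
    return "stream spec"


# Module-level functions are pickled by reference, so this succeeds.
pickle.dumps(make_spec)

coroutine = make_spec()  # a "raw" coroutine object
try:
    pickle.dumps(coroutine)  # raises TypeError: cannot pickle 'coroutine' object
except TypeError as error:
    print(error)
finally:
    coroutine.close()  # avoid the "never awaited" warning
```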
```python
class FFmpegCoroutineFactory:
    @staticmethod
    def create(
        *,
        time_to_force_termination: int = 8
    ) -> FFmpegCoroutine:
```
- time_to_force_termination: The time limit in seconds to wait for the FFmpeg process to stop gracefully after Ctrl + C is sent.
  First, the subprocess tries to send the q key to the FFmpeg process.
  If the FFmpeg process does not stop gracefully within the time limit, the subprocess terminates it.
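A usage sketch based on the signature above (the value 15 is arbitrary):

```python
from asyncffmpeg import FFmpegCoroutineFactory

# Wait up to 15 seconds for a graceful stop before forcing termination.
ffmpeg_coroutine = FFmpegCoroutineFactory.create(time_to_force_termination=15)
```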
```python
class FFmpegCoroutine:
    async def execute(
        self,
        create_stream_spec: Callable[[], Awaitable[StreamSpec]],
        *,
        after_start: Optional[Callable[[FFmpegProcess], Awaitable]] = None
    ) -> None:
```
- create_stream_spec: Coroutine function that creates the stream spec for the FFmpeg process.
  The created stream spec is set as the first argument of ffmpeg.run_async() of ffmpeg-python inside FFmpegCoroutine.
  A stream spec is "a Stream, list of Streams, or label-to-Stream dictionary mapping" in ffmpeg-python.
- after_start: Coroutine function to execute after the FFmpeg process starts.
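A usage sketch for after_start based on the signature above; the import of FFmpegProcess from the top-level asyncffmpeg package and the hook body are assumptions:

```python
import asyncio

import ffmpeg
from asyncffmpeg import FFmpegCoroutineFactory, FFmpegProcess, StreamSpec


async def create_stream_spec() -> StreamSpec:
    return ffmpeg.output(ffmpeg.input("input.mp4"), "output.mp4", c="copy")


async def report_start(ffmpeg_process: FFmpegProcess) -> None:
    # Called once the FFmpeg process has started; the running process is passed in.
    print("FFmpeg process started")


async def main() -> None:
    ffmpeg_coroutine = FFmpegCoroutineFactory.create()
    await ffmpeg_coroutine.execute(create_stream_spec, after_start=report_start)


if __name__ == "__main__":
    asyncio.run(main())
```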
This package was created with Cookiecutter and the yukihiko-shinoda/cookiecutter-pypackage project template.