Audio and Video Textures

This documentation assumes you have already followed the steps on how to compile from source and are now interested in adding the experimental Audio/Video features.

AUDIO stream as texture

You can load real-time audio data (built-in mic, monitor output, or an external audio device) as a texture by selecting a capture device id, for example:

glslViewer shader_audio.frag --audio 0

Where 0 is the capture device id. All available capture devices are listed when the --audio parameter is used; if no id is provided, the system default one is used. It's possible to fetch frequency spectrum data and waveform data from the texture. To do so, please refer to this example shader: examples/2D/00_tests/test_audio.frag
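As a rough illustration of what that example does, here is a minimal fragment shader sketch. It assumes the audio data is bound to u_tex0 (glslViewer's convention for the first texture) and that one row of the texture holds the frequency spectrum while another holds the waveform; the exact uniform name and row layout are assumptions, so check test_audio.frag for the real ones.

#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D u_tex0;      // assumed binding for the audio texture
uniform vec2 u_resolution;     // viewport resolution (standard glslViewer uniform)

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution.xy;

    // Assumption: one row carries the frequency spectrum, another the waveform.
    float freq = texture2D(u_tex0, vec2(st.x, 0.25)).r;
    float wave = texture2D(u_tex0, vec2(st.x, 0.75)).r;

    // Spectrum as a bar graph, waveform as a thin bright line over it.
    vec3 color = vec3(step(st.y, freq));
    color += vec3(1.0 - smoothstep(0.0, 0.02, abs(st.y - wave)));

    gl_FragColor = vec4(color, 1.0);
}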

VIDEO streams as textures

To load video streams from local files, network streams, or capture devices, just do:

glslViewer shader.frag video.mp4
glslViewer shader.frag rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov
glslViewer shader.frag /dev/video0
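
Once a stream is loaded, each decoded frame is exposed to the shader like any other texture. A minimal sketch, assuming the first texture argument is bound to u_tex0 (glslViewer's usual convention):

#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D u_tex0;    // the video stream, passed as the first texture argument
uniform vec2 u_resolution;

void main() {
    vec2 st = gl_FragCoord.xy / u_resolution.xy;
    // Sample the current frame and invert its colors, a quick visual
    // check that the texture really updates every frame.
    vec3 color = texture2D(u_tex0, st).rgb;
    gl_FragColor = vec4(1.0 - color, 1.0);
}

Save it as, say, invert.frag (a name made up for this example) and run it against any of the sources above: glslViewer invert.frag video.mp4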

Although we use ffmpeg for most of the work, video decoding works slightly differently depending on the OS:

  • macOS has some particularities when loading the available video cameras, because it's done through AVFoundation instead of a device file like /dev/videoX. Each video device in AVFoundation has an index number, which you can obtain by running ffmpeg -f avfoundation -list_devices true -i "". Then, to open that device, use the --video argument followed by the index number of the video camera you want to use. For example:
glslViewer shader.frag --video 0
  • On the RaspberryPi 4 there is no OpenMAX/MMAL support, so all the decoding is done by ffmpeg on the CPU exactly as described above, which is slow!

  • On the RaspberryPi Zero, 2, 3 and 3+ we use hardware acceleration: OpenMAX for decoding H264 files, and MMAL to read from RaspberryPi Camera streams. Both achieve surprisingly fast speeds, like 60fps at 1080p! To open the video camera stream, use the --video argument (similarly to macOS) but followed by the desired width, height, fps and rotation: --video <widthxheight>[xfps][xrotation] (note that fps and rotation are optional; a variant using them is shown right after this list). For example:

glslViewer examples/2D/00_tests/test.frag --video 1920x1080x30
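
A variant with the optional fields filled in might look like this; note that treating the rotation value as degrees is our assumption, so verify it against your build if the result looks wrong:

glslViewer examples/2D/00_tests/test.frag --video 1920x1080x30x180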

To open H264 video files on the RaspberryPi Zero/2/3/3+, just add the path to the file. For the moment it needs to be a raw H264 file (ending in the .h264 extension), otherwise it will be picked up by the CPU-based ffmpeg decoder (if it's present).

You can test with the Big Buck Bunny example shipped with all Raspbian distributions:

glslViewer examples/2D/00_tests/test.frag /opt/vc/src/hello_pi/hello_video/test.h264

You can convert any video file to raw H264 using ffmpeg in the following way:

ffmpeg -i test.mp4 -vcodec copy -an -bsf:v h264_mp4toannexb test.h264
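
Here -vcodec copy extracts the existing H264 stream without re-encoding it, -an drops the audio track, and the h264_mp4toannexb bitstream filter rewrites the stream from the MP4 container's format into the raw Annex B byte stream that the hardware decoder expects.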