Revision as of 15:40, 11 April 2016
Contents
- 1 Gstreamer
- 1.1 Play a video
- 1.2 Display live camera view in a window
- 1.3 Record video stream to a file using Motion JPEG encoding (MJPEG)
- 1.4 Video test pattern
- 1.5 capture video with transparent overlay (transparent picture in picture)
- 1.6 capture video with overlay (alpha picture in picture) version 2
- 1.7 record two video cameras into side-by-side stereo video
- 1.8 record video frames into separate files
- 1.9 playback a collection of jpeg or PNG still image files as a video (slideshow)
- 1.10 mix two video sources into one (side-by-side)
- 1.11 date and time stamps
- 1.12 interesting filters
- 1.13 gstreamer plugin documentation
- 1.14 gstreamer flip video
- 1.15 gstreamer fbdevsink "ERROR: Pipeline doesn't want to pause."
- 1.16 gstreamer xvimagesink "ERROR: ... Could not initialize Xv output"
- 1.17 playback video on framebuffer (/dev/fbdev0)
Gstreamer
Gstreamer is one of the best tools in Linux for handling video. It comes with a command-line tool that allows you to build almost any type of video-processing pipeline that you could build with the Gstreamer API.
See also this Gstreamer cheat sheet.
Play a video
This will figure out almost any video format and play it. The magic happens in decodebin2.
gst-launch filesrc location=video.mov ! decodebin2 ! autovideosink
Display live camera view in a window
This will display the live camera view in a window:
gst-launch v4l2src ! ffmpegcolorspace ! autovideosink
# or
gst-launch v4l2src ! ffmpegcolorspace ! xvimagesink
Display live camera view with specified size and framerate
This will display the camera view in a window with a specified size (320x240) and framerate (30 fps). Note that the device is explicitly specified. The ffmpegcolorspace element appears to be optional. I am not sure if removing it will cause the command to fail in some environments.
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! xvimagesink
# or
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! xvimagesink
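If you are not sure which sizes and framerates your camera will accept, the v4l2-ctl tool can list them. This is a sketch, assuming the v4l2-ctl utility (from the v4l-utils package) is installed and that the camera is /dev/video0:

```shell
# List every pixel format, frame size, and framerate interval the
# camera driver advertises. Guarded so it does nothing if the tool
# or the device is missing.
if command -v v4l2-ctl >/dev/null 2>&1 && [ -e /dev/video0 ]; then
    v4l2-ctl --device=/dev/video0 --list-formats-ext
fi
```

Any width/height/framerate combination it reports should be safe to put in the caps string above.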
Add timecode overlay
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! timeoverlay ! ffmpegcolorspace ! autovideosink
Display live Xwindows display in a window on the Xwindows display. Yes.
gst-launch -v ximagesrc ! ffmpegcolorspace ! videoscale ! "video/x-raw-yuv,width=640,height=480,framerate=10/1" ! ffmpegcolorspace ! xvimagesink
Record video stream to a file using Motion JPEG encoding (MJPEG)
Note that the --eos-on-shutdown option creates a cleanly terminated video file when you hit CTRL-C.
gst-launch --eos-on-shutdown v4l2src ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=video.mov
Record with specified size and framerate
gst-launch --eos-on-shutdown v4l2src device="/dev/video0" ! video/x-raw-yuv,width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=video.mov
Record individual frames -- Need a still camera snapshot mode
I have not figured out a nice way to simply use a v4l2 video camera to take still-camera-style snapshots, but the following can be used as a hack. Basically, set your camera to the highest resolution it will allow without specifying the framerate (it should use a sensible default), record the video to a file, then post-process the file to stack the individual frames into a single frame. If you are using a USB2 camera then using the highest resolution will also default to a slow frame rate, since the bandwidth of USB2 will be the limiting factor. Note that stacking refers to any number of algorithms that can be used to "average" frames together into a single frame. I most often use a median filter to combine frames -- usually superior to a simple arithmetic average. There are far more sophisticated stacking algorithms, but they would probably not add much to this application.
Note that since I am using autovideosink for test output, these settings don't really show much.
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=1920,height=1080' ! ffmpegcolorspace ! autovideosink
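The stacking step itself can be done outside Gstreamer. The sketch below is one way to do it, assuming ImageMagick is installed and the recorded video has already been split into numbered PNG frames (see the section on recording video frames into separate files); the -evaluate-sequence median operator combines all frames with a per-pixel median filter.

```shell
# Stack all numbered frames into a single image using a per-pixel
# median. Guarded so it does nothing if ImageMagick's convert is
# missing or no frames are present. Filenames are examples.
if command -v convert >/dev/null 2>&1 && ls frame*.png >/dev/null 2>&1; then
    convert frame*.png -evaluate-sequence median stacked.png
fi
```

The resulting stacked.png is the simulated long-exposure still.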
Improve color and bitdepth
Some cameras allow better bitdepth for color. This could help when simulating a still camera. Note that since I am using autovideosink for test output, bpp and depth changes would not be visible.
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480,bpp=16,depth=16' ! ffmpegcolorspace ! autovideosink
Display higher than allowed framerate (BROKEN)
This is an intriguing note I once made, but in going back to test it I noticed that it does not work, and I don't actually understand how it was supposed to work. I may have left this unfinished, or it may have worked on some other camera hardware.
This will capture at a higher framerate, but it will undersample the pixels so you will get an effective 320x240 resolution.
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480,framerate=60/1' ! xvimagesink
Record video and watch live camera view at the same time
Recording video to a file and displaying the stream at the same time requires tee to split the stream.
Notice how the name parameter of tee refers to a label defined at the end of the command-line. Also notice how queue is necessary so that xvimagesink will run in parallel with the filesink. Without this you would record video but you would see a window with a single frozen frame of video.
gst-launch --eos-on-shutdown v4l2src ! ffmpegcolorspace ! tee name=my_videosink ! jpegenc ! avimux ! filesink location=video.mov my_videosink. ! queue ! autovideosink
Video test pattern
This will display a simple test pattern. Other patterns are available.
gst-launch videotestsrc ! ffmpegcolorspace ! autovideosink
capture video with transparent overlay (transparent picture in picture)
This will display the live camera view with a small semi-transparent overlay of random snow.
gst-launch \
  videomixer name=mix sink_0::zorder=1 \
    sink_1::xpos=160 sink_1::ypos=120 sink_1::alpha=0.2 sink_1::zorder=2 ! \
  ffmpegcolorspace ! autovideosink \
  v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640 ! ffmpegcolorspace ! mix.sink_0 \
  videotestsrc pattern="snow" ! video/x-raw-yuv, framerate=10/1, width=320, height=240 ! mix.sink_1
capture video with overlay (alpha picture in picture) version 2
This demonstrates "green screen" or "chroma-key" compositing of multiple video channels.
This will display a static PNG image in the foreground. Anywhere the static PNG image has alpha transparency the live video will show through behind the static image.
The TV.png file below will work, as will any PNG image with transparency (an alpha channel). The still image on the right shows what the video display window looks like after running this command.
gst-launch \
  videomixer name=mix sink_0::zorder=1 sink_1::alpha=1.0 sink_1::zorder=2 ! ffmpegcolorspace ! xvimagesink \
  v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! mix.sink_0 \
  filesrc location=TV.png ! pngdec ! alphacolor ! ffmpegcolorspace ! imagefreeze ! mix.sink_1
record two video cameras into side-by-side stereo video
gst-launch \
  v4l2src device=/dev/video1 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox border-alpha=0 left=-320 ! videomixer name=mixme ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=sbs-3d-video.mov \
  v4l2src device=/dev/video2 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox right=-320 ! mixme.
Using mplayer you can play side-by-side stereo video as anaglyphic stereo video. For more information see http://www.noah.org/wiki/Mplayer_notes#Play_side-by-side_stereo_3D_video_as_an_anaglyph.
mplayer -vf stereo3d,scale -idx sbs-3d-video.mov -loop 0
record video frames into separate files
record from device to separate PNG files
gst-launch --eos-on-shutdown v4l2src device=/dev/video1 ! video/x-raw-yuv,format=\(fourcc\)YUY2,width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! videorate ! video/x-raw-rgb,framerate=30/1 ! ffmpegcolorspace ! pngenc snapshot=false ! multifilesink location="frame%05d.png"
record from video file to separate PNG files
mkdir test
gst-launch -v filesrc location=test.avi ! avidemux ! ffdec_huffyuv ! ffmpegcolorspace ! pngenc snapshot=false ! multifilesink location="frame%05d.png"
playback a collection of jpeg or PNG still image files as a video (slideshow)
The most annoying issue is that you can't use a * wildcard.
gst-launch multifilesrc location="frame%05d.png" ! image/png,framerate=5/1 ! pngdec ! videorate ! video/x-raw-rgb,framerate=5/1 ! ffmpegcolorspace ! xvimagesink
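Since multifilesrc needs a printf-style sequential pattern rather than a wildcard, one workaround is to renumber an arbitrary pile of images into that pattern first. This is a sketch using only standard shell tools; the directory name is an example, and it assumes the directory does not already contain files matching the target names.

```shell
# Rename every PNG in a directory to the sequential frame%05d.png
# pattern that multifilesrc expects.
renumber_frames() {
    dir=$1
    i=0
    for f in "$dir"/*.png; do
        [ -e "$f" ] || continue   # the glob matched nothing
        mv -- "$f" "$(printf '%s/frame%05d.png' "$dir" "$i")"
        i=$((i + 1))
    done
}
renumber_frames ./frames
```

After renumbering, the multifilesrc pipeline above can read the files directly.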
The following works great for JPEG images. This is very robust: it will skip errors when trying to decode a broken JPEG image, which makes it easy to just dump in a whole bunch of unrelated images without having the stream die whenever it hits a bad image.
gst-launch multifilesrc location="image%05d.jpg" ! jpegdec max-errors=-1 ! videoscale ! ffmpegcolorspace ! autovideosink
mix two video sources into one (side-by-side)
gst-launch \
  v4l2src device=/dev/video1 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox border-alpha=0 left=-320 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink \
  v4l2src device=/dev/video2 ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=320, height=240 ! videobox right=-320 ! mix.
date and time stamps
The timeoverlay filter adds the frame buffer time to each video frame. The clockoverlay filter adds the date and time to each video frame.
gst-launch -e v4l2src device=/dev/video1 ! video/x-raw-yuv,format=\(fourcc\)YUY2,width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! timeoverlay shadow=false halignment=right valignment=bottom font-desc="sanserif 10" ypad=5 xpad=5 ! clockoverlay shadow=false halignment=left valignment=bottom time-format="%Y-%m-%d %H:%M:%S" font-desc="sanserif 10" ypad=5 xpad=5 ! videorate ! video/x-raw-rgb,framerate=30/1 ! ffmpegcolorspace ! xvimagesink
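The time-format property of clockoverlay appears to take strftime-style conversion specifiers, so you can preview a format string with date(1) before putting it in a pipeline:

```shell
# Preview a clockoverlay time-format string with date(1);
# this format matches the one used in the pipeline above.
date +"%Y-%m-%d %H:%M:%S"
```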
interesting filters
fpsdisplaysink
gstreamer plugin documentation
Many Gstreamer plugins lack good documentation. You can find internal descriptions of plugins and their properties by using gst-inspect. For example:
gst-inspect timeoverlay
gst-inspect clockoverlay
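Running gst-inspect with no arguments lists every installed element with a one-line description, which makes it easy to search for plugins by keyword. A sketch (the keyword is an example):

```shell
# List all installed elements and search the descriptions for a
# keyword -- here, anything overlay-related. Guarded so it does
# nothing if gst-inspect is not installed.
if command -v gst-inspect >/dev/null 2>&1; then
    gst-inspect | grep -i overlay || true
fi
```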
gstreamer flip video
This is handy if the camera is mounted upside-down or sideways. Upside-down:
gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240,framerate=30/1 ! videoflip method=rotate-180 ! xvimagesink
Sideways:
gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240,framerate=30/1 ! videoflip method=clockwise ! xvimagesink
# or
gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240,framerate=30/1 ! videoflip method=counterclockwise ! xvimagesink
gstreamer fbdevsink "ERROR: Pipeline doesn't want to pause."
If you are trying to use the framebuffer device for video playback then you may get an error like the one below. This is a permissions problem. Try adding sudo in front of the pipeline, or run the command as root.
$ gst-launch videotestsrc ! ffmpegcolorspace ! fbdevsink
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
Setting pipeline to NULL ...
Freeing pipeline ...
gstreamer xvimagesink "ERROR: ... Could not initialize Xv output"
This happens if your X11 installation does not support Xv. This is a common problem when working with a virtual machine. Try using ximagesink instead of xvimagesink.
playback video on framebuffer (/dev/fb0)
sudo gst-launch uridecodebin uri=file:///home/noah/Videos/ct_scan_sample.flv ! ffmpegcolorspace ! fbdevsink
You can also use mplayer to play video on a framebuffer device. Be sure to specify fbdev2 if you want color.
sudo mplayer -vo fbdev2 ct_scan_sample.flv