glstereomix
Combine 2 input streams to produce a stereoscopic output stream. Input views are taken from the left pad and right pad respectively, and mixed according to their timelines.
If either input stream is stereoscopic, the appropriate view (left or right) is taken from each stream and placed into the output.
The multiview representation on the output is chosen according to the downstream caps.
Examples
gst-launch-1.0 -v videotestsrc pattern=ball name=left \
videotestsrc name=right glstereomix name=mix \
left. ! video/x-raw,width=640,height=480 ! glupload ! mix. \
right. ! video/x-raw,width=640,height=480 ! glupload ! mix. \
mix. ! video/x-raw'(memory:GLMemory)',multiview-mode=side-by-side ! \
queue ! glimagesink output-multiview-mode=side-by-side
Mix 2 different videotestsrc patterns into a side-by-side stereo image and display it.
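The same pipeline can also be assembled programmatically. The following is a hedged PyGObject sketch of that idea, not taken from the element documentation; the element names, the 640x480 caps and the pad-request order are illustrative assumptions rather than requirements of glstereomix.

#!/usr/bin/env python3
# Hedged sketch: the side-by-side mix from the example above, built with
# PyGObject. Names, caps and values are illustrative assumptions.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new('stereo-demo')

left_src = Gst.ElementFactory.make('videotestsrc', 'left')
right_src = Gst.ElementFactory.make('videotestsrc', 'right')
left_upload = Gst.ElementFactory.make('glupload', None)
right_upload = Gst.ElementFactory.make('glupload', None)
mix = Gst.ElementFactory.make('glstereomix', 'mix')
caps_filter = Gst.ElementFactory.make('capsfilter', None)
sink = Gst.ElementFactory.make('glimagesink', None)

Gst.util_set_object_arg(left_src, 'pattern', 'ball')
Gst.util_set_object_arg(sink, 'output-multiview-mode', 'side-by-side')

# The output multiview representation follows downstream caps, so pin it
# to side-by-side with a capsfilter on the mixer's source pad.
caps_filter.set_property('caps', Gst.Caps.from_string(
    'video/x-raw(memory:GLMemory),multiview-mode=side-by-side'))

for e in (left_src, right_src, left_upload, right_upload,
          mix, caps_filter, sink):
    pipeline.add(e)

raw_caps = Gst.Caps.from_string('video/x-raw,width=640,height=480')
left_src.link_filtered(left_upload, raw_caps)
right_src.link_filtered(right_upload, raw_caps)

# The first requested sink pad carries the left view, the second the right
# view. request_pad_simple() needs GStreamer >= 1.20; use get_request_pad()
# on older releases.
left_pad = mix.request_pad_simple('sink_%u')
right_pad = mix.request_pad_simple('sink_%u')
left_upload.get_static_pad('src').link(left_pad)
right_upload.get_static_pad('src').link(right_pad)

mix.link(caps_filter)
caps_filter.link(sink)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)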
gst-launch-1.0 -ev v4l2src name=left \
videotestsrc name=right \
glstereomix name=mix \
left. ! video/x-raw,width=640,height=480 ! glupload ! glcolorconvert ! mix. \
right. ! video/x-raw,width=640,height=480 ! glupload ! mix. \
mix. ! video/x-raw'(memory:GLMemory)',multiview-mode=top-bottom ! \
glcolorconvert ! gldownload ! queue ! x264enc ! h264parse ! \
mp4mux ! progressreport ! filesink location=output.mp4
Mix the input from a camera into the left view and a videotestsrc pattern into the right view, and encode the result as a top-bottom frame-packed H.264 video.
Hierarchy
GObject ╰──GInitiallyUnowned ╰──GstObject ╰──GstElement ╰──GstAggregator ╰──GstVideoAggregator ╰──GstGLBaseMixer ╰──GstGLMixer ╰──glstereomix
Factory details
Authors – Jan Schmidt
Classification – Filter/Effect/Video
Rank – none
Plugin – gstopengl
Package – GStreamer Base Plug-ins
Pad Templates
sink_%u
video/x-raw(memory:GLMemory):
format: RGBA
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
texture-target: 2D
video/x-raw(meta:GstVideoGLTextureUploadMeta):
format: RGBA
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw:
format: { RGBA, RGB, RGBx, BGR, BGRx, BGRA, xRGB, xBGR, ARGB, ABGR, Y444, I420, YV12, Y42B, Y41B, NV12, NV21, NV16, NV61, YUY2, UYVY, Y210, AYUV, VUYA, Y410, GRAY8, GRAY16_LE, GRAY16_BE, RGB16, BGR16, ARGB64, BGR10A2_LE, RGB10A2_LE, P010_10LE, P012_LE, P016_LE, Y212_LE, Y412_LE }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
src
video/x-raw(memory:GLMemory):
format: RGBA
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
texture-target: 2D
video/x-raw(meta:GstVideoGLTextureUploadMeta):
format: RGBA
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
video/x-raw:
format: { RGBA, RGB, RGBx, BGR, BGRx, BGRA, xRGB, xBGR, ARGB, ABGR, Y444, I420, YV12, Y42B, Y41B, NV12, NV21, NV16, NV61, YUY2, UYVY, Y210, AYUV, VUYA, Y410, GRAY8, GRAY16_LE, GRAY16_BE, RGB16, BGR16, ARGB64, BGR10A2_LE, RGB10A2_LE, P010_10LE, P012_LE, P016_LE, Y212_LE, Y412_LE }
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
Properties
downmix-mode
“downmix-mode” GstGLStereoDownmix *
Output anaglyph type to generate when downmixing to mono
Flags : Read / Write
Default value : green-magenta-dubois (0)
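Downmixing happens when downstream negotiates a mono multiview-mode; this property then selects which anaglyph is produced. A hedged sketch follows, reusing the mix and caps_filter objects from the PyGObject snippet above; the red-cyan-dubois value is assumed for illustration.

# Hedged sketch: request a mono output so the stereo pair is downmixed,
# and choose the anaglyph type ('red-cyan-dubois' assumed for illustration).
Gst.util_set_object_arg(mix, 'downmix-mode', 'red-cyan-dubois')
caps_filter.set_property('caps', Gst.Caps.from_string(
    'video/x-raw(memory:GLMemory),multiview-mode=mono'))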
latency
“latency” guint64
Additional latency in live mode to allow upstream to take longer to produce buffers for the current position (in nanoseconds)
Flags : Read / Write
Default value : 0
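For example, when one input is a live camera (as in the second example above), extra latency can be allowed on the mixer. A hedged one-line sketch, assuming the mix element from the earlier PyGObject snippet and an arbitrary 50 ms value:

# Hedged sketch: give upstream an extra 50 ms (arbitrary value) to deliver
# buffers; the property is in nanoseconds, so scale with Gst.MSECOND.
mix.set_property('latency', 50 * Gst.MSECOND)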
min-upstream-latency
“min-upstream-latency” guint64
When sources with a higher latency are expected to be plugged in dynamically after the aggregator has started playing, this allows overriding the minimum latency reported by the initial source(s). This is only taken into account when larger than the actually reported minimum latency. (nanoseconds)
Flags : Read / Write
Default value : 0
start-time
“start-time” guint64
Start time to use if start-time-selection=set
Flags : Read / Write
Default value : 18446744073709551615
start-time-selection
“start-time-selection” GstAggregatorStartTimeSelection *
Decides which start time is output
Flags : Read / Write
Default value : zero (0)
GstGLStereoMixPad
GObject ╰──GInitiallyUnowned ╰──GstObject ╰──GstPad ╰──GstAggregatorPad ╰──GstVideoAggregatorPad ╰──GstGLBaseMixerPad ╰──GstGLMixerPad ╰──GstGLStereoMixPad
Signals
buffer-consumed
buffer_consumed_callback (GstElement * param_0, GstBuffer * arg0, gpointer udata)
def buffer_consumed_callback (param_0, arg0, udata):
#python callback for the 'buffer-consumed' signal
function buffer_consumed_callback(param_0: GstElement, arg0: GstBuffer, udata: gpointer): {
// javascript callback for the 'buffer-consumed' signal
}
Parameters:
param_0 – the object that emitted the signal
arg0 – the GstBuffer that was consumed
udata – user data set when the signal handler was connected
Flags: Run First
Properties
emit-signals
“emit-signals” gboolean
Send signals to notify about data consumption
Flags : Read / Write
Default value : false
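A hedged sketch of using the buffer-consumed signal together with this property, assuming left_pad is a sink pad requested from the mixer as in the earlier PyGObject snippet:

# Hedged sketch: the pad only emits 'buffer-consumed' once emit-signals is
# enabled. The callback runs from a streaming thread, so keep it short.
def on_buffer_consumed(pad, buf):
    print('pad', pad.get_name(), 'consumed a buffer, pts =', buf.pts)

left_pad.set_property('emit-signals', True)
left_pad.connect('buffer-consumed', on_buffer_consumed)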
max-last-buffer-repeat
“max-last-buffer-repeat” guint64
Repeat the last buffer for this amount of time (in ns, -1 = until EOS); behaviour on EOS is not affected
Flags : Read / Write
Default value : 18446744073709551615
repeat-after-eos
“repeat-after-eos” gboolean
Repeat the last frame after EOS until all pads are EOS
Flags : Read / Write
Default value : false