# Basic tutorial 16: Platform-specific elements

## Goal

Even though GStreamer is a multiplatform framework, not all the elements
are available on all platforms. For example, the video sinks
depend heavily on the underlying windowing system, and a different one
needs to be selected depending on the platform. You normally do not need
to worry about this when using elements like `playbin` or
`autovideosink`, but, for those cases when you need to use one of the
sinks that are only available on specific platforms, this tutorial
points out some of their peculiarities.

## Cross Platform

### `glimagesink`

This video sink is based on
[OpenGL](http://en.wikipedia.org/wiki/OpenGL) or [OpenGL ES](https://en.wikipedia.org/wiki/OpenGL_ES). It supports rescaling
and filtering of the scaled image to alleviate aliasing. It implements
the VideoOverlay interface, so the video window can be re-parented
(embedded inside other windows). This is the video sink recommended on
most platforms. In particular, on Android and iOS, it is the only
available video sink. It can be decomposed into
`glupload ! glcolorconvert ! glimagesinkelement` to insert further OpenGL
hardware accelerated processing into the pipeline.
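As a sketch, the decomposed form can be tried directly with `gst-launch-1.0` (this assumes a machine with a display and OpenGL support; `videotestsrc` and the `gleffects` element stand in for a real source and a real processing step):

```shell
# Same result as "videotestsrc ! glimagesink", but with the stages exposed:
gst-launch-1.0 videotestsrc ! glupload ! glcolorconvert ! glimagesinkelement

# The point of decomposing the sink: GL hardware-accelerated processing
# (here the "heat" effect of gleffects) can be inserted between the stages:
gst-launch-1.0 videotestsrc ! glupload ! gleffects effect=heat ! \
    glcolorconvert ! glimagesinkelement
```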

## Linux

### `ximagesink`

A standard RGB only X-based video sink. It implements the VideoOverlay
interface, so the video window can be re-parented (embedded inside
other windows). It does not support scaling or color formats other
than RGB; these operations have to be performed upstream by other
elements (using the `videoscale` element for scaling, for example).
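A minimal sketch of such a pipeline, doing the conversion and scaling in software before the sink (the `videoconvert` element is assumed here in addition to `videoscale`):

```shell
# Convert to a format ximagesink accepts and scale on the CPU,
# since the sink itself does neither:
gst-launch-1.0 videotestsrc ! videoconvert ! videoscale ! \
    video/x-raw,width=640,height=480 ! ximagesink
```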

### `xvimagesink`

An X-based video sink, using the [X Video
Extension](http://en.wikipedia.org/wiki/X_video_extension) (Xv). It
implements the VideoOverlay interface, so the video window can be
re-parented (embedded inside other windows). It can perform scaling
efficiently, on the GPU. It is only available if the hardware and
corresponding drivers support the Xv extension.

### `alsasink`

This audio sink outputs to the sound card via
[ALSA](http://www.alsa-project.org/) (Advanced Linux Sound
Architecture). This sink is available on almost every Linux platform. It
is often seen as a “low level” interface to the sound card, and can be
complicated to configure (See the comment on
[](sdk-playback-tutorial-digital-audio-pass-through.md)).

### `pulsesink`

This sink plays audio to a [PulseAudio](http://www.pulseaudio.org/)
server. It is a higher level abstraction of the sound card than ALSA,
and is therefore easier to use and offers more advanced features. It has
been known to be unstable on some older Linux distributions, though.
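A minimal way to exercise either audio sink is a test tone (a sketch; device and server selection are left at their defaults):

```shell
# 440 Hz test tone through PulseAudio; replace pulsesink with alsasink
# to go through ALSA directly:
gst-launch-1.0 audiotestsrc freq=440 ! audioconvert ! pulsesink
```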

## Mac OS X

### `osxvideosink`

This is the video sink available to GStreamer on Mac OS X. It is also
possible to render via OpenGL using `glimagesink`.

### `osxaudiosink`

This is the only audio sink available to GStreamer on Mac OS X.

## Windows

### `directdrawsink`

This is the oldest of the Windows video sinks, based on [Direct
Draw](http://en.wikipedia.org/wiki/DirectDraw). It requires DirectX 7,
so it is available on almost every current Windows platform. It supports
rescaling and filtering of the scaled image to alleviate aliasing.

### `dshowvideosink`

This video sink is based on [Direct
Show](http://en.wikipedia.org/wiki/Direct_Show).  It can use different
rendering back-ends, like
[EVR](http://en.wikipedia.org/wiki/Enhanced_Video_Renderer),
[VMR9](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters)
or
[VMR7](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters),
EVR being available only on Windows Vista or later. It supports
rescaling and filtering of the scaled image to alleviate aliasing. It
implements the VideoOverlay interface, so the video window can be
re-parented (embedded inside other windows).

### `d3dvideosink`

This video sink is based on
[Direct3D](http://en.wikipedia.org/wiki/Direct3D) and it’s the most
recent Windows video sink. It supports rescaling and filtering of the
scaled image to alleviate aliasing. It implements the VideoOverlay
interface, so the video window can be re-parented (embedded inside other
windows).

### `directsoundsink`

This is the default audio sink for Windows, based on [Direct
Sound](http://en.wikipedia.org/wiki/DirectSound), which is available in
all Windows versions.

### `dshowdecwrapper`

[Direct Show](http://en.wikipedia.org/wiki/Direct_Show) is a multimedia
framework similar to GStreamer. The two are different enough that
their pipelines cannot be interconnected. However, through this
element, GStreamer can benefit from the decoding elements present in
Direct Show. `dshowdecwrapper` wraps multiple Direct Show decoders so
they can be embedded in a GStreamer pipeline. Use the `gst-inspect-1.0` tool
(see [](sdk-basic-tutorial-gstreamer-tools.md)) to see the
available decoders.
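As a sketch of that inspection step (the element name in the second command is a placeholder; the actual names reported vary per system):

```shell
# List the elements provided by the Direct Show wrapper plugin:
gst-inspect-1.0 dshowdecwrapper

# Then inspect one of the reported decoders in detail (placeholder name):
gst-inspect-1.0 <decoder-element-name>
```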

## Android

### `openslessink`

This is the only audio sink available to GStreamer on Android. It is
based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).

### `openslessrc`

This is the only audio source available to GStreamer on Android. It is
based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).

### `androidmedia`

[android.media.MediaCodec](http://developer.android.com/reference/android/media/MediaCodec.html)
is an Android specific API to access the codecs that are available on
the device, including hardware codecs. It is available since API level
16 (Jelly Bean), and GStreamer can use it via the androidmedia plugin
for audio and video decoding. On Android, attaching the hardware
decoder to the `glimagesink` element can produce a high performance
zero-copy decodebin pipeline.
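On Android such a pipeline is normally built programmatically rather than from the command line, but its shape can be sketched as follows (the file name is hypothetical; `decodebin` selects the androidmedia hardware decoder automatically when one is available):

```shell
# Hardware decoding via androidmedia feeding glimagesink without copies:
gst-launch-1.0 filesrc location=movie.mp4 ! decodebin ! glimagesink
```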

### `ahcsrc`

This video source captures from the cameras on Android devices. It is part
of the androidmedia plugin and uses the [android.hardware.Camera API](https://developer.android.com/reference/android/hardware/Camera.html).

## iOS

### `osxaudiosink`

This is the only audio sink available to GStreamer on iOS.

### `iosassetsrc`

Source element to read iOS assets, that is, documents stored in the
Library (like photos, music and videos). It can be instantiated
automatically by `playbin` when URIs use the
`assets-library://` scheme.

### `iosavassetsrc`

Source element to read and decode iOS audiovisual assets, that is,
documents stored in the Library (like photos, music and videos). It can
be instantiated automatically by `playbin` when URIs use the
`ipod-library://` scheme. Decoding is performed by the system, so
dedicated hardware will be used if available.

## Conclusion

This tutorial has shown a few specific details about some GStreamer
elements which are not available on all platforms. You do not have to
worry about them when using multiplatform elements like `playbin` or
`autovideosink`, but it is good to know their individual quirks when
instantiating them manually.

It has been a pleasure having you here, and see you soon!