commit    1c926934ab2873ddf909dfa0ae894c34666ea114 (patch)
tree      aba4eaea37301e9fec7b88c6ccaaaa8992ea5bfd
parent    208c456f816bb2782cc5c47c5024e88479287c0c (diff)
author    Thibault Saunier <tsaunier@gnome.org>  2016-06-17 18:41:07 -0400
committer Thibault Saunier <tsaunier@gnome.org>  2016-06-17 18:42:07 -0400
Avoid having several 'h1' titles per page

Each page has one title and it looks better that way.
 index.md                                                    |  2
 manual-autoplugging.md                                      |  6
 manual-bins.md                                              |  8
 manual-buffering.md                                         | 12
 manual-bus.md                                               |  4
 manual-checklist-element.md                                 | 12
 manual-clocks.md                                            | 16
 manual-compiling.md                                         |  2
 manual-data.md                                              |  4
 manual-dataaccess.md                                        | 28
 manual-dparams.md                                           |  4
 manual-elements.md                                          | 22
 manual-helloworld.md                                        |  6
 manual-index.md                                             | 10
 manual-init.md                                              |  4
 manual-interfaces.md                                        |  6
 manual-intgration.md                                        | 14
 manual-intro-basics.md                                      |  8
 manual-licensing.md                                         |  2
 manual-metadata.md                                          |  4
 manual-motivation.md                                        | 14
 manual-pads.md                                              | 20
 manual-playback-components.md                               |  8
 manual-porting-1.0.md                                       |  2
 manual-porting.md                                           |  2
 manual-programs.md                                          | 14
 manual-queryevents.md                                       |  4
 manual-threads.md                                           |  8
 pwg-advanced-clock.md                                       | 20
 pwg-advanced-events.md                                      | 30
 pwg-advanced-interfaces.md                                  | 10
 pwg-advanced-qos.md                                         | 12
 pwg-advanced-request.md                                     |  4
 pwg-advanced-tagging.md                                     |  6
 pwg-allocation.md                                           | 36
 pwg-building-boiler.md                                      | 14
 pwg-building-types.md                                       |  6
 pwg-checklist-element.md                                    |  8
 pwg-dparams.md                                              |  8
 pwg-intro-basics.md                                         | 12
 pwg-intro-preface.md                                        |  8
 pwg-licensing-advisory.md                                   |  2
 pwg-negotiation.md                                          | 16
 pwg-other-base.md                                           | 12
 pwg-porting.md                                              |  2
 pwg-scheduling.md                                           |  6
 pwg-statemanage-states.md                                   |  2
 sdk-android-tutorial-a-running-pipeline.md                  | 40
 sdk-android-tutorial-media-player.md                        | 42
 sdk-android-tutorial-video.md                               | 22
 sdk-basic-media-player.md                                   | 14
 sdk-installing-for-ios-development.md                       | 10
 sdk-installing-on-linux.md                                  | 14
 sdk-ios-tutorial-video.md                                   | 14
 sdk-legal-information.md                                    | 28
 sdk-playback-tutorial-hardware-accelerated-video-decoding.md|  8
 sdk-qt-gstreamer-vs-c-gstreamer.md                          | 14
 sdk-qt-tutorials.md                                         |  2
 sdk-using-appsink-appsrc-in-qt.md                           |  8
 59 files changed, 337 insertions(+), 339 deletions(-)
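The patch below is a mechanical demotion of Markdown ATX headings: each page keeps a single `#` (h1) title, and every other heading moves down one level (`#` becomes `##`, `##` becomes `###`, and so on). A minimal sketch of such a pass — purely illustrative, since the actual edit may have been made by hand; `demote_headings` is a hypothetical helper, not part of any GStreamer tooling:

```python
import re

def demote_headings(text):
    """Demote every Markdown ATX heading by one level, except the
    first top-level '# ' heading, which stays the page's only h1."""
    out = []
    kept_h1 = False
    for line in text.splitlines():
        m = re.match(r"(#+) ", line)
        if m:
            if m.group(1) == "#" and not kept_h1:
                kept_h1 = True      # keep the first h1 as-is
            else:
                line = "#" + line   # '# Foo' -> '## Foo', '## Bar' -> '### Bar'
        out.append(line)
    return "\n".join(out)

sample = "# Page title\n\nIntro.\n\n# Section\n\n## Subsection\n"
print(demote_headings(sample))
```

Note that this naive version would also touch `#` lines inside fenced code blocks (several of these pages contain ASCII diagrams and shell snippets), so a real pass would need to skip fenced regions.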
diff --git a/index.md b/index.md
index eed8257..b879c79 100644
--- a/index.md
+++ b/index.md
@@ -1,7 +1,5 @@
# GStreamer documentation
-## Welcome to the GStreamer documentation!
-
Feel free to jump straight to the download section, start practicing
with the tutorials, or read the F.A.Q. if you don’t know what this is
all about.
diff --git a/manual-autoplugging.md b/manual-autoplugging.md
index f49c33c..60d65de 100644
--- a/manual-autoplugging.md
+++ b/manual-autoplugging.md
@@ -29,7 +29,7 @@ Lastly, we will explain how autoplugging and the GStreamer registry can
be used to setup a pipeline that will convert media from one mediatype
to another, for example for media decoding.
-# Media types as a way to identify streams
+## Media types as a way to identify streams
We have previously introduced the concept of capabilities as a way for
elements (or, rather, pads) to agree on a media type when streaming data
@@ -62,7 +62,7 @@ Now that we have an idea how GStreamer identifies known media streams,
we can look at methods GStreamer uses to setup pipelines for media
handling and for media type detection.
-# Media stream type detection
+## Media stream type detection
Usually, when loading a media stream, the type of the stream is not
known. This means that before we can choose a pipeline to decode the
@@ -181,7 +181,7 @@ Once a media type has been detected, you can plug an element (e.g. a
demuxer or decoder) to the source pad of the typefind element, and
decoding of the media stream will start right after.
-# Dynamically autoplugging a pipeline
+## Dynamically autoplugging a pipeline
See [Playback Components](manual-playback-components.md) for using the
high level object that you can use to dynamically construct pipelines.
diff --git a/manual-bins.md b/manual-bins.md
index 303ee34..bf44397 100644
--- a/manual-bins.md
+++ b/manual-bins.md
@@ -9,7 +9,7 @@ is an element itself, a bin can be handled in the same way as any other
element. Therefore, the whole previous chapter
([Elements](manual-elements.md)) applies to bins as well.
-# What are bins
+## What are bins
Bins allow you to combine a group of linked elements into one logical
element. You do not deal with the individual elements anymore but with
@@ -31,7 +31,7 @@ programmer:
bus messages of the contained elements. The toplevel bin has to be a
pipeline, every application thus needs at least one of these.
-# Creating a bin
+## Creating a bin
Bins are created in the same way that other elements are created, i.e.
using an element factory. There are also convenience functions available
@@ -81,7 +81,7 @@ the function `gst_bin_iterate_elements ()`. See the API references of
[`GstBin`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/GstBin.html)
for details.
-# Custom bins
+## Custom bins
The application programmer can create custom bins packed with elements
to perform a specific task. This allows you, for example, to write an
@@ -121,7 +121,7 @@ Guide](http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.h
Examples of such custom bins are the playbin and uridecodebin elements
from[gst-plugins-base](http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/index.html).
-# Bins manage states of their children
+## Bins manage states of their children
Bins manage the state of all elements contained in them. If you set a
bin (or a pipeline, which is a special top-level type of bin) to a
diff --git a/manual-buffering.md b/manual-buffering.md
index f885c29..c0e2fd0 100644
--- a/manual-buffering.md
+++ b/manual-buffering.md
@@ -76,7 +76,7 @@ strategies](#buffering-strategies).
```
-# Stream buffering
+## Stream buffering
```
+---------+ +---------+ +-------+
@@ -144,7 +144,7 @@ Another advantage of doing the buffering at a later stage is that you
can let the demuxer operate in pull mode. When reading data from a slow
network drive (with filesrc) this can be an interesting way to buffer.
-# Download buffering
+## Download buffering
```
+---------+ +---------+ +-------+
@@ -172,7 +172,7 @@ application control the buffering in a more intelligent way, using the
BUFFERING query, for example. See [Buffering
strategies](#buffering-strategies).
-# Timeshift buffering
+## Timeshift buffering
```
+---------+ +---------+ +-------+
@@ -192,7 +192,7 @@ This mode is suitable for all live streams. As with the incremental
download mode, buffering messages are emitted along with an indication
that timeshifting download is in progress.
-# Live buffering
+## Live buffering
In live pipelines we usually introduce some fixed latency between the
capture and the playback elements. This latency can be introduced by a
@@ -203,12 +203,12 @@ serve as an indication to the user of the latency buffering. The
application usually does not react to these buffering messages with a
state change.
-# Buffering strategies
+## Buffering strategies
What follows are some ideas for implementing different buffering
strategies based on the buffering messages and buffering query.
-## No-rebuffer strategy
+### No-rebuffer strategy
We would like to buffer enough data in the pipeline so that playback
continues without interruptions. What we need to know to implement this
diff --git a/manual-bus.md b/manual-bus.md
index 215585d..6eb7fad 100644
--- a/manual-bus.md
+++ b/manual-bus.md
@@ -17,7 +17,7 @@ object. When the mainloop is running, the bus will periodically be
checked for new messages, and the callback will be called when any
message is available.
-# How to use a bus
+## How to use a bus
There are two different ways to use a bus:
@@ -84,7 +84,7 @@ handler that wakes up the custom mainloop and that uses
[documentation](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/GstBus.html)
for details)
-# Message types
+## Message types
GStreamer has a few pre-defined message types that can be passed over
the bus. The messages are extensible, however. Plug-ins can define
diff --git a/manual-checklist-element.md b/manual-checklist-element.md
index dc972ab..efb21a5 100644
--- a/manual-checklist-element.md
+++ b/manual-checklist-element.md
@@ -12,7 +12,7 @@ applications. Also, we will touch upon how to acquire knowledge about
plugins and elements and how to test simple pipelines before building
applications around them.
-# Good programming habits
+## Good programming habits
- Always add a `GstBus` handler to your pipeline. Always report errors
in your application, and try to do something with warnings and
@@ -36,7 +36,7 @@ applications around them.
- Report all bugs that you find in GStreamer bugzilla at
[http://bugzilla.gnome.org/](http://bugzilla.gnome.org).
-# Debugging
+## Debugging
Applications can make use of the extensive GStreamer debugging system to
debug pipeline problems. Elements will write output to this system to
@@ -85,7 +85,7 @@ a list of all available options:
- `--gst-plugin-spew` enables printout of errors while loading
GStreamer plugins.
-# Conversion plugins
+## Conversion plugins
GStreamer contains a bunch of conversion plugins that most applications
will find useful. Specifically, those are videoscalers (videoscale),
@@ -96,13 +96,13 @@ they will act in passthrough mode. They will activate when the hardware
doesn't support a specific request, though. All applications are
recommended to use those elements.
-# Utility applications provided with GStreamer
+## Utility applications provided with GStreamer
GStreamer comes with a default set of command-line utilities that can
help in application development. We will discuss only `gst-launch` and
`gst-inspect` here.
-## `gst-launch`
+### `gst-launch`
`gst-launch` is a simple script-like commandline application that can be
used to test pipelines. For example, the command `gst-launch
@@ -121,7 +121,7 @@ d. ! queue ! vorbisdec ! audioconvert ! audioresample ! alsasink
audio-stream. You can also use autopluggers such as decodebin on the
commandline. See the manual page of `gst-launch` for more information.
-## `gst-inspect`
+### `gst-inspect`
`gst-inspect` can be used to inspect all properties, signals, dynamic
parameters and the object hierarchy of an element. This can be very
diff --git a/manual-clocks.md b/manual-clocks.md
index 80c159a..22cc332 100644
--- a/manual-clocks.md
+++ b/manual-clocks.md
@@ -37,7 +37,7 @@ GStreamer uses a `GstClock` object, buffer timestamps and a SEGMENT
event to synchronize streams in a pipeline as we will see in the next
sections.
-# Clock running-time
+## Clock running-time
In a typical computer, there are many sources that can be used as a time
source, e.g., the system time, soundcards, CPU performance counters, ...
@@ -66,7 +66,7 @@ Because all objects in the pipeline have the same clock and base-time,
they can thus all calculate the running-time according to the pipeline
clock.
-# Buffer running-time
+## Buffer running-time
To calculate a buffer running-time, we need a buffer timestamp and the
SEGMENT event that preceeded the buffer. First we can convert the
@@ -87,7 +87,7 @@ running-time of 0.
Live sources need to timestamp buffers with a running-time matching the
pipeline running-time when the first byte of the buffer was captured.
-# Buffer stream-time
+## Buffer stream-time
The buffer stream-time, also known as the position in the stream, is
calculated from the buffer timestamps and the preceding SEGMENT event.
@@ -105,7 +105,7 @@ The stream-time is used in:
The stream-time is never used to synchronize streams, this is only done
with the running-time.
-# Time overview
+## Time overview
Here is an overview of the various timelines used in GStreamer.
@@ -120,7 +120,7 @@ running-time is equal to the clock-time - base-time. The stream-time
represents the position in the stream and jumps backwards when
repeating.
-# Clock providers
+## Clock providers
A clock provider is an element in the pipeline that can provide a
`GstClock` object. The clock object needs to report an absolute-time
@@ -160,7 +160,7 @@ provider is removed from the pipeline, a CLOCK\_LOST message is posted
and the application should go to PAUSED and back to PLAYING to select a
new clock.
-# Latency
+## Latency
The latency is the time it takes for a sample captured at timestamp X to
reach the sink. This time is measured against the clock in the pipeline.
@@ -177,7 +177,7 @@ clock is now \>= 1 second, the sink will drop this buffer because it is
too late. Without any latency compensation in the sink, all buffers will
be dropped.
-## Latency compensation
+### Latency compensation
Before the pipeline goes to the PLAYING state, it will, in addition to
selecting a clock and calculating a base-time, calculate the latency in
@@ -189,7 +189,7 @@ All sink elements will delay playback by the value in the LATENCY event.
Since all sinks delay with the same amount of time, they will be
relative in sync.
-## Dynamic Latency
+### Dynamic Latency
Adding/removing elements to/from a pipeline or changing element
properties can change the latency in a pipeline. An element can request
diff --git a/manual-compiling.md b/manual-compiling.md
index 52e2b39..a9451f0 100644
--- a/manual-compiling.md
+++ b/manual-compiling.md
@@ -7,7 +7,7 @@ title: Compiling
This section talks about the different things you can do when building
and shipping your applications and plugins.
-# Embedding static elements in your application
+## Embedding static elements in your application
The [Plugin Writer's
Guide](http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.html)
diff --git a/manual-data.md b/manual-data.md
index 658979a..5844979 100644
--- a/manual-data.md
+++ b/manual-data.md
@@ -11,7 +11,7 @@ notifiers. All this will flow through the pipeline automatically when
it's running. This chapter is mostly meant to explain the concept to
you; you don't need to do anything for this.
-# Buffers
+## Buffers
Buffers contain the data that will flow through the pipeline you have
created. A source element will typically create a new buffer and pass it
@@ -44,7 +44,7 @@ in-place, i.e. without allocating a new one. Elements can also write to
hardware memory (such as from video-capture sources) or memory allocated
from the X-server (using XShm). Buffers can be read-only, and so on.
-# Events
+## Events
Events are control particles that are sent both up- and downstream in a
pipeline along with buffers. Downstream events notify fellow elements of
diff --git a/manual-dataaccess.md b/manual-dataaccess.md
index d9bd604..3b2cec7 100644
--- a/manual-dataaccess.md
+++ b/manual-dataaccess.md
@@ -14,7 +14,7 @@ a pipeline from your application, how to read data from a pipeline, how
to manipulate the pipeline's speed, length, starting point and how to
listen to a pipeline's data processing.
-# Using probes
+## Using probes
Probing is best envisioned as a pad listener. Technically, a probe is
nothing more than a callback that can be attached to a pad. You can
@@ -96,7 +96,7 @@ The probe can notify you of the following activity on pads:
[Dynamically changing the
pipeline](#dynamically-changing-the-pipeline).
-## Data probes
+### Data probes
Data probes allow you to be notified when there is data passing on a
pad. When adding the probe, specify the GST\_PAD\_PROBE\_TYPE\_BUFFER
@@ -258,7 +258,7 @@ The identity element also provides a few useful debugging tools like the
passing the '-v' switch to gst-launch and by setting the silent property
on the identity to FALSE).
-## Play a region of a media file
+### Play a region of a media file
In this example we will show you how to play back a region of a media
file. The goal is to only play the part of a file from 2 seconds to 5
@@ -481,7 +481,7 @@ this second block to remove the probes. Then we set the pipeline to
PLAYING and it should play from 2 to 5 seconds, then EOS and exit the
application.
-# Manually adding or removing data from/to a pipeline
+## Manually adding or removing data from/to a pipeline
Many people have expressed the wish to use their own sources to inject
data into a pipeline. Some people have also expressed the wish to grab
@@ -508,7 +508,7 @@ GObject (action) signals and properties. The same API is also available
as a regular C api. The C api is more performant but requires you to
link to the app library in order to use the elements.
-## Inserting data with appsrc
+### Inserting data with appsrc
First we look at some examples for appsrc, which lets you insert data
into the pipeline from the application. Appsrc has some configuration
@@ -582,7 +582,7 @@ When the last byte is pushed into appsrc, you must call
These signals allow the application to operate appsrc in push and pull
mode as will be explained next.
-### Using appsrc in push mode
+#### Using appsrc in push mode
When appsrc is configured in push mode (stream-type is stream or
seekable), the application repeatedly calls the push-buffer method with
@@ -599,7 +599,7 @@ seek-data callback.
Use this model when implementing various network protocols or hardware
devices.
-### Using appsrc in pull mode
+#### Using appsrc in pull mode
In the pull model, data is fed to appsrc from the need-data signal
handler. You should push exactly the amount of bytes requested in the
@@ -608,7 +608,7 @@ at the end of the stream.
Use this model for file access or other randomly accessable sources.
-### Appsrc example
+#### Appsrc example
This example application will generate black/white (it switches every
second) video to an Xv-window output by using appsrc as a source with
@@ -710,7 +710,7 @@ main (gint argc,
```
-## Grabbing data with appsink
+### Grabbing data with appsink
Unlike appsrc, appsink is a little easier to use. It also supports a
pull and push based model of getting data from the pipeline.
@@ -761,7 +761,7 @@ Consider configuring the following properties in the appsink:
caps on appsink. You must still check the `GstSample` to get the
actual caps of the buffer.
-### Appsink example
+#### Appsink example
What follows is an example on how to capture a snapshot of a video
stream using appsink.
@@ -913,7 +913,7 @@ main (int argc, char *argv[])
```
-# Forcing a format
+## Forcing a format
Sometimes you'll want to set a specific format, for example a video size
and format or an audio bitsize and number of channels. You can do this
@@ -925,7 +925,7 @@ types matching that specified capability set for negotiation. See also
[Creating capabilities for
filtering](manual-pads.md#creating-capabilities-for-filtering).
-## Changing format in a PLAYING pipeline
+### Changing format in a PLAYING pipeline
It is also possible to dynamically change the format in a pipeline while
PLAYING. This can simply be done by changing the caps property on a
@@ -1027,7 +1027,7 @@ It is possible to set multiple caps for the capsfilter separated with a
;. The capsfilter will try to renegotiate to the first possible format
from the list.
-# Dynamically changing the pipeline
+## Dynamically changing the pipeline
In this section we talk about some techniques for dynamically modifying
the pipeline. We are talking specifically about changing the pipeline
@@ -1116,7 +1116,7 @@ modification but it requires you to know a bit of details before you can
do this without causing pipeline errors. In the following sections we
will demonstrate a couple of typical use-cases.
-## Changing elements in a pipeline
+### Changing elements in a pipeline
In the next example we look at the following chain of elements:
diff --git a/manual-dparams.md b/manual-dparams.md
index 4382d06..023182f 100644
--- a/manual-dparams.md
+++ b/manual-dparams.md
@@ -4,7 +4,7 @@ title: Dynamic Controllable Parameters
# Dynamic Controllable Parameters
-# Getting Started
+## Getting Started
The controller subsystem offers a lightweight way to adjust gobject
properties over stream-time. Normally these properties are changed using
@@ -39,7 +39,7 @@ Your application should link to the shared library
`gstreamer-controller`. One can get the required flag for compiler and
linker by using pkg-config for gstreamer-controller-1.0.
-# Setting up parameter control
+## Setting up parameter control
If we have our pipeline set up and want to control some parameters, we
first need to create a control-source. Lets use an interpolation
diff --git a/manual-elements.md b/manual-elements.md
index e11f72d..7f733ff 100644
--- a/manual-elements.md
+++ b/manual-elements.md
@@ -12,7 +12,7 @@ the different high-level components you will use are derived from
`GstElement`. Every decoder, encoder, demuxer, video or audio output is
in fact a `GstElement`
-# What are elements?
+## What are elements?
For the application programmer, elements are best visualized as black
boxes. On the one end, you might put something in, the element does
@@ -22,7 +22,7 @@ would output decoded data. In the next chapter (see [Pads and
capabilities](manual-pads.md)), you will learn more about data input
and output in elements, and how you can set that up in your application.
-## Source elements
+### Source elements
Source elements generate data for use by a pipeline, for example reading
from disk or from a sound card. [Visualisation of a source
@@ -36,7 +36,7 @@ Source elements do not accept data, they only generate data. You can see
this in the figure because it only has a source pad (on the right). A
source pad can only generate data.
-## Filters, convertors, demuxers, muxers and codecs
+### Filters, convertors, demuxers, muxers and codecs
Filters and filter-like elements have both input and outputs pads. They
operate on data that they receive on their input (sink) pads, and will
@@ -70,7 +70,7 @@ contain the elementary audio stream. Demuxers will generally fire
signals when a new pad is created. The application programmer can then
handle the new elementary stream in the signal handler.
-## Sink elements
+### Sink elements
Sink elements are end points in a media pipeline. They accept data but
do not produce anything. Disk writing, soundcard playback, and video
@@ -79,7 +79,7 @@ sink element](#visualisation-of-a-sink-element) shows a sink element.
![Visualisation of a sink element](images/sink-element.png "fig:")
-# Creating a `GstElement`
+## Creating a `GstElement`
The simplest way to create an element is to use
[`gst_element_factory_make
@@ -175,7 +175,7 @@ main (int argc,
```
-# Using an element as a `GObject`
+## Using an element as a `GObject`
A
[`GstElement`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/GstElement.html)
@@ -239,7 +239,7 @@ callback mechanism. Here, too, you can use `gst-inspect` to see which
signals a specific element supports. Together, signals and properties
are the most basic way in which elements and applications interact.
-# More about element factories
+## More about element factories
In the previous section, we briefly introduced the
[`GstElementFactory`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/GstElementFactory.html)
@@ -250,7 +250,7 @@ plugins and elements that GStreamer can create. This means that element
factories are useful for automated element instancing, such as what
autopluggers do, and for creating lists of available elements.
-## Getting information about an element using a factory
+### Getting information about an element using a factory
Tools like `gst-inspect` will provide some generic information about an
element, such as the person that wrote the plugin, a descriptive name
@@ -298,7 +298,7 @@ main (int argc,
You can use `gst_registry_pool_feature_list (GST_TYPE_ELEMENT_FACTORY)`
to get a list of all the element factories that GStreamer knows about.
-## Finding out what pads an element can contain
+### Finding out what pads an element can contain
Perhaps the most powerful feature of element factories is that they
contain a full description of the pads that the element can generate,
@@ -311,7 +311,7 @@ this way. We'll look closer at these features as we learn about `GstPad`
and `GstCaps` in the next chapter: [Pads and
capabilities](manual-pads.md)
-# Linking elements
+## Linking elements
By linking a source element with zero or more filter-like elements and
finally a sink element, you set up a media pipeline. Data will flow
@@ -382,7 +382,7 @@ the same bin or pipeline; if you want to link elements or pads at
different hierarchy levels, you will need to use ghost pads (more about
ghost pads later, see [Ghost pads](manual-pads.md#ghost-pads)).
-# Element States
+## Element States
After being created, an element will not actually perform any actions
yet. You need to change elements state to make it do something.
diff --git a/manual-helloworld.md b/manual-helloworld.md
index f9e2d93..b41bf6b 100644
--- a/manual-helloworld.md
+++ b/manual-helloworld.md
@@ -10,7 +10,7 @@ including initializing libraries, creating elements, packing elements
together in a pipeline and playing this pipeline. By doing all this, you
will be able to build a simple Ogg/Vorbis audio player.
-# Hello world
+## Hello world
We're going to create a simple first application, a simple Ogg/Vorbis
command-line audio player. For this, we will use only standard GStreamer
@@ -205,7 +205,7 @@ as follows:
![The "hello world" pipeline](images/hello-world.png "fig:")
-# Compiling and Running helloworld.c
+## Compiling and Running helloworld.c
To compile the helloworld example, use: `gcc -Wall
helloworld.c -o helloworld
@@ -227,7 +227,7 @@ $(pkg-config --cflags --libs gstreamer-1.0)`.
You can run this example application with `./helloworld
file.ogg`. Substitute `file.ogg` with your favourite Ogg/Vorbis file.
-# Conclusion
+## Conclusion
This concludes our first example. As you see, setting up a pipeline is
very low-level but powerful. You will see later in this manual how you
diff --git a/manual-index.md b/manual-index.md
index e5840a6..bcdc82d 100644
--- a/manual-index.md
+++ b/manual-index.md
@@ -5,7 +5,7 @@ short-description: Complete walkthrough for building an application using GStrea
# Application Development Manual
-# Foreword
+## Foreword
GStreamer is an extremely powerful and versatile framework for creating
streaming media applications. Many of the virtues of the GStreamer
@@ -20,9 +20,9 @@ effort going into helping you understand GStreamer concepts. Later
chapters will go into more advanced topics related to media playback,
but also at other forms of media processing (capture, editing, etc.).
-# Introduction
+## Introduction
-## Who should read this manual?
+### Who should read this manual?
This book is about GStreamer from an application developer's point of
view; it describes how to write a GStreamer application using the
@@ -32,7 +32,7 @@ we suggest the [Plugin Writers Guide](pwg-index.md).
Also check out the other documentation available on the [GStreamer web
site](http://gstreamer.freedesktop.org/documentation/).
-## Preliminary reading
+### Preliminary reading
In order to understand this manual, you need to have a basic
understanding of the *C language*.
@@ -57,7 +57,7 @@ Especially,
- glib main loop
-## Structure of this manual
+### Structure of this manual
To help you navigate through this guide, it is divided into several
large parts. Each part addresses a particular broad topic concerning
diff --git a/manual-init.md b/manual-init.md
index 7aa55aa..14a3b5c 100644
--- a/manual-init.md
+++ b/manual-init.md
@@ -8,7 +8,7 @@ When writing a GStreamer application, you can simply include `gst/gst.h`
to get access to the library functions. Besides that, you will also need
to initialize the GStreamer library.
-# Simple initialization
+## Simple initialization
Before the GStreamer libraries can be used, `gst_init` has to be called
from the main application. This call will perform the necessary
@@ -60,7 +60,7 @@ It is also possible to call the `gst_init` function with two `NULL`
arguments, in which case no command line options will be parsed by
GStreamer.
-# The GOption interface
+## The GOption interface
You can also use a GOption table to initialize your own parameters as
shown in the next example:
diff --git a/manual-interfaces.md b/manual-interfaces.md
index b79524f..75fb642 100644
--- a/manual-interfaces.md
+++ b/manual-interfaces.md
@@ -18,7 +18,7 @@ Most of the interfaces handled here will not contain any example code.
See the API references for details. Here, we will just describe the
scope and purpose of each interface.
-# The URI interface
+## The URI interface
In all examples so far, we have only supported local files through the
“filesrc” element. GStreamer, obviously, supports many more location
@@ -40,7 +40,7 @@ element.
You can convert filenames to and from URIs using GLib's
`g_filename_to_uri ()` and `g_uri_to_filename ()`.
-# The Color Balance interface
+## The Color Balance interface
The colorbalance interface is a way to control video-related properties
on an element, such as brightness, contrast and so on. It's sole reason
@@ -50,7 +50,7 @@ dynamically register properties using `GObject`.
The colorbalance interface is implemented by several plugins, including
xvimagesink and the Video4linux2 elements.
-# The Video Overlay interface
+## The Video Overlay interface
The Video Overlay interface was created to solve the problem of
embedding video streams in an application window. The application
diff --git a/manual-intgration.md b/manual-intgration.md
index 5812f85..b7960b3 100644
--- a/manual-intgration.md
+++ b/manual-intgration.md
@@ -10,7 +10,7 @@ environments (such as GNOME or KDE). In this chapter, we'll mention some
specific techniques to integrate your application with your operating
system or desktop environment of choice.
-# Linux and UNIX-like operating systems
+## Linux and UNIX-like operating systems
GStreamer provides a basic set of elements that are useful when
integrating with Linux or a UNIX-like operating system.
@@ -29,7 +29,7 @@ integrating with Linux or a UNIX-like operating system.
hardware-accelerated video), direct-framebuffer (dfbimagesink) and
openGL image contexts (glsink).
-# GNOME desktop
+## GNOME desktop
GStreamer has been the media backend of the
[GNOME](http://www.gnome.org/) desktop since GNOME-2.2 onwards.
@@ -131,7 +131,7 @@ integrate as closely as possible with the GNOME desktop:
deprecated GNOME-VFS system is supported too but shouldn't be used
for any new applications.
-# KDE desktop
+## KDE desktop
GStreamer has been proposed for inclusion in KDE-4.0. Currently,
GStreamer is included as an optional component, and it's used by several
@@ -146,12 +146,12 @@ probably grow as GStreamer starts to be used in KDE-4.0:
- AmaroK contains a kiosrc element, which is a source element that
integrates with the KDE VFS subsystem KIO.
-# OS X
+## OS X
GStreamer provides native video and audio output elements for OS X. It
builds using the standard development tools for OS X.
-# Windows
+## Windows
> **Warning**
>
@@ -165,7 +165,7 @@ builds using the standard development tools for OS X.
GStreamer builds using Microsoft Visual C .NET 2003 and using Cygwin.
-## Building GStreamer under Win32
+### Building GStreamer under Win32
There are different makefiles that can be used to build GStreamer with
the usual Microsoft compiling tools.
@@ -215,7 +215,7 @@ latest developments in this respect.
> separately on the net for convenience (people who don't want to
> install GNU tools).
-## Installation on the system
+### Installation on the system
FIXME: This section needs be updated for GStreamer-1.0.
diff --git a/manual-intro-basics.md b/manual-intro-basics.md
index bd8f139..0b69002 100644
--- a/manual-intro-basics.md
+++ b/manual-intro-basics.md
@@ -9,7 +9,7 @@ Understanding these concepts will be important in reading any of the
rest of this guide, as all of it assumes understanding of these basic
concepts.
-# Elements
+## Elements
An *element* is the most important class of objects in GStreamer. You
will usually create a chain of elements linked together and let data
@@ -23,7 +23,7 @@ development of a large variety of media applications possible. If
needed, you can also write new elements. That topic is explained in
great detail in the *GStreamer Plugin Writer's Guide*.
-# Pads
+## Pads
*Pads* are an element's inputs and outputs, where you can connect other
elements. They are used to negotiate links and data flow between
@@ -56,7 +56,7 @@ object) and events (described by the
[`GstEvent`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/gstreamer-GstEvent.html)
object).
-# Bins and pipelines
+## Bins and pipelines
A *bin* is a container for a collection of elements. Since bins are
subclasses of elements themselves, you can mostly control a bin as if it
@@ -75,7 +75,7 @@ you stop them or the end of the data stream is reached.
![GStreamer pipeline for a simple ogg player](images/simple-player.png
"fig:")
-# Communication
+## Communication
GStreamer provides several mechanisms for communication and data
exchange between the *application* and the *pipeline*.
diff --git a/manual-licensing.md b/manual-licensing.md
index ecaefde..1066fc6 100644
--- a/manual-licensing.md
+++ b/manual-licensing.md
@@ -4,7 +4,7 @@ title: Licensing advisory
# Licensing advisory
-# How to license the applications you build with GStreamer
+## How to license the applications you build with GStreamer
The licensing of GStreamer is no different from a lot of other libraries
out there like GTK+ or glibc: we use the LGPL. What complicates things
diff --git a/manual-metadata.md b/manual-metadata.md
index 946360e..aab66b8 100644
--- a/manual-metadata.md
+++ b/manual-metadata.md
@@ -14,7 +14,7 @@ video size, audio samplerate, codecs used and so on. Tags are handled
using the GStreamer tagging system. Stream-info can be retrieved from a
`GstPad` by getting the current (negotiated) `GstCaps` for that pad.
-# Metadata reading
+## Metadata reading
Stream information is most easily obtained by reading it from a
`GstPad`. This was already discussed in [Using capabilities
@@ -162,7 +162,7 @@ main (int argc, char ** argv)
```
-# Tag writing
+## Tag writing
Tag writing is done using the
[`GstTagSetter`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/GstTagSetter.html)
diff --git a/manual-motivation.md b/manual-motivation.md
index e92fbce..bc2a676 100644
--- a/manual-motivation.md
+++ b/manual-motivation.md
@@ -4,7 +4,7 @@ title: Design principles
# Design principles
-# Clean and powerful
+## Clean and powerful
GStreamer provides a clean interface to:
@@ -18,7 +18,7 @@ GStreamer provides a clean interface to:
and tracing mechanism has been integrated. GStreamer also comes with
an extensive set of real-life plugins that serve as examples too.
-# Object oriented
+## Object oriented
GStreamer adheres to GObject, the GLib 2.0 object model. A programmer
familiar with GLib 2.0 or GTK+ will be comfortable with GStreamer.
@@ -32,7 +32,7 @@ GStreamer intends to be similar in programming methodology to GTK+. This
applies to the object model, ownership of objects, reference counting,
etc.
-# Extensible
+## Extensible
All GStreamer Objects can be extended using the GObject inheritance
methods.
@@ -40,7 +40,7 @@ methods.
All plugins are loaded dynamically and can be extended and upgraded
independently.
-# Allow binary-only plugins
+## Allow binary-only plugins
Plugins are shared libraries that are loaded at runtime. Since all the
properties of the plugin can be set using the GObject properties, there
@@ -50,7 +50,7 @@ the plugins.
Special care has been taken to make plugins completely self-contained.
All relevant aspects of plugins can be queried at run-time.
-# High performance
+## High performance
High performance is obtained by:
@@ -76,7 +76,7 @@ High performance is obtained by:
that the plugin loading can be delayed until the plugin is actually
used.
-# Clean core/plugins separation
+## Clean core/plugins separation
The core of GStreamer is essentially media-agnostic. It only knows about
bytes and blocks, and only contains basic elements. The core of
@@ -86,7 +86,7 @@ like cp.
All of the media handling functionality is provided by plugins external
to the core. These tell the core how to handle specific types of media.
-# Provide a framework for codec experimentation
+## Provide a framework for codec experimentation
GStreamer also wants to be an easy framework where codec developers can
experiment with different algorithms, speeding up the development of
diff --git a/manual-pads.md b/manual-pads.md
index e4b49c9..401b81a 100644
--- a/manual-pads.md
+++ b/manual-pads.md
@@ -11,7 +11,7 @@ media that the element can handle will be exposed by the pad's
capabilities. We will talk more on capabilities later in this chapter
(see [Capabilities of a pad](#capabilities-of-a-pad)).
-# Pads
+## Pads
A pad type is defined by two properties: its direction and its
availability. As we've mentioned before, GStreamer defines two pad
exist, sometimes pads exist only in certain cases (and can disappear
randomly), and on-request pads appear only if explicitly requested by
applications.
-## Dynamic (or sometimes) pads
+### Dynamic (or sometimes) pads
Some elements might not have all of their pads when the element is
created. This can happen, for example, with an Ogg demuxer element. The
@@ -109,7 +109,7 @@ It is not uncommon to add elements to the pipeline only from within the
the newly-added elements to the target state of the pipeline using
`gst_element_set_state ()` or `gst_element_sync_state_with_parent ()`.
-## Request pads
+### Request pads
An element can also have request pads. These pads are not created
automatically but are only created on demand. This is very useful for
@@ -136,7 +136,7 @@ an Ogg multiplexer from any input.
{{ examples/snippets.c#link_to_multiplexer }}
-# Capabilities of a pad
+## Capabilities of a pad
Since the pads play a very important role in how the element is viewed
by the outside world, a mechanism is implemented to describe the data
@@ -155,7 +155,7 @@ which case the pad is not yet negotiated, or it is the type of media
that currently streams over this pad, in which case the pad has been
negotiated already.
-## Dissecting capabilities
+### Dissecting capabilities
A pad's capabilities are described in a `GstCaps` object. Internally, a
[`GstCaps`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/gstreamer-GstCaps.html)
@@ -195,7 +195,7 @@ Pad Templates:
```
-## Properties and values
+### Properties and values
Properties are used to describe extra information for capabilities. A
property consists of a key (a string) and a value. There are different
@@ -261,7 +261,7 @@ possible value types that can be used:
Unlike a `GST_TYPE_LIST`, the values in an array will be interpreted
as a whole.
-# What capabilities are used for
+## What capabilities are used for
Capabilities (short: caps) describe the type of data that is streamed
between two pads, or that one pad (template) supports. This makes them
@@ -295,7 +295,7 @@ very useful for various purposes:
to convert data to a specific output format at a certain point in a
stream.
-## Using capabilities for metadata
+### Using capabilities for metadata
A pad can have a set (i.e. one or more) of capabilities attached to it.
Capabilities (`GstCaps`) are represented as an array of one or more
@@ -347,7 +347,7 @@ read_video_props (GstCaps *caps)
```
-## Creating capabilities for filtering
+### Creating capabilities for filtering
While capabilities are mainly used inside a plugin to describe the media
type of the pads, the application programmer often also has to have
@@ -436,7 +436,7 @@ See the API references for the full API of
and
[`GstCaps`](http://gstreamer.freedesktop.org/data/doc/gstreamer/stable/gstreamer/html/gstreamer-GstCaps.html).
-# Ghost pads
+## Ghost pads
You can see from [Visualisation of a GstBin element without ghost
pads](#visualisation-of-a-gstbin-------element-without-ghost-pads) how a
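
As a small illustration of the property types listed under “Properties
and values” above, here is what a fixed caps string and a template caps
string using ranges and lists might look like (the media type and values
are invented for the example):

```
audio/x-raw, format=(string)S16LE, rate=(int)44100, channels=(int)2
audio/x-raw, rate=(int)[ 8000, 96000 ], channels=(int){ 1, 2 }
```

The first line is fully fixed and could describe negotiated caps; the
second uses a `GST_TYPE_INT_RANGE` and a `GST_TYPE_LIST` and could only
appear in a pad template or unnegotiated caps.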
diff --git a/manual-playback-components.md b/manual-playback-components.md
index 8403794..e19cba8 100644
--- a/manual-playback-components.md
+++ b/manual-playback-components.md
@@ -21,7 +21,7 @@ advanced features, such as playlist support, crossfading of audio tracks
and so on. Its programming interface is more low-level than that of
playbin, though.
-# Playbin
+## Playbin
Playbin is an element that can be created using the standard GStreamer
API (e.g. `gst_element_factory_make ()`). The factory is conveniently
@@ -109,7 +109,7 @@ Playbin has several features that have been discussed previously:
For convenience, it is possible to test “playbin” on the commandline,
using the command “gst-launch-1.0 playbin uri=file:///path/to/file”.
-# Decodebin
+## Decodebin
Decodebin is the actual autoplugger backend of playbin, which was
discussed in the previous section. Decodebin will, in short, accept
@@ -246,7 +246,7 @@ Decodebin can be easily tested on the commandline, e.g. by using the
command `gst-launch-1.0 filesrc location=file.ogg ! decodebin
! audioconvert ! audioresample ! autoaudiosink`.
-# URIDecodebin
+## URIDecodebin
The uridecodebin element is very similar to decodebin, except that it
automatically plugs a source plugin based on the protocol of the URI
@@ -279,7 +279,7 @@ URIDecodebin can be easily tested on the commandline, e.g. by using the
command `gst-launch-1.0 uridecodebin uri=file:///file.ogg !
audioconvert ! audioresample ! autoaudiosink`.
-# Playsink
+## Playsink
The playsink element is a powerful sink element. It has request pads for
raw decoded audio, video and text and it will configure itself to play
diff --git a/manual-porting-1.0.md b/manual-porting-1.0.md
index bab0ed4..58710b7 100644
--- a/manual-porting-1.0.md
+++ b/manual-porting-1.0.md
@@ -13,7 +13,7 @@ document.
It should be possible to port simple applications to GStreamer-1.0 in
less than a day.
-# List of changes
+## List of changes
- All deprecated methods were removed. Recompile against 0.10 with
GST\_DISABLE\_DEPRECATED defined (such as by adding
diff --git a/manual-porting.md b/manual-porting.md
index 3fc4ae8..2371464 100644
--- a/manual-porting.md
+++ b/manual-porting.md
@@ -11,7 +11,7 @@ the relevant sections in this Application Development Manual where
needed. With this list, it should be possible to port simple
applications to GStreamer-0.10 in less than a day.
-# List of changes
+## List of changes
- Most functions returning an object or an object property have been
changed to return their own reference rather than a constant reference
diff --git a/manual-programs.md b/manual-programs.md
index 83fc7b9..d33da3a 100644
--- a/manual-programs.md
+++ b/manual-programs.md
@@ -4,7 +4,7 @@ title: Programs
# Programs
-# `gst-launch`
+## `gst-launch`
This is a tool that will construct pipelines based on a command-line
syntax.
@@ -106,13 +106,13 @@ main (int argc, char *argv[])
Note how we can retrieve the filesrc element from the constructed bin
using the element name.
-## Grammar Reference
+### Grammar Reference
The `gst-launch` syntax is processed by a flex/bison parser. This
section is intended to provide a full specification of the grammar; any
deviations from this specification are considered a bug.
-### Elements
+#### Elements
```
... mad ...
@@ -124,7 +124,7 @@ letters, numbers, dashes, underscores, percent signs, or colons) will
create an element from a given element factory. In this example, an
instance of the "mad" MP3 decoding plugin will be created.
-### Links
+#### Links
```
... !sink ...
@@ -140,7 +140,7 @@ constructed will be chosen. An attempt will be made to find compatible
pads. Pad names may be preceded by an element name, as in
`my_element_name.sink_pad`.
-### Properties
+#### Properties
```
... location="http://gstreamer.net" ...
guaranteed to work; it relies on the g\_value\_convert routines. No
error message will be displayed on an invalid conversion, due to
limitations in the value convert API.
-### Bins, Threads, and Pipelines
+#### Bins, Threads, and Pipelines
```
( ... )
@@ -175,7 +175,7 @@ curly braces make threads. The default toplevel bin type is a pipeline,
although putting the whole description within parentheses or braces can
override this default.
-# `gst-inspect`
+## `gst-inspect`
This is a tool to query a plugin or an element about its properties.
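
To tie the grammar pieces above together (element creation, links,
properties, named elements and pad references), here are two
hypothetical pipeline descriptions; the file name is made up for the
example:

```
gst-launch-1.0 filesrc location=music.mp3 ! mad ! audioconvert ! autoaudiosink
gst-launch-1.0 audiotestsrc name=src  src. ! autoaudiosink
```

The second line shows the named-element syntax: `name=src` names the
element, and `src.` later refers back to it as a link source.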
diff --git a/manual-queryevents.md b/manual-queryevents.md
index ce77dc7..8dbabc3 100644
--- a/manual-queryevents.md
+++ b/manual-queryevents.md
@@ -14,7 +14,7 @@ task is done. GStreamer has built-in support for doing all this using a
concept known as *querying*. Since seeking is very similar, it will be
discussed here as well. Seeking is done using the concept of *events*.
-# Querying: getting the position or length of a stream
+## Querying: getting the position or length of a stream
Querying is defined as requesting a specific stream property related to
progress tracking. This includes getting the length of a stream (if
@@ -72,7 +72,7 @@ main (gint argc,
```
-# Events: seeking (and more)
+## Events: seeking (and more)
Events work in a very similar way to queries. Dispatching, for example,
works exactly the same for events (and also has the same limitations),
diff --git a/manual-threads.md b/manual-threads.md
index 54cd7af..a123102 100644
--- a/manual-threads.md
+++ b/manual-threads.md
@@ -17,7 +17,7 @@ configure things such as the thread priority or the threadpool to use.
See [Configuring Threads in
GStreamer](#configuring-threads-in-gstreamer).
-# Scheduling in GStreamer
+## Scheduling in GStreamer
Each element in the GStreamer pipeline decides how it is going to be
scheduled. Elements can choose if their pads are to be scheduled
@@ -34,7 +34,7 @@ threads, or `GstTask` objects, are created from a `GstTaskPool` when the
element needs to make a streaming thread. In the next section we see how
we can receive notifications of the tasks and pools.
-# Configuring Threads in GStreamer
+## Configuring Threads in GStreamer
A STREAM\_STATUS message is posted on the bus to inform you about the
status of the streaming threads. You will get the following information
@@ -61,7 +61,7 @@ from the message:
We will now look at some examples in the next sections.
-## Boost priority of a thread
+### Boost priority of a thread
```
.----------. .----------.
@@ -335,7 +335,7 @@ message, which is likely the pad or the element that starts the thread,
to figure out what the function of this thread is in the context of the
application.
-# When would you want to force a thread?
+## When would you want to force a thread?
We have seen that threads are created by elements but it is also
possible to insert elements in the pipeline for the sole purpose of
diff --git a/pwg-advanced-clock.md b/pwg-advanced-clock.md
index f0a1dab..d03c699 100644
--- a/pwg-advanced-clock.md
+++ b/pwg-advanced-clock.md
@@ -8,7 +8,7 @@ When playing complex media, each sound and video sample must be played
in a specific order at a specific time. For this purpose, GStreamer
provides a synchronization mechanism.
-# Clocks
+## Clocks
Time in GStreamer is defined as the value returned from a particular
`GstClock` object from the method `gst_clock_get_time ()`.
@@ -24,7 +24,7 @@ As clocks return an absolute measure of time, they are not usually used
directly. Instead, differences between two clock times are used to
measure elapsed time according to a clock.
-# Clock running-time
+## Clock running-time
A clock returns the **absolute-time** according to that clock with
`gst_clock_get_time ()`. From the absolute-time is a **running-time**
@@ -45,7 +45,7 @@ Because all objects in the pipeline have the same clock and base-time,
they can thus all calculate the running-time according to the pipeline
clock.
-# Buffer running-time
+## Buffer running-time
To calculate a buffer running-time, we need a buffer timestamp and the
SEGMENT event that preceded the buffer. First we can convert the SEGMENT
running-time. Usually this task is done by sink elements. Sinks also have
to take into account the latency configured in the pipeline and add this
to the buffer running-time before synchronizing to the pipeline clock.
-# Obligations of each element.
+## Obligations of each element
Let us clarify the contract between GStreamer and each element in the
pipeline.
-## Non-live source elements
+### Non-live source elements
Non-live source elements must place a timestamp in each buffer that they
deliver when this is possible. They must choose the timestamps and the
@@ -78,7 +78,7 @@ buffers. It can and must however create a timestamp on the first buffer
The source then pushes out the SEGMENT event followed by the timestamped
buffers.
-## Live source elements
+### Live source elements
Live source elements must place a timestamp in each buffer that they
deliver. They must choose the timestamps and the values of the SEGMENT
@@ -86,13 +86,13 @@ event in such a way that the running-time of the buffer matches exactly
the running-time of the pipeline clock when the first byte in the buffer
was captured.
-## Parser/Decoder/Encoder elements
+### Parser/Decoder/Encoder elements
Parser/Decoder elements must use the incoming timestamps and transfer
those to the resulting output buffers. They are allowed to interpolate
or reconstruct timestamps on missing input buffers when they can.
-## Demuxer elements
+### Demuxer elements
Demuxer elements can usually set the timestamps stored inside the media
file onto the outgoing buffers. They need to make sure that outgoing
@@ -101,13 +101,13 @@ running-time. Demuxers also need to take into account the incoming
timestamps on buffers and use that to calculate an offset on the
outgoing buffer timestamps.
-## Muxer elements
+### Muxer elements
Muxer elements should use the incoming buffer running-time to mux the
different streams together. They should copy the incoming running-time
to the outgoing buffers.
-## Sink elements
+### Sink elements
If the element is intended to emit samples at a specific time (real time
playing), the element should require a clock, and thus implement the
diff --git a/pwg-advanced-events.md b/pwg-advanced-events.md
index ea5666c..9f1d528 100644
--- a/pwg-advanced-events.md
+++ b/pwg-advanced-events.md
pipeline is not handling them correctly, the whole event system of the
pipeline is broken. We will try to explain here how these methods work
and how elements are supposed to implement them.
-# Downstream events
+## Downstream events
Downstream events are received through the sink pad's event handler, as
set using `gst_pad_set_event_function ()` when the pad was created.
@@ -76,7 +76,7 @@ decoders with an id3demux or apedemux element in front of them, or
demuxers that are being fed input from sources that send additional
information about the stream in custom events, as DVD sources do).
-# Upstream events
+## Upstream events
Upstream events are generated by an element somewhere downstream in the
pipeline (example: a video sink may generate navigation events that
@@ -127,7 +127,7 @@ handling. Here they are :
thread than the streaming thread, so make sure you use appropriate
locking everywhere.
-# All Events Together
+## All Events Together
What follows is a list of all defined events that are currently
being used, plus how they should be used/interpreted. You can check the
@@ -165,17 +165,17 @@ For more comprehensive information about events and how they should be
used correctly in various circumstances please consult the GStreamer
design documentation. This section only gives a general overview.
-## Stream Start
+### Stream Start
WRITEME
-## Caps
+### Caps
The CAPS event contains the format description of the following buffers.
See [Caps negotiation](pwg-negotiation.md) for more information
about negotiation.
-## Segment
+### Segment
A segment event is sent downstream to announce the range of valid
timestamps in the stream and how they should be transformed into
@@ -209,7 +209,7 @@ extract the event details. Elements may find the GstSegment API useful
to keep track of the current segment (if they want to use it for output
clipping, for example).
-## Tag (metadata)
+### Tag (metadata)
Tagging events are sent downstream to indicate the tags as parsed
from the stream data. This is currently used to preserve tags during
@@ -227,7 +227,7 @@ ownership of.
Elements parsing this event can use the function `gst_event_parse_tag
()` to acquire the taglist that the event contains.
-## End of Stream (EOS)
+### End of Stream (EOS)
End-of-stream events are sent if the stream that an element sends out is
finished. An element receiving this event (from upstream, so it receives
@@ -256,15 +256,15 @@ not need to ever manually send an EOS event, you should also just return
GST\_FLOW\_EOS in your create or fill function (assuming your element
derives from GstBaseSrc or GstPushSrc).
-## Table Of Contents
+### Table Of Contents
WRITEME
-## Gap
+### Gap
WRITEME
-## Flush Start
+### Flush Start
The flush start event is sent downstream (in push mode) or upstream (in
pull mode) if all buffers and caches in the pipeline should be emptied.
@@ -287,7 +287,7 @@ The flush-start event is created with the `gst_event_new_flush_start
only created by elements driving the pipeline, like source elements
operating in push-mode or pull-range based demuxers/decoders.
-## Flush Stop
+### Flush Stop
The flush-stop event is sent by an element driving the pipeline after a
flush-start and tells pads and elements downstream that they should
@@ -303,13 +303,13 @@ has one parameter that controls if the running-time of the pipeline
should be reset to 0 or not. Normally after a flushing seek, the
running\_time is set back to 0.
-## Quality Of Service (QOS)
+### Quality Of Service (QOS)
The QOS event contains a report about the current real-time performance
of the stream. See more info in [Quality Of Service
(QoS)](pwg-advanced-qos.md).
-## Seek Request
+### Seek Request
Seek events are meant to request a new stream position to elements. This
new position can be set in several formats (time, bytes or “default
@@ -335,7 +335,7 @@ it should operate based on the SEGMENT events it receives.
Elements parsing this event can do this using `gst_event_parse_seek()`.
-## Navigation
+### Navigation
Navigation events are sent upstream by video sinks to inform upstream
elements of where the mouse pointer is, if and where mouse pointer
diff --git a/pwg-advanced-interfaces.md b/pwg-advanced-interfaces.md
index 76711ba..55a1111 100644
--- a/pwg-advanced-interfaces.md
+++ b/pwg-advanced-interfaces.md
@@ -42,7 +42,7 @@ reasons for this. First of all, properties can be more easily
introspected. Second, properties can be specified on the commandline
(`gst-launch`).
-# How to Implement Interfaces
+## How to Implement Interfaces
Implementing interfaces is initiated in the `_get_type ()` of your
element. You can register one or more interfaces after having registered
@@ -114,15 +114,15 @@ G_DEFINE_TYPE_WITH_CODE (GstMyFilter, gst_my_filter,GST_TYPE_ELEMENT,
```
-# URI interface
+## URI interface
WRITEME
-# Color Balance Interface
+## Color Balance Interface
WRITEME
-# Video Overlay Interface
+## Video Overlay Interface
The \#GstVideoOverlay interface is used for two main purposes:
@@ -212,7 +212,7 @@ gst_my_filter_sink_set_caps (GstMyFilter *my_filter, GstCaps *caps)
```
-# Navigation Interface
+## Navigation Interface
WRITEME
diff --git a/pwg-advanced-qos.md b/pwg-advanced-qos.md
index 963e138..3c2e068 100644
--- a/pwg-advanced-qos.md
+++ b/pwg-advanced-qos.md
@@ -32,7 +32,7 @@ It is also possible for the application to artificially introduce delay
between synchronized buffers; this is called throttling. It can be used
to limit or reduce the framerate, for example.
-# Measuring QoS
+## Measuring QoS
Elements that synchronize buffers on the pipeline clock will usually
measure the current QoS. They will also need to keep some statistics in
@@ -63,7 +63,7 @@ These measurements are used to construct a QOS event that is sent
upstream. Note that a QoS event is sent for each buffer that arrives in
the sink.
-# Handling QoS
+## Handling QoS
An element will have to install an event function on its source pads in
order to receive QOS events. Usually, the element will need to store the
@@ -102,7 +102,7 @@ the example below. Also make sure to pass the QoS event upstream.
With the QoS values, there are two types of corrections that an element
can do:
-## Short term correction
+### Short term correction
The timestamp and the jitter value in the QOS event can be used to
perform a short term correction. If the jitter is positive, the previous
@@ -145,7 +145,7 @@ A possible algorithm typically looks like this:
```
-## Long term correction
+### Long term correction
Long term corrections are a bit more difficult to perform. They rely on
the value of the proportion in the QOS event. Elements should reduce the
@@ -171,7 +171,7 @@ In all cases, elements should be prepared to go back to their normal
processing rate when the proportion member in the QOS event approaches
the ideal proportion of 1.0 again.
-# Throttling
+## Throttling
Elements synchronizing to the clock should expose a property to
configure them in throttle mode. In throttle mode, the time distance
The default sink base class has the “throttle-time” property for this
feature. You can test this with: `gst-launch-1.0 videotestsrc !
xvimagesink throttle-time=500000000`
-# QoS Messages
+## QoS Messages
In addition to the QOS events that are sent between elements in the
pipeline, there are also QOS messages posted on the pipeline bus to
diff --git a/pwg-advanced-request.md b/pwg-advanced-request.md
index d5781dd..e383629 100644
--- a/pwg-advanced-request.md
+++ b/pwg-advanced-request.md
@@ -12,7 +12,7 @@ pad (always, sometimes or request) can be seen in a pad's template. This
chapter will discuss when each of the two is useful, how they are
created and when they should be disposed.
-# Sometimes pads
+## Sometimes pads
A “sometimes” pad is a pad that is created under certain conditions, but
not in all cases. This mostly depends on stream content: demuxers will
@@ -201,7 +201,7 @@ used maliciously to cause undefined behaviour in the plugin, which might
lead to security issues. *Always* assume that the file could be used to
do bad things.
-# Request pads
+## Request pads
“Request” pads are similar to sometimes pads, except that request pads
are created on demand by something outside the element rather than
diff --git a/pwg-advanced-tagging.md b/pwg-advanced-tagging.md
index 6d9b03e..121e0e4 100644
--- a/pwg-advanced-tagging.md
+++ b/pwg-advanced-tagging.md
@@ -4,7 +4,7 @@ title: Tagging (Metadata and Streaminfo)
# Tagging (Metadata and Streaminfo)
-# Overview
+## Overview
Tags are pieces of information stored in a stream that are not the
content itself, but rather *describe* it. Most media
@@ -46,7 +46,7 @@ supporting both can be used in a tag editor for quick tag changing
writing and usually requires tag extraction/stripping and remuxing of
the stream with new tags).
-# Reading Tags from Streams
+## Reading Tags from Streams
The basic object for tags is a [`GstTagList
`](../../gstreamer/html/GstTagList.html). An element that is reading
@@ -88,7 +88,7 @@ gst_my_filter_class_init (GstMyFilterClass *klass)
```
-# Writing Tags to Streams
+## Writing Tags to Streams
Tag writers are the opposite of tag readers. Tag writers only take
metadata tags into account, since that's the only type of tags that have
diff --git a/pwg-allocation.md b/pwg-allocation.md
index 6ba5380..b59b786 100644
--- a/pwg-allocation.md
+++ b/pwg-allocation.md
@@ -25,7 +25,7 @@ For efficiently managing buffers of the same size, we take a look at
GST\_QUERY\_ALLOCATION query that is used to negotiate memory management
options between elements.
-# GstMemory
+## GstMemory
`GstMemory` is an object that manages a region of memory. The memory
object points to a region of memory of “maxsize”. The area in this
region in the memory. The maxsize of the memory can never be changed
after the object is created, however, the offset and size can be
changed.
-## GstAllocator
+### GstAllocator
`GstMemory` objects are created by a `GstAllocator` object. Most
allocators implement the default `gst_allocator_alloc()` method but some
@@ -46,7 +46,7 @@ memory and memory backed by a DMAbuf file descriptor. To implement
support for a new kind of memory type, you must implement a new
allocator object as shown below.
-## GstMemory API example
+### GstMemory API example
Data access to the memory wrapped by the `GstMemory` object is always
protected with a `gst_memory_map()` and `gst_memory_unmap()` pair. An
@@ -83,11 +83,11 @@ Below is an example of making a `GstMemory` object and using the
```
-## Implementing a GstAllocator
+### Implementing a GstAllocator
WRITEME
-# GstBuffer
+## GstBuffer
A `GstBuffer` is a lightweight object that is passed from an upstream
to a downstream element and contains memory and metadata. It represents
@@ -112,7 +112,7 @@ Metadata in the buffer consists of:
- Arbitrary structures via `GstMeta`, see below.
-## GstBuffer writability
+### GstBuffer writability
A buffer is writable when the refcount of the object is exactly 1,
meaning that only one object is holding a ref to the buffer. You can
@@ -120,7 +120,7 @@ only modify anything in the buffer when the buffer is writable. This
means that you need to call `gst_buffer_make_writable()` before changing
the timestamps, offsets, metadata or adding and removing memory blocks.
-## GstBuffer API examples
+### GstBuffer API examples
You can create a buffer with `gst_buffer_new ()` and then add memory
objects to it or you can use a convenience function
@@ -168,7 +168,7 @@ Below is an example of how to create a buffer and access its memory.
```
-# GstMeta
+## GstMeta
With the `GstMeta` system you can add arbitrary structures on buffers.
These structures describe extra properties of the buffer such as
@@ -179,7 +179,7 @@ its API look like) and the implementation (how it works). This makes it
possible to make different implementations of the same API, for example,
depending on the hardware you are running on.
-## GstMeta API example
+### GstMeta API example
After allocating a new buffer, you can add metadata to the buffer with
the metadata specific API. This means that you will need to link to the
@@ -241,12 +241,12 @@ frame like this:
```
-## Implementing new GstMeta
+### Implementing new GstMeta
In the next sections we show how you can add new metadata to the system
and use it on buffers.
-### Define the metadata API
+#### Define the metadata API
First we need to define what our API will look like and we will have to
register this API with the system. This is important because this API
@@ -319,7 +319,7 @@ As you can see, it simply uses the `gst_meta_api_type_register ()`
function to register a name for the api and some tags. The result is a
new pointer GType that defines the newly registered API.
-### Implementing a metadata API
+#### Implementing a metadata API
Next we can make an implementation for a registered metadata API GType.
The implementation details of a metadata API are kept in a `GstMetaInfo`
@@ -431,7 +431,7 @@ buffer.
Lastly, you implement a `gst_buffer_add_*_meta()` that adds the metadata
implementation to a buffer and sets the values of the metadata.
-# GstBufferPool
+## GstBufferPool
The `GstBufferPool` object provides a convenient base class for managing
lists of reusable buffers. Essential for this object is that all the
@@ -453,7 +453,7 @@ the pool.
In the following sections we take a look at how you can use a
bufferpool.
-## GstBufferPool API example
+### GstBufferPool API example
Many different bufferpool implementations can exist; they are all
subclasses of the base class `GstBufferPool`. For this example, we will
@@ -554,11 +554,11 @@ buffer, GStreamer will automatically call
pool. You (or any other downstream element) don't need to know if a
buffer came from a pool; you can just unref it.
-## Implementing a new GstBufferPool
+### Implementing a new GstBufferPool
WRITEME
-# GST\_QUERY\_ALLOCATION
+## GST\_QUERY\_ALLOCATION
The ALLOCATION query is used to negotiate `GstMeta`, `GstBufferPool` and
`GstAllocator` between elements. Negotiation of the allocation strategy
@@ -589,7 +589,7 @@ When the GST\_QUERY\_ALLOCATION returns, the source pad will select from
the available bufferpools, allocators and metadata how it will allocate
buffers.
-## ALLOCATION query example
+### ALLOCATION query example
Below is an example of the ALLOCATION query.
@@ -648,7 +648,7 @@ enable the pool to put `GstVideoMeta` metadata on the buffers from the
pool by calling `gst_buffer_pool_config_add_option (config,
GST_BUFFER_POOL_OPTION_VIDEO_META)`.
-## The ALLOCATION query in base classes
+### The ALLOCATION query in base classes
In many baseclasses you will see the following virtual methods for
influencing the allocation strategy:
diff --git a/pwg-building-boiler.md b/pwg-building-boiler.md
index 85186b5..808c5db 100644
--- a/pwg-building-boiler.md
+++ b/pwg-building-boiler.md
@@ -12,7 +12,7 @@ you follow the examples here, then by the end of this chapter you will
have a functional audio filter plugin that you can compile and use in
GStreamer applications.
-# Getting the GStreamer Plugin Templates
+## Getting the GStreamer Plugin Templates
There are currently two ways to develop a new plugin for GStreamer: You
can write the entire plugin by hand, or you can copy an existing plugin
@@ -49,7 +49,7 @@ If for some reason you can't access the git repository, you can also
revision](http://cgit.freedesktop.org/gstreamer/gst-template/commit/)
via the cgit web interface.
-# Using the Project Stamp
+## Using the Project Stamp
The first thing to do when making a new element is to specify some basic
details about it: what its name is, who wrote it, what version number it
@@ -116,7 +116,7 @@ the well known `make && sudo make install` commands.
> creating elements the tool gst-element-maker from gst-plugins-bad is
> recommended these days.
-# Examining the Basic Code
+## Examining the Basic Code
First we will examine the code you would be likely to place in a header
file (although since the interface to the code is entirely defined by
@@ -170,7 +170,7 @@ G_DEFINE_TYPE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT);
```
-# Element metadata
+## Element metadata
The Element metadata provides extra element information. It is
configured with `gst_element_class_set_metadata` or
@@ -220,7 +220,7 @@ gst_my_filter_class_init (GstMyFilterClass * klass)
```
-# GstStaticPadTemplate
+## GstStaticPadTemplate
A GstStaticPadTemplate is a description of a pad that the element will
(or might) create and use. It contains:
@@ -312,7 +312,7 @@ of types are supported too, and should be separated by a semicolon
to know the exact format of a stream: [Specifying the
pads](pwg-building-pads.md).
-# Constructor Functions
+## Constructor Functions
Each element has two functions which are used for construction of an
element. The `_class_init()` function, which is used to initialise the
@@ -320,7 +320,7 @@ class only once (specifying what signals, arguments and virtual
functions the class has and setting up global state); and the `_init()`
function, which is used to initialise a specific instance of this type.
-# The plugin\_init function
+## The plugin\_init function
Once we have written code defining all the parts of the plugin, we need
to write the plugin\_init() function. This is a special function, which
diff --git a/pwg-building-types.md b/pwg-building-types.md
index 9fbdbda..6d6a86e 100644
--- a/pwg-building-types.md
+++ b/pwg-building-types.md
@@ -37,7 +37,7 @@ For now, the policy is simple:
and get it added to the list of known types so that other developers
can use the type correctly when writing their elements.
-# Building a Simple Format for Testing
+## Building a Simple Format for Testing
If you need a new format that has not yet been defined in our [List of
Defined Types](#list-of-defined-types), you will want to have some
@@ -58,7 +58,7 @@ Make sure that your property names do not clash with similar properties
used in other types. If they match, make sure they mean the same thing;
properties with different types but the same names are *not* allowed.
-# Typefind Functions and Autoplugging
+## Typefind Functions and Autoplugging
Merely *defining* the types is not enough, though. In order for a
random data file to be recognized and played back as such, we need a way
@@ -116,7 +116,7 @@ functions.
Autoplugging has been discussed in great detail in the Application
Development Manual.
-# List of Defined Types
+## List of Defined Types
Below is a list of all the defined types in GStreamer. They are split up
into separate tables for audio, video, container, subtitle and other
diff --git a/pwg-checklist-element.md b/pwg-checklist-element.md
index df9bf91..dd0ef82 100644
--- a/pwg-checklist-element.md
+++ b/pwg-checklist-element.md
@@ -11,7 +11,7 @@ element and hope for it to be included in the mainstream GStreamer
distribution, it *has to* meet those requirements. As far as possible,
we will try to explain why those requirements are set.
-# About states
+## About states
- Make sure the state of an element gets reset when going to `NULL`.
Ideally, this should set all object properties to their original
@@ -29,7 +29,7 @@ we will try to explain why those requirements are set.
tools such as `valgrind`. Elements have to be reusable in a pipeline
after having been reset.
-# Debugging
+## Debugging
- Elements should *never* use their standard output for debugging
(using functions such as `printf
@@ -83,7 +83,7 @@ we will try to explain why those requirements are set.
it should be GstFooDec and gst\_foo\_dec, and not GstFoodec and
gst\_foodec.
-# Querying, events and the like
+## Querying, events and the like
- All elements to which it applies (sources, sinks, demuxers) should
implement query functions on their pads, so that applications and
@@ -99,7 +99,7 @@ we will try to explain why those requirements are set.
with gst\_pad\_query\_default (pad, parent, query) instead of just
dropping them.
-# Testing your element
+## Testing your element
- `gst-launch` is *not* a good tool to show that your element is
finished. Applications such as Rhythmbox and Totem (for GNOME) or
diff --git a/pwg-dparams.md b/pwg-dparams.md
index 61bfece..f3412d0 100644
--- a/pwg-dparams.md
+++ b/pwg-dparams.md
@@ -12,7 +12,7 @@ case you can mark these parameters as being Controllable. Aware
applications can use the controller subsystem to dynamically adjust the
property values over time.
-# Getting Started
+## Getting Started
The controller subsystem is contained within the `gstcontroller`
library. You need to include the header in your element's source file:
@@ -54,7 +54,7 @@ GObject params in the `_class_init` method.
```
-# The Data Processing Loop
+## The Data Processing Loop
In the last section we learned how to mark GObject params as
controllable. Application developers can then queue parameter changes
This call makes all parameter changes for the given timestamp active by
adjusting the GObject properties of the element. It's up to the element
to determine the synchronisation rate.
-## The Data Processing Loop for Video Elements
+### The Data Processing Loop for Video Elements
For video processing elements it is best to synchronise on every
frame. That means one would add the `gst_object_sync_values()` call
described in the previous section to the data processing function of the
element.
-## The Data Processing Loop for Audio Elements
+### The Data Processing Loop for Audio Elements
For audio processing elements the case is not as easy as for video
processing elements. The problem here is that audio has a much higher
diff --git a/pwg-intro-basics.md b/pwg-intro-basics.md
index 066ebe6..311389a 100644
--- a/pwg-intro-basics.md
+++ b/pwg-intro-basics.md
@@ -10,7 +10,7 @@ extending GStreamer. Many of these concepts are explained in greater
detail in the *GStreamer Application Development Manual*; the basic
concepts presented here serve mainly to refresh your memory.
-# Elements and Plugins
+## Elements and Plugins
Elements are at the core of GStreamer. In the context of plugin
development, an *element* is an object derived from the [`
@@ -51,7 +51,7 @@ See the *GStreamer Library Reference* for the current implementation
details of [`GstElement`](../../gstreamer/html/GstElement.html) and
[`GstPlugin`](../../gstreamer/html/GstPlugin.html).
-# Pads
+## Pads
*Pads* are used to negotiate links and data flow between elements in
GStreamer. A pad can be viewed as a “place” or “port” on an element
@@ -80,7 +80,7 @@ respectively.
See the *GStreamer Library Reference* for the current implementation
details of a [`GstPad`](../../gstreamer/html/GstPad.html).
-# GstMiniObject, Buffers and Events
+## GstMiniObject, Buffers and Events
All streams of data in GStreamer are chopped up into chunks that are
passed from a source pad on one element to a sink pad on another
@@ -136,7 +136,7 @@ details of a
[`GstBuffer`](../../gstreamer/html/GstBuffer.html) and
[`GstEvent`](../../gstreamer/html/GstEvent.html).
-## Buffer Allocation
+### Buffer Allocation
Buffers are able to store chunks of memory of several different types.
The most generic type of buffer contains memory allocated by malloc().
@@ -175,7 +175,7 @@ framework can choose the fastest algorithm as appropriate. Naturally,
this only makes sense for strict filters -- elements that have exactly
the same format on source and sink pads.
-# Media types and Properties
+## Media types and Properties
GStreamer uses a type system to ensure that the data passed between
elements is in a recognized format. The type system is also important
@@ -185,7 +185,7 @@ made between elements has a specified type and optionally a set of
properties. See more about caps negotiation in [Caps
negotiation](pwg-negotiation.md).
-## The Basic Types
+### The Basic Types
GStreamer already supports many basic media types. Following is a table
of a few of the basic types used for buffers in GStreamer. The table
diff --git a/pwg-intro-preface.md b/pwg-intro-preface.md
index 031ed7d..7af3117 100644
--- a/pwg-intro-preface.md
+++ b/pwg-intro-preface.md
@@ -4,7 +4,7 @@ title: Preface
# Preface
-# What is GStreamer?
+## What is GStreamer?
GStreamer is a framework for creating streaming media applications. The
fundamental design comes from the video pipeline at Oregon Graduate
@@ -36,7 +36,7 @@ The GStreamer core function is to provide a framework for plugins, data
flow, synchronization and media type handling/negotiation. It also
provides an API to write applications using the various plugins.
-# Who Should Read This Guide?
+## Who Should Read This Guide?
This guide explains how to write new modules for GStreamer. The guide is
relevant to several groups of people:
@@ -66,7 +66,7 @@ Development Manual*. If you are just trying to get help with a GStreamer
application, then you should check with the user manual for that
particular application.
-# Preliminary Reading
+## Preliminary Reading
This guide assumes that you are somewhat familiar with the basic
workings of GStreamer. For a gentle introduction to programming concepts
@@ -82,7 +82,7 @@ basics of [GObject](http://developer.gnome.org/gobject/stable/pt01.html)
programming. You may also want to have a look at Eric Harlow's book
*Developing Linux Applications with GTK+ and GDK*.
-# Structure of This Guide
+## Structure of This Guide
To help you navigate through this guide, it is divided into several
large parts. Each part addresses a particular broad topic concerning
diff --git a/pwg-licensing-advisory.md b/pwg-licensing-advisory.md
index 444f20c..aa599cf 100644
--- a/pwg-licensing-advisory.md
+++ b/pwg-licensing-advisory.md
@@ -4,7 +4,7 @@ title: GStreamer licensing
# GStreamer licensing
-# How to license the code you write for GStreamer
+## How to license the code you write for GStreamer
GStreamer is a plugin-based framework licensed under the LGPL. The
reason for this choice in licensing is to ensure that everyone can use
diff --git a/pwg-negotiation.md b/pwg-negotiation.md
index 19a5811..dc3ae33 100644
--- a/pwg-negotiation.md
+++ b/pwg-negotiation.md
@@ -9,7 +9,7 @@ elements that they can handle. This process in GStreamer can in most
cases find an optimal solution for the complete pipeline. In this
section we explain how this works.
-# Caps negotiation basics
+## Caps negotiation basics
In GStreamer, negotiation of the media format always follows the
following simple rules:
@@ -38,7 +38,7 @@ accepted by an element.
All negotiation follows these simple rules. Let's take a look at some
typical use cases and how negotiation happens.
-# Caps negotiation use cases
+## Caps negotiation use cases
In what follows we will look at some use cases for push-mode scheduling.
The pull-mode scheduling negotiation phase is discussed in [Pull-mode
@@ -62,7 +62,7 @@ identify 3 caps negotiation use cases for the source pads:
- Dynamic negotiation. An element can output many formats. See
[Dynamic negotiation](#dynamic-negotiation).
-## Fixed negotiation
+### Fixed negotiation
In this case, the source pad can only produce a fixed format. Usually
this format is encoded inside the media. No downstream element can ask
@@ -130,7 +130,7 @@ All other elements that need to be configured for the format should
implement full caps negotiation, which will be explained in the next few
sections.
-## Transform negotiation
+### Transform negotiation
In this negotiation technique, there is a fixed transform between the
element input caps and the output caps. This transformation could be
@@ -220,7 +220,7 @@ gst_my_filter_sink_event (GstPad *pad,
```
-## Dynamic negotiation
+### Dynamic negotiation
A last negotiation method is the most complex and powerful dynamic
negotiation.
@@ -348,7 +348,7 @@ gst_my_filter_chain (GstPad *pad,
```
-# Upstream caps (re)negotiation
+## Upstream caps (re)negotiation
Upstream negotiation's primary use is to renegotiate (part of) an
already-negotiated pipeline to a new format. Some practical examples
@@ -393,7 +393,7 @@ different responsibilities here:
NEED\_RECONFIGURE flag with `gst_pad_check_reconfigure ()` and it
should start renegotiation when the function returns TRUE.
-# Implementing a CAPS query function
+## Implementing a CAPS query function
A `_query ()`-function with the GST\_QUERY\_CAPS query type is called
when a peer element would like to know which formats this pad supports,
@@ -461,7 +461,7 @@ gst_my_filter_query (GstPad *pad, GstObject * parent, GstQuery * query)
```
-# Pull-mode Caps negotiation
+## Pull-mode Caps negotiation
WRITEME, the mechanism of pull-mode negotiation is not yet fully
understood.
diff --git a/pwg-other-base.md b/pwg-other-base.md
index bf3cf9f..e23c314 100644
--- a/pwg-other-base.md
+++ b/pwg-other-base.md
@@ -14,7 +14,7 @@ and doing complex caps negotiation. For this purpose, GStreamer provides
base classes that simplify some types of elements. Those base classes
will be discussed in this chapter.
-# Writing a sink
+## Writing a sink
Sinks are special elements in GStreamer. This is because sink elements
have to take care of *preroll*, which is the process that takes care
@@ -71,7 +71,7 @@ The advantages of deriving from `GstBaseSink` are numerous:
There are also specialized base classes for audio and video; let's look
at those a bit.
-## Writing an audio sink
+### Writing an audio sink
Essentially, audio sink implementations are just a special case of a
general sink. An audio sink has the added complexity that it needs to
@@ -117,7 +117,7 @@ In addition to implementing the audio base-class virtual functions,
derived classes can (should) also implement the `GstBaseSink` `set_caps
()` and `get_caps ()` virtual functions for negotiation.
-## Writing a video sink
+### Writing a video sink
Writing a videosink can be done using the `GstVideoSink` base-class,
which derives from `GstBaseSink` internally. Currently, it does nothing
@@ -134,7 +134,7 @@ videosink:
extensions to videosinks that affect all of them, but only need to
be coded once, which is a huge maintenance benefit.
-# Writing a source
+## Writing a source
In the previous part, particularly [Providing random
access](pwg-scheduling.md#providing-random-access), we have learned
@@ -169,7 +169,7 @@ It is possible to use special memory, such as X server memory pointers
or `mmap ()`'ed memory areas, as data pointers in buffers returned from
the `create()` virtual function.
-## Writing an audio source
+### Writing an audio source
An audio source is nothing more than a special case of a pushsource.
Audio sources would be anything that reads audio, such as a source
@@ -192,7 +192,7 @@ on:
- New features can be added to it and will apply to all derived
classes automatically.
-# Writing a transformation element
+## Writing a transformation element
A third base-class that GStreamer provides is the `GstBaseTransform`.
This is a base class for elements with one sourcepad and one sinkpad
diff --git a/pwg-porting.md b/pwg-porting.md
index 300c6db..fd38396 100644
--- a/pwg-porting.md
+++ b/pwg-porting.md
@@ -17,7 +17,7 @@ elements requiring the deprecated bytestream interface, which should
take 1-2 days with random access. The scheduling parts of muxers will
also need a rewrite, which will take about the same amount of time.
-# List of changes
+## List of changes
- Discont events have been replaced by newsegment events. In 0.10, it
is essential that you send a newsegment event downstream before you
diff --git a/pwg-scheduling.md b/pwg-scheduling.md
index ba7b9a2..9ac7299 100644
--- a/pwg-scheduling.md
+++ b/pwg-scheduling.md
@@ -23,7 +23,7 @@ called in turn.
Before we explain pull-mode scheduling, let's first understand how the
different scheduling modes are selected and activated on a pad.
-# The pad activation stage
+## The pad activation stage
During the element state change of READY-\>PAUSED, the pads of an
element will be activated. This happens first on the source pads and
@@ -88,7 +88,7 @@ In the next two sections, we will go closer into pull-mode scheduling
(elements/pads driving the pipeline, and elements/pads providing random
access), and some specific use cases will be given.
-# Pads driving the pipeline
+## Pads driving the pipeline
Sinkpads operating in pull-mode, with the sourcepads operating in
push-mode (or with no sourcepads at all, when the element is a sink), can start a task
@@ -264,7 +264,7 @@ far.
}
```
-# Providing random access
+## Providing random access
In the previous section, we have talked about how elements (or pads)
that are activated to drive the pipeline using their own task must use
diff --git a/pwg-statemanage-states.md b/pwg-statemanage-states.md
index 5e5e60a..3d4a37d 100644
--- a/pwg-statemanage-states.md
+++ b/pwg-statemanage-states.md
@@ -49,7 +49,7 @@ differentiate between PAUSED and PLAYING state. In PLAYING state, sink
elements actually render incoming data, e.g. output audio to a sound
card or render video pictures to an image sink.
-# Managing filter state
+## Managing filter state
If at all possible, your element should derive from one of the new base
classes ([Pre-made base classes](pwg-other-base.md)). There are
diff --git a/sdk-android-tutorial-a-running-pipeline.md b/sdk-android-tutorial-a-running-pipeline.md
index 5545ffa..af9613d 100644
--- a/sdk-android-tutorial-a-running-pipeline.md
+++ b/sdk-android-tutorial-a-running-pipeline.md
@@ -1,6 +1,6 @@
# Android tutorial 2: A running pipeline
-## Goal
+### Goal
![screenshot]
@@ -20,7 +20,7 @@ learn:
- How to allocate a `CustomData` structure from C and have Java host
it
-## Introduction
+### Introduction
When using a Graphical User Interface (UI), if the application waits for
GStreamer calls to complete, the user experience will suffer. The usual
@@ -52,7 +52,7 @@ The code below builds a pipeline with an `audiotestsrc` and an
setting the pipeline to PLAYING or PAUSED. A TextView in the UI shows
messages sent from the C code (for errors and state changes).
-## A pipeline on Android \[Java code\]
+### A pipeline on Android \[Java code\]
**src/org/freedesktop/gstreamer/tutorials/tutorial\_2/Tutorial2.java**
@@ -337,7 +337,7 @@ all allocated resources.
This concludes the UI part of the tutorial.
-## A pipeline on Android \[C code\]
+### A pipeline on Android \[C code\]
**jni/tutorial-2.c**
@@ -356,11 +356,11 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
* a jlong, which is always 64 bits, without warnings.
*/
#if GLIB_SIZEOF_VOID_P == 8
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)data)
#else
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(jint)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)(jint)data)
#endif
/* Structure to contain all our information, so we can pass it to callbacks */
@@ -712,7 +712,7 @@ done for simplicity).
Let’s review now the first native method which can be directly called
from Java:
-### `gst_native_init()` (`nativeInit()` from Java)
+#### `gst_native_init()` (`nativeInit()` from Java)
This method is called at the end of Java's `onCreate()`.
@@ -741,7 +741,7 @@ pthread_create (&gst_app_thread, NULL, &app_function, data);
Finally, a thread is created and it starts running the
`app_function()` method.
-### `app_function()`
+#### `app_function()`
``` c
/* Main method for the native code. This is executed on its own thread. */
@@ -820,7 +820,7 @@ reviewed below.
Once the main loop has quit, all resources are freed in lines 178 to
181.
-### `check_initialization_complete()`
+#### `check_initialization_complete()`
``` c
static void check_initialization_complete (CustomData *data) {
@@ -864,7 +864,7 @@ This behavior is implemented in the `get_jni_env()` method, used for
example in `check_initialization_complete()` as we have just seen. Let’s
see how it works, step by step:
-### `get_jni_env()`
+#### `get_jni_env()`
``` c
static JNIEnv *get_jni_env (void) {
@@ -886,13 +886,13 @@ If it returns NULL, we never attached this thread, so we do now with
with
[pthread\_setspecific()](http://pubs.opengroup.org/onlinepubs/9699919799/functions/pthread_setspecific.html).
-### `attach_current_thread()`
+#### `attach_current_thread()`
This method is simply a convenience wrapper around
[AttachCurrentThread()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/invocation.html#attach_current_thread)
to deal with its parameters.
-### `detach_current_thread()`
+#### `detach_current_thread()`
This method is called by the pthreads library when a TLS key is deleted,
meaning that the thread is about to be destroyed. We simply detach the
@@ -901,7 +901,7 @@ thread from the JavaVM with
Let's now review the rest of the native methods accessible from Java:
-### `gst_native_finalize()` (`nativeFinalize()` from Java)
+#### `gst_native_finalize()` (`nativeFinalize()` from Java)
``` c
static void gst_native_finalize (JNIEnv* env, jobject thiz) {
@@ -936,7 +936,7 @@ about to be destroyed. Here, we:
`Tutorial2` class to NULL with
`SET_CUSTOM_DATA()`.
-### `gst_native_play` and `gst_native_pause()` (`nativePlay` and `nativePause()` from Java)
+#### `gst_native_play` and `gst_native_pause()` (`nativePlay` and `nativePause()` from Java)
These two simple methods retrieve `CustomData` from the passed-in object
with `GET_CUSTOM_DATA()` and set the pipeline found inside `CustomData`
@@ -944,13 +944,13 @@ to the desired state, returning immediately.
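The `GET_CUSTOM_DATA()`/`SET_CUSTOM_DATA()` macros shown earlier pack a native pointer into the Java object's `long` field. The cast round-trip they rely on can be sketched in plain C with no JNI involved; `fake_field` stands in for the Java `long` field and the function names are purely illustrative:

``` c
#include <assert.h>
#include <stdint.h>

/* Stand-in for the Java object's jlong field the JNI macros read/write. */
static int64_t fake_field;

typedef struct { int dummy; } CustomData;

/* Mirrors SET_CUSTOM_DATA: widen the pointer to 64 bits before storing.
 * On 32-bit platforms the tutorial casts through (jint) first to avoid
 * warnings; (intptr_t) plays that role here on either word size. */
static void set_custom_data (CustomData *data) {
  fake_field = (int64_t) (intptr_t) data;
}

/* Mirrors GET_CUSTOM_DATA: recover the pointer from the stored integer. */
static CustomData *get_custom_data (void) {
  return (CustomData *) (intptr_t) fake_field;
}
```

Storing a pointer with `set_custom_data()` and reading it back with `get_custom_data()` yields the same pointer on both 32- and 64-bit platforms, which is what the `#if GLIB_SIZEOF_VOID_P == 8` dance in the real macros guarantees.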
Finally, let’s see how the GStreamer callbacks are handled:
-### `error_cb` and `state_changed_cb`
+#### `error_cb` and `state_changed_cb`
This tutorial does not do much in these callbacks. They simply parse the
error or state changed message and display a message in the UI using the
`set_ui_message()` method:
-### `set_ui_message()`
+#### `set_ui_message()`
``` c
static void set_ui_message (const gchar *message, CustomData *data) {
@@ -993,7 +993,7 @@ We check for exceptions with the JNI
method and free the UTF16 message with
[DeleteLocalRef()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#DeleteLocalRef).
-## A pipeline on Android \[Android.mk\]
+### A pipeline on Android \[Android.mk\]
**jni/Android.mk**
@@ -1028,7 +1028,7 @@ And this is it\! This has been a rather long tutorial, but we covered a
lot of territory. Building on top of this one, the following ones are
shorter and focus only on the new topics.
-## Conclusion
+### Conclusion
This tutorial has shown:
diff --git a/sdk-android-tutorial-media-player.md b/sdk-android-tutorial-media-player.md
index 1bf8d9f..cb8f8a4 100644
--- a/sdk-android-tutorial-media-player.md
+++ b/sdk-android-tutorial-media-player.md
@@ -1,6 +1,6 @@
# Android tutorial 4: A basic media player
-## Goal
+### Goal
![screenshot]
@@ -19,7 +19,7 @@ It also uses the knowledge gathered in the [](sdk-basic-tutorials.md) regarding:
- How to use `playbin` to play any kind of media
- How to handle network resilience problems
-## Introduction
+### Introduction
From the previous tutorials, we already have almost all the necessary pieces
to build a media player. The most complex part is assembling a pipeline
@@ -41,7 +41,7 @@ the video sink is not forced to draw black borders around the clip.
media content. You can still force the video surface to have a specific
size if you really want to.
-## A basic media player \[Java code\]
+### A basic media player \[Java code\]
**src/com/gst\_sdk\_tutorials/tutorial\_4/Tutorial4.java**
@@ -298,7 +298,7 @@ public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSee
}
```
-### Supporting arbitrary media URIs
+#### Supporting arbitrary media URIs
The C code provides the `nativeSetUri()` method so we can indicate the
URI of the media to play. Since `playbin` will be taking care of
@@ -319,7 +319,7 @@ private void setMediaUri() {
We call `setMediaUri()` in the `onGStreamerInitialized()` callback, once
the pipeline is ready to accept commands.
-### Reporting media size
+#### Reporting media size
Every time the size of the media changes (which could happen mid-stream,
for some kinds of streams), or when it is first detected, C code calls
@@ -349,7 +349,7 @@ the UI must be called from the main thread, and we are now in a
callback from some GStreamer internal thread. Hence, the usage of
[runOnUiThread()](http://developer.android.com/reference/android/app/Activity.html#runOnUiThread\(java.lang.Runnable\)).
-### Refreshing the Seek Bar
+#### Refreshing the Seek Bar
[](sdk-basic-tutorial-toolkit-integration.md)
has already shown how to implement a [Seek
@@ -404,7 +404,7 @@ private void updateTimeWidget () {
}
```
-### Seeking with the Seek Bar
+#### Seeking with the Seek Bar
To perform the second function of the [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html) (allowing
@@ -479,7 +479,7 @@ desired playing state.
This concludes the user interface part of this tutorial. Let’s review
now the under-the-hood C code that allows this to work.
-## A basic media player \[C code\]
+### A basic media player \[C code\]
**jni/tutorial-4.c**
@@ -502,11 +502,11 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
* a jlong, which is always 64 bits, without warnings.
*/
#if GLIB_SIZEOF_VOID_P == 8
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)data)
#else
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(jint)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)(jint)data)
#endif
/* Do not allow seeks to be performed closer than this distance. It is visually useless, and will probably
@@ -1054,7 +1054,7 @@ jint JNI_OnLoad(JavaVM *vm, void *reserved) {
}
```
-### Supporting arbitrary media URIs
+#### Supporting arbitrary media URIs
Java code will call `gst_native_set_uri()` whenever it wants to change
the playing URI (in this tutorial the URI never changes, but it could):
@@ -1100,7 +1100,7 @@ not. Live sources must not use buffering (otherwise latency is
introduced, which is unacceptable for them), so we keep track of this
information in the `is_live` variable.
-### Reporting media size
+#### Reporting media size
Some codecs allow the media size (width and height of the video) to
change during playback. For simplicity, this tutorial assumes that they
@@ -1152,7 +1152,7 @@ The helper functions `gst_video_format_parse_caps()` and
manageable integers, which we pass to Java through
its `onMediaSizeChanged()` callback.
-### Refreshing the Seek Bar
+#### Refreshing the Seek Bar
To keep the UI updated, a GLib timer is installed in the
`app_function()` that fires 4 times per second (every 250 ms), right
@@ -1201,7 +1201,7 @@ Bear in mind that all time-related measures returned by GStreamer are in
nanoseconds, whereas, for simplicity, we decided to make the UI code
work in milliseconds.
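The nanosecond-to-millisecond conversion the UI code performs can be sketched as two plain helpers; `GST_SECOND` and `GST_MSECOND` are redefined here with their real GStreamer values so the sketch is self-contained:

``` c
#include <assert.h>
#include <stdint.h>

/* GStreamer defines these in gstclock.h; repeated here so this
 * sketch compiles without GStreamer. */
#define GST_SECOND  ((int64_t) 1000000000)
#define GST_MSECOND (GST_SECOND / 1000)

/* Nanoseconds (GstClockTime) to the milliseconds the UI widgets expect. */
static int64_t ns_to_ms (int64_t ns) {
  return ns / GST_MSECOND;
}

/* And back, when turning a Seek Bar position into a seek target. */
static int64_t ms_to_ns (int64_t ms) {
  return ms * GST_MSECOND;
}
```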
-### Seeking with the Seek Bar
+#### Seeking with the Seek Bar
The Java UI code already takes care of most of the complexity of seeking
by dragging the thumb of the Seek Bar. From C code, we just need to
@@ -1215,7 +1215,7 @@ Bar can generate a very high number of seek requests in a short period
of time, which is visually useless and will impair responsiveness. Let’s
see how to overcome these problems.
-#### Delayed seeks
+##### Delayed seeks
In
`gst_native_set_position()`:
@@ -1255,7 +1255,7 @@ Once the pipeline moves from the READY to the PAUSED state, we check if
there is a pending seek operation and execute it. The
`desired_position` variable is reset inside `execute_seek()`.
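A stripped-down model of this delayed-seek logic, with plain C stand-ins for the pipeline state and for the actual seek call (the type and function names are illustrative, not the tutorial's code):

``` c
#include <assert.h>
#include <stdint.h>

typedef enum { STATE_NULL, STATE_READY, STATE_PAUSED, STATE_PLAYING } State;

typedef struct {
  State   state;
  int64_t desired_position;  /* pending seek target, or -1 if none */
  int64_t last_seek;         /* stand-in for the seek actually performed */
} Player;

/* Real code would call gst_element_seek_simple() here. */
static void execute_seek (Player *p, int64_t pos) {
  p->last_seek = pos;
  p->desired_position = -1;  /* reset, as the tutorial does */
}

static void set_position (Player *p, int64_t pos) {
  if (p->state >= STATE_PAUSED)
    execute_seek (p, pos);
  else
    p->desired_position = pos;  /* too early: remember it for later */
}

/* Called from the bus watch on the READY -> PAUSED transition. */
static void on_reached_paused (Player *p) {
  p->state = STATE_PAUSED;
  if (p->desired_position >= 0)
    execute_seek (p, p->desired_position);
}
```

A seek requested while still in READY is only recorded; it fires as soon as `on_reached_paused()` runs, while a seek requested in PAUSED or PLAYING executes immediately.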
-#### Seek throttling
+##### Seek throttling
A seek is potentially a lengthy operation. The demuxer (the element
typically in charge of seeking) needs to estimate the appropriate byte
@@ -1331,7 +1331,7 @@ The one-shot timer calls `delayed_seek_cb()`, which simply calls
>
> This is not a complete solution: the scheduled seek will still be executed, even though a more-recent seek has already been executed that should have cancelled it. However, it is a good tradeoff between functionality and simplicity.
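The throttling rule itself can be modeled without GLib: perform a seek only if enough time has passed since the previous one, otherwise defer it (in the tutorial a one-shot timer later performs the deferred seek; here the clock value is passed in so the logic stays testable, and all names are illustrative):

``` c
#include <assert.h>
#include <stdint.h>

#define SEEK_MIN_DELAY ((int64_t) 500000000)  /* 500 ms in ns */

typedef struct {
  int64_t last_seek_time;  /* timestamp of the last executed seek, ns */
  int64_t pending;         /* deferred target, or -1 */
  int64_t executed;        /* stand-in for the seek actually performed */
} Throttle;

/* now: current monotonic time in ns, injected by the caller. */
static void request_seek (Throttle *t, int64_t pos, int64_t now) {
  if (now - t->last_seek_time >= SEEK_MIN_DELAY) {
    t->executed = pos;          /* enough time elapsed: seek right away */
    t->last_seek_time = now;
    t->pending = -1;
  } else {
    t->pending = pos;           /* real code arms a one-shot timer here */
  }
}
```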
-### Network resilience
+#### Network resilience
[](sdk-basic-tutorial-streaming.md) has already
shown how to adapt to the variable nature of the network bandwidth by
@@ -1373,7 +1373,7 @@ pipeline, which might be different to the current state, because
buffering forces us to go to PAUSED. Once buffering is complete we set
the pipeline to the `target_state`.
-## A basic media player \[Android.mk\]
+### A basic media player \[Android.mk\]
The only line worth mentioning in the makefile
is `GSTREAMER_PLUGINS`:
@@ -1388,7 +1388,7 @@ In which all plugins required for playback are loaded, because it is not
known at build time what would be needed for an unspecified URI (again,
in this tutorial the URI does not change, but it will in the next one).
-## Conclusion
+### Conclusion
This tutorial has shown how to embed a `playbin` pipeline into an
Android application. This, effectively, turns such an application into a
diff --git a/sdk-android-tutorial-video.md b/sdk-android-tutorial-video.md
index a43af3e..c8c6880 100644
--- a/sdk-android-tutorial-video.md
+++ b/sdk-android-tutorial-video.md
@@ -1,6 +1,6 @@
# Android tutorial 3: Video
-## Goal
+### Goal
![screenshot]
@@ -15,7 +15,7 @@ shows:
to GStreamer
- How to keep GStreamer posted on changes to the surface
-## Introduction
+### Introduction
Since Android does not provide a windowing system, a GStreamer video
sink cannot create pop-up windows as it would do on a Desktop platform.
@@ -32,7 +32,7 @@ widget, we pass it to the C code which stores it. The
tutorial is extended so that GStreamer is not considered initialized
until a main loop is running and a drawing surface has been received.
-## A video surface on Android \[Java code\]
+### A video surface on Android \[Java code\]
**src/com/gst\_sdk\_tutorials/tutorial\_3/Tutorial3.java**
@@ -239,7 +239,7 @@ to notify GStreamer about the new surface. We use
Let’s review the C code to see what these functions do.
-## A video surface on Android \[C code\]
+### A video surface on Android \[C code\]
**jni/tutorial-3.c**
@@ -262,11 +262,11 @@ GST_DEBUG_CATEGORY_STATIC (debug_category);
* a jlong, which is always 64 bits, without warnings.
*/
#if GLIB_SIZEOF_VOID_P == 8
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)data)
#else
# define GET_CUSTOM_DATA(env, thiz, fieldID) (CustomData *)(jint)(*env)->GetLongField (env, thiz, fieldID)
# define SET_CUSTOM_DATA(env, thiz, fieldID, data) (*env)->SetLongField (env, thiz, fieldID, (jlong)(jint)data)
#endif
/* Structure to contain all our information, so we can pass it to callbacks */
@@ -744,7 +744,7 @@ And this is all there is to it, regarding the main code. Only a couple
of details remain, the subclass we made for SurfaceView and the
`Android.mk` file.
-## GStreamerSurfaceView, a convenient SurfaceView wrapper \[Java code\]
+### GStreamerSurfaceView, a convenient SurfaceView wrapper \[Java code\]
By default,
[SurfaceView](http://developer.android.com/reference/android/view/SurfaceView.html) does
@@ -857,7 +857,7 @@ public class GStreamerSurfaceView extends SurfaceView {
}
```
-## A video surface on Android \[Android.mk\]
+### A video surface on Android \[Android.mk\]
**/jni/Android.mk**
@@ -892,7 +892,7 @@ and `GSTREAMER_PLUGINS_EFFECTS` for the `warptv` element. This tutorial
requires the `gstreamer-video` library to use the
`VideoOverlay` interface and the video helper methods.
-## Conclusion
+### Conclusion
This tutorial has shown:
diff --git a/sdk-basic-media-player.md b/sdk-basic-media-player.md
index 03a492f..59b8d63 100644
--- a/sdk-basic-media-player.md
+++ b/sdk-basic-media-player.md
@@ -1,6 +1,6 @@
# Basic Media Player
-# Goal
+## Goal
This tutorial shows how to create a basic media player with
[Qt](http://qt-project.org/) and
@@ -15,7 +15,7 @@ In particular, you will learn:
- How to create a video output
- Updating the GUI based on playback time
-# A media player with Qt
+## A media player with Qt
These files are located in the qt-gstreamer SDK's `examples/` directory.
@@ -29,7 +29,7 @@ each file to expand.
```
project(qtgst-example-player)
find_package(QtGStreamer REQUIRED)
# automoc is now a built-in tool since CMake 2.8.6.
if (${CMAKE_VERSION} VERSION_LESS "2.8.6")
find_package(Automoc4 REQUIRED)
else()
@@ -545,9 +545,9 @@ void Player::handlePipelineStateChange(const QGst::StateChangedMessagePtr & scm)
#include "moc_player.cpp"
```
-# Walkthrough
+## Walkthrough
-## Setting up GStreamer
+### Setting up GStreamer
We begin by looking at `main()`:
@@ -710,7 +710,7 @@ Finally, we tell `playbin` what to play by setting the `uri` property:
m_pipeline->setProperty("uri", realUri);
```
-## Starting Playback
+### Starting Playback
After `Player::setUri()` is called, `MediaApp::openFile()` calls
`play()` on the `Player` object:
@@ -861,7 +861,7 @@ Due to the way Qt handles signals that cross threads, there is no need
to worry about calling UI functions from outside the UI thread in this
example.
-# Conclusion
+## Conclusion
This tutorial has shown:
diff --git a/sdk-installing-for-ios-development.md b/sdk-installing-for-ios-development.md
index 0a40fb4..7c73193 100644
--- a/sdk-installing-for-ios-development.md
+++ b/sdk-installing-for-ios-development.md
@@ -2,7 +2,7 @@
![](images/icons/emoticons/information.png) All versions starting from iOS 6 are supported
-## Prerequisites
+### Prerequisites
For iOS development you need to download Xcode and the iOS SDK. Xcode
can be found at the App Store or
@@ -17,7 +17,7 @@ recommend taking a look at the available documentation at Apple's
website.
[This](http://developer.apple.com/library/ios/#DOCUMENTATION/iPhone/Conceptual/iPhone101/Articles/00_Introduction.html) can be a good starting point.
-# Download and install GStreamer binaries
+## Download and install GStreamer binaries
The GStreamer binary installer can be found at:
@@ -40,14 +40,14 @@ Xcode application templates for GStreamer development. Those templates
are also copied to `~/Library/Developer/Xcode/Templates` during
installation so that Xcode can find them.
-## Configure your development environment
+### Configure your development environment
GStreamer is written in C, and the iOS API uses mostly Objective-C (and
C for some parts), but this should cause no problems as those languages
interoperate freely. You can mix both in the same source code, for
example.
-### Building the tutorials
+#### Building the tutorials
The GStreamer SDK ships a few tutorials in the `xcode iOS` folder inside
the `.dmg` file. Copy them out of the package and into a more suitable
@@ -56,7 +56,7 @@ at the sources and build them. This should confirm that the installation
works and give some insight on how simple it is to mix Objective-C and C
code.
-### Creating new projects
+#### Creating new projects
After installation, when creating a new Xcode project, you should see
the GStreamer project templates under the `Templates` category. OS X and
diff --git a/sdk-installing-on-linux.md b/sdk-installing-on-linux.md
index c966131..36ae36c 100644
--- a/sdk-installing-on-linux.md
+++ b/sdk-installing-on-linux.md
@@ -1,6 +1,6 @@
# Installing on Linux
-# Prerequisites
+## Prerequisites
To develop applications using the GStreamer SDK on Linux you will need
one of the following supported distributions:
@@ -28,7 +28,7 @@ a terminal.
</tbody>
</table>
-# Download and install the SDK
+## Download and install the SDK
The GStreamer SDK provides a set of binary packages for supported Linux
distributions. Detailed instructions on how to install the packages for
@@ -159,7 +159,7 @@ su -c 'yum install gstreamer-sdk-devel'
Enter the superuser/root password when prompted.
-# Configure your development environment
+## Configure your development environment
When building applications using GStreamer, the compiler must be able to
locate its libraries. However, in order to prevent possible collisions
@@ -204,7 +204,7 @@ the `configure` script from inside the `gst-sdk-shell` environment.
</tbody>
</table>
-### Getting the tutorial's source code
+#### Getting the tutorial's source code
The source code for the tutorials can be copied and pasted from the
tutorial pages into a text file, but, for convenience, it is also
@@ -220,7 +220,7 @@ Or you can locate the source code in
`/opt/gstreamer-sdk/share/gst-sdk/tutorials`, and copy it to a working
folder of your choice.
-### Building the tutorials
+#### Building the tutorials
You need to enter the GStreamer SDK shell in order for the compiler to
use the right libraries (and avoid conflicts with the system libraries).
@@ -247,7 +247,7 @@ Using the file name of the tutorial you are interested in
</tbody>
</table>
-### Running the tutorials
+#### Running the tutorials
To run the tutorials, simply execute the desired tutorial (**from within
the `gst-sdk-shell`**):
@@ -256,7 +256,7 @@ the `gst-sdk-shell`**):
./basic-tutorial-1
```
-### Deploying your application
+#### Deploying your application
Your application built with the GStreamer SDK must be able to locate the
GStreamer libraries when deployed on the target machine. You have at
diff --git a/sdk-ios-tutorial-video.md b/sdk-ios-tutorial-video.md
index 6cbe0d7..5c422d6 100644
--- a/sdk-ios-tutorial-video.md
+++ b/sdk-ios-tutorial-video.md
@@ -1,6 +1,6 @@
# iOS tutorial 3: Video
-# Goal
+## Goal
![screenshot]
@@ -14,7 +14,7 @@ shows:
- How to allocate a drawing surface on the Xcode Interface Builder and
pass it to GStreamer
-# Introduction
+## Introduction
Since iOS does not provide a windowing system, a GStreamer video sink
cannot create pop-up windows as it would do on a Desktop platform.
@@ -27,7 +27,7 @@ placed on the main storyboard. In the `viewDidLoad` method of the
`ViewController`, we pass a pointer to this `UIView` to the instance of
the `GStreamerBackend`, so it can tell the video sink where to draw.
-# The User Interface
+## The User Interface
The storyboard from the previous tutorial is expanded: a `UIView` is
added over the toolbar and pinned to all sides so it takes up all
@@ -63,7 +63,7 @@ outlets):
@end
```
-# The View Controller
+## The View Controller
The `ViewController` class manages the UI, instantiates
the `GStreamerBackend` and also performs some UI-related tasks on its
@@ -238,7 +238,7 @@ The final size is reported to the layout engine by changing the
are accessible to the `ViewController` through IBOutlets, as is usually
done with other widgets.
-# The GStreamer Backend
+## The GStreamer Backend
The `GStreamerBackend` class performs all GStreamer-related tasks and
offers a simplified interface to the application, which does not need to
@@ -524,7 +524,7 @@ only element in this pipeline implementing it, so it will be returned.
Once we have the video sink, we inform it of the `UIView` to use for
rendering, through the `gst_video_overlay_set_window_handle()` method.
-# EaglUIView
+## EaglUIView
One last detail remains. In order for `glimagesink` to be able to draw
on the
@@ -561,7 +561,7 @@ tutorial storyboard to see how to achieve this.
And this is it, using GStreamer to output video onto an iOS application
is as simple as it seems.
-# Conclusion
+## Conclusion
This tutorial has shown:
diff --git a/sdk-legal-information.md b/sdk-legal-information.md
index 2f1e4f4..aba6536 100644
--- a/sdk-legal-information.md
+++ b/sdk-legal-information.md
@@ -4,7 +4,7 @@ short-description: Patents, Licenses and legal F.A.Q.
# Legal information
-# Installer, default installation
+## Installer, default installation
The installer (Microsoft Windows and MacOSX) and the default
installation (GNU/Linux) contain and install the minimal default
@@ -13,7 +13,7 @@ components is also possible, but read on for certain legal cautions you
might want to take. All downloads are from the
[gstreamer.freedesktop.org](http://gstreamer.freedesktop.org) website.
-# Licensing of GStreamer
+## Licensing of GStreamer
The GStreamer minimal default installation contains only packages which
are licensed under the [GNU LGPL license
@@ -42,13 +42,13 @@ different licenses, which are both more liberal than the LGPL (they
impose less strict conditions for granting the license) and compatible
with the LGPL. This is advised locally.
-# Optional packages
+## Optional packages
There are two types of optional packages (GPL and Patented), which are
under a different license or have other issues concerning patentability
(or both).
-### GPL code
+#### GPL code
Part of the optional packages are under the GNU GPL
[v2](http://www.gnu.org/licenses/old-licenses/gpl-2.0.html) or
@@ -60,7 +60,7 @@ this works in your precise case and design choices. GPL is called
license has the largest possible scope and extends to all derivative
works.
-### Patents
+#### Patents
Certain software, and in particular software that implements
multimedia standard formats such as MP3, MPEG-2 video and audio, H.264,
@@ -84,7 +84,7 @@ circumstances compiling the same code for a given platform or
distributing the object code is not an act that infringes one or more
patents.
-# Software is as-is
+## Software is as-is
All software and the entire GStreamer binaries are provided as-is, without any
warranty whatsoever. The individual licenses have particular language
@@ -97,18 +97,18 @@ maintenance agreements under certain conditions, you are invited to
contact them in order to receive further details and discuss the
commercial terms.
-# Data protection
+## Data protection
This website might use cookies and HTTP logs for statistical analysis
and on an aggregate basis only.
-# Frequently Asked Questions
+## Frequently Asked Questions
-#### What licenses are there?
+##### What licenses are there?
GStreamer binaries contain software under various licenses. See above.
-#### How does this relate to the packaging system?
+##### How does this relate to the packaging system?
The packaging is only a more convenient way to install software and
decide what's good for you. GStreamer is meant to be modular, making use
@@ -128,7 +128,7 @@ reference, but we cannot guarantee that our selection is 100% correct,
so it is up to the user to verify the actual licensing conditions before
distributing works that utilize GStreamer.
-#### Can I / must I distribute GStreamer along with my application?
+##### Can I / must I distribute GStreamer along with my application?
You surely can. All software is Free/Open Source software, and can be
distributed freely. You are not **required** to distribute it. Only,
@@ -142,7 +142,7 @@ entire source code, you might want to include it (or the directories
containing the source code) with your application as a safe way to
comply with this requirement of the license.
-#### What happens when I modify the GStreamer's source code?
+##### What happens when I modify the GStreamer's source code?
You are invited to do so, as the licenses (unless you are dealing with
proprietary bits, but in that case you will not find the corresponding
@@ -157,7 +157,7 @@ to fork the code, if at all possible. The Cerbero build system has a
containing all of the complete corresponding machine readable source
code that you are required to provide.
-#### How does licensing relate to software patents? What about software patents in general?
+##### How does licensing relate to software patents? What about software patents in general?
This is a tricky question. We believe software patents should not exist,
so that by distributing and using software on a general purpose machine
@@ -185,7 +185,7 @@ This is why GStreamer has taken a modular approach, so that you can use
a free plugin or a proprietary, patent-royalty-bearing plugin for a
given standard.
-#### What about static vs. dynamic linking and copyleft?
+##### What about static vs. dynamic linking and copyleft?
We cannot provide one single answer to that question. Since copyright in
software works as copyright in literature, static linking means
diff --git a/sdk-playback-tutorial-hardware-accelerated-video-decoding.md b/sdk-playback-tutorial-hardware-accelerated-video-decoding.md
index 6ae8cc1..0207e37 100644
--- a/sdk-playback-tutorial-hardware-accelerated-video-decoding.md
+++ b/sdk-playback-tutorial-hardware-accelerated-video-decoding.md
@@ -1,6 +1,6 @@
# Playback tutorial 8: Hardware-accelerated video decoding
-## Goal
+### Goal
Hardware-accelerated video decoding has rapidly become a necessity, as
low-power devices grow more common. This tutorial (more of a lecture,
Sneak peek: if properly set up, you do not need to do anything special to
activate hardware acceleration; GStreamer automatically takes advantage
of it.
-## Introduction
+### Introduction
Video decoding can be an extremely CPU-intensive task, especially for
higher resolutions like 1080p HDTV. Fortunately, modern graphics cards,
@@ -85,7 +85,7 @@ the [gstreamer-ducati](https://github.com/robclark/gst-ducati) plugin.
`v4l2` plugin in `gst-plugins-good`. This can support both decoding
and encoding depending on the platform.
-## Inner workings of hardware-accelerated video decoding plugins
+### Inner workings of hardware-accelerated video decoding plugins
These APIs generally offer a number of functionalities, like video
decoding, post-processing, or presentation of the decoded
@@ -170,7 +170,7 @@ the auto-plugging mechanism to never select it.
> ![warning] The GStreamer developers often rank hardware decoders lower than
> the software ones when they are defective. This should act as a warning.
-# Conclusion
+## Conclusion
This tutorial has shown a bit how GStreamer internally manages hardware
accelerated video decoding. Particularly,
diff --git a/sdk-qt-gstreamer-vs-c-gstreamer.md b/sdk-qt-gstreamer-vs-c-gstreamer.md
index 33a4138..549fccf 100644
--- a/sdk-qt-gstreamer-vs-c-gstreamer.md
+++ b/sdk-qt-gstreamer-vs-c-gstreamer.md
@@ -4,7 +4,7 @@ QtGStreamer is designed to mirror the C GStreamer API as closely as
possible. There are, of course, minor differences. They are documented
here.
-# Common Functions
+## Common Functions
<table>
<colgroup>
@@ -37,12 +37,12 @@ here.
</tbody>
</table>
-# Naming Convention
+## Naming Convention
QtGStreamer follows a strict naming policy to help make cross
referencing easier:
-### Namespaces
+#### Namespaces
The "G" namespace (`GObject`, `GValue`, etc...) is referred to as
"QGlib".
@@ -50,13 +50,13 @@ The "G" namespace (`GObject`, `GValue`, etc...) is referred to as
The "Gst" namespace (`GstObject`, `GstElement`, etc...) is referred to
as "QGst".
-### Class Names
+#### Class Names
Class names should be the same as their G\* equivalents, with the
namespace prefix removed. For example, "`GstObject`" becomes
"`QGst::Object`", "`GParamSpec`" becomes "`QGlib::ParamSpec`", etc...
-### Method Names
+#### Method Names
In general the method names should be the same as the GStreamer ones,
with the g\[st\]\_\<class\> prefix removed and converted to camel case.
@@ -92,12 +92,12 @@ There are cases where this may not be followed:
make sense in English, as "state" is the subject and should go before
the verb "is". So, it becomes `stateIsLocked()`.
-# Reference Counting
+## Reference Counting
Reference counting is handled the same way as in Qt. There is no need
to call `g_object_ref()` and `g_object_unref()`.
-# Access to GStreamer Elements
+## Access to GStreamer Elements
QtGStreamer provides access to the underlying C objects, in case you
need them. This is accessible with a simple cast:
diff --git a/sdk-qt-tutorials.md b/sdk-qt-tutorials.md
index 9a925d8..9dee82d 100644
--- a/sdk-qt-tutorials.md
+++ b/sdk-qt-tutorials.md
@@ -1,6 +1,6 @@
# Qt tutorials
-# Welcome to the GStreamer SDK Qt tutorials
+## Welcome to the GStreamer SDK Qt tutorials
These tutorials describe Qt-specific topics. General GStreamer concepts
will not be explained in these tutorials, so the [Basic
diff --git a/sdk-using-appsink-appsrc-in-qt.md b/sdk-using-appsink-appsrc-in-qt.md
index 5c50a00..83cd56d 100644
--- a/sdk-using-appsink-appsrc-in-qt.md
+++ b/sdk-using-appsink-appsrc-in-qt.md
@@ -1,6 +1,6 @@
# Using appsink/appsrc in Qt
-# Goal
+## Goal
For those times when you need to stream data into or out of GStreamer
through your application, GStreamer includes two helpful elements:
@@ -15,7 +15,7 @@ pipeline to decode an audio file, stream it into an application's code,
then stream it back into your audio output device. All this, using
QtGStreamer.
-# Steps
+## Steps
First, the files. These are also available in the
`examples/appsink-src` directory of the QtGStreamer SDK.
@@ -136,7 +136,7 @@ int main(int argc, char **argv)
}
```
-## Walkthrough
+### Walkthrough
As this is a very simple example, most of the action happens in the
`Player`'s constructor. First, GStreamer is initialized through
@@ -233,7 +233,7 @@ Player::Player(int argc, char **argv)
From there, buffers flow into the `autoaudiosink` element, which
automatically figures out a way to send them to your speakers.
-# Conclusion
+## Conclusion
You should now have an understanding of how to push and pull arbitrary
data into and out of a GStreamer pipeline.