pitivi / gst-editing-services


MIRROR of the GStreamer Editing Services repository for pull requests. The official git repository is at http://cgit.freedesktop.org/gstreamer/gst-editing-services/

License: Other

Languages: Shell 0.25%, Python 0.42%, C 92.75%, C++ 6.57%

gst-editing-services's Introduction

Pitivi

Pitivi is a video editor built upon the GStreamer Editing Services. It aims to be an intuitive and flexible application that can appeal to newbies and professionals alike.

Download on Flathub

Your involvement is what keeps this project moving forward. There are many ways in which you can contribute.

For the list of dependencies, look at pitivi/check.py (a minimal illustrative check is sketched after the list below).

  • "Hard" dependencies are required for Pitivi to function properly.
  • "Soft" dependencies are recommended for an optimal user experience (packagers should add them as recommended or required packages).

Hacking

You can hack on Pitivi in a local flatpak sandbox which is easy to set up.

To get the most out of your time and effort, come talk to us so we can coordinate.

gst-editing-services's People

Contributors

antonbelka, bilboed, calvaris, dschleef, ensonic, freesteph, kerrickstaley, lrn, lubosz, luisbg, mathieu69, mbatle, ndufresne, nekohayo, palango, prahal, rflex, superdump, tp-m, tsaunier, vliaskov, volodymyrrudyi, ylatuya


Forkers

jojva

gst-editing-services's Issues

(critical) GES fails to locate “Quad Drone - Raiatea PK10 Ouest - French Polynesia - GOPRO FPV-Ah2L3mnH2mg.mp4” in http://jeff.ecchi.ca/public/sample-pitivi-projects/gnome-asia.tar

With GES assets, the project appears to load successfully, with all the clips/thumbnails showing up in the media library, but you’ll notice a ton of failed assertions in the terminal:

in ges-timeline-layer.c:264:new_asset_cb, Asset could not be created for uri (null), error: Resource not found
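
As a side note, GESProject exposes a hook to recover from exactly this situation while loading: it emits a "missing-uri" signal so the application can hand back a replacement URI. A rough, untested sketch (the signal signature is written from memory, so treat it as an assumption; paths are placeholders):

#!/usr/bin/python
# Untested sketch: relocating a missing asset while a project loads.
# Assumes the GESProject "missing-uri" and "loaded" signals behave as documented.
from gi.repository import GES, Gst, GLib

Gst.init(None)
GES.init()

def missing_uri_cb(project, error, wrong_asset):
    print("Could not find %s: %s" % (wrong_asset.get_id(), error))
    # Return a replacement URI, or None to leave the asset missing.
    return "file:///path/to/relocated/clip.mp4"

loop = GLib.MainLoop()

project = GES.Project.new("file:///path/to/project.xges")
project.connect("missing-uri", missing_uri_cb)
project.connect("loaded", lambda prj, tl: loop.quit())

timeline = project.extract()  # loading proceeds asynchronously into this timeline
loop.run()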

Timeline does not start when trying to encode with a video/quicktime container

This script shows that the timeline does not start for QuickTime, but works for Matroska with the same codecs.

#!/usr/bin/python
from gi.repository import GES, Gst, Gtk, GstPbutils, GObject

import signal

outputFile = 'file:///home/bmonkey/workspace/ges/export/quickTimeTest'

def handle_sigint(sig, frame):
    Gtk.main_quit()

def busMessageCb(bus, message):
    if message.type == Gst.MessageType.EOS:
        print("eos")
        Gtk.main_quit()
    elif message.type == Gst.MessageType.ERROR:
        print (message.parse_error())

def duration_querier(pipeline):
    print(pipeline.query_position(Gst.Format.TIME)[1] / Gst.SECOND)
    return True

def encoderProfile(container, video, audio):
  container_profile = GstPbutils.EncodingContainerProfile.new(
    "quicktime",
    "QuickTime Encoding Profile",
    Gst.Caps.new_empty_simple(container),
    None)

  video_profile = GstPbutils.EncodingVideoProfile.new(
    Gst.Caps.new_empty_simple(video),
    None,
    Gst.Caps.new_empty_simple("video/x-raw"),
    0)
  container_profile.add_profile(video_profile)

  audio_profile = GstPbutils.EncodingAudioProfile.new(
    Gst.Caps.new_empty_simple(audio),
    None,
    Gst.Caps.new_empty_simple("audio/x-raw"),
    0)
  container_profile.add_profile(audio_profile)

  return container_profile

if __name__ == "__main__":
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()
  layer = GES.Layer()
  timeline.add_layer(layer)
  asset = GES.Asset.request(GES.TestClip, None)

  layer.add_asset(asset, 0 * Gst.SECOND, 0, 10 * Gst.SECOND, GES.TrackType.UNKNOWN)

  timeline.commit()

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)

  # does not start, no error
  format = ["video/quicktime", "video/x-h264", "audio/x-vorbis", "mp4"]

  # Works
  #format = ["video/x-matroska", "video/x-h264", "audio/x-vorbis", "mkv"]

  container_profile = encoderProfile(format[0], format[1], format[2])

  pipeline.set_render_settings(outputFile + "." + format[3], container_profile)
  pipeline.set_mode(GES.PipelineFlags.RENDER)
  pipeline.set_state(Gst.State.PLAYING)

  bus = pipeline.get_bus()
  bus.add_signal_watch()
  bus.connect("message", busMessageCb)
  GObject.timeout_add(300, duration_querier, pipeline)

  signal.signal(signal.SIGINT, handle_sigint)
  Gtk.main()

The xptv project formatter should be updated to match the xges formatter (bug 673040). The following works with the xges formatter but is broken with the xptv one:

     * Project settings (resolution, framerate, etc.) are not loaded from the project file.
     * Project metadata (title, author, date, etc.) is not loaded.
     * Importance: these are used for the default filename when rendering, among other things.

thib: This is really simple stuff, we just need to think of a standard/good API for it (it might be GstStructure-based to be extensible/flexible). We need to figure out the proper way to do it (not super obvious, especially as we do not have any project object yet?). We cannot change the existing API.
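
For what it's worth, the xges side stores this kind of data through the GESMetaContainer interface, which both GESTimeline and GESProject implement. A rough sketch of what reading/writing such metadata looks like from Python (untested; the key names are illustrative, not canonical):

#!/usr/bin/python
# Untested sketch: attaching/reading project-level metadata through GESMetaContainer.
# The keys "author" and "render-file-name" are illustrative names only.
from gi.repository import GES, Gst

Gst.init(None)
GES.init()

timeline = GES.Timeline.new_audio_video()

# GESTimeline implements GESMetaContainer, so arbitrary typed metadata
# can be attached and then serialized by the xges formatter.
timeline.set_string("author", "someone")
timeline.set_string("render-file-name", "my-movie.webm")

print(timeline.get_string("author"))
print(timeline.metas_to_string())  # what ends up in the serialized metadatas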

Timeline does not start when trying to encode with an audio/x-aac codec

MKV with h264 and audio/x-vorbis works, but not with audio/x-aac.

#!/usr/bin/python
from gi.repository import GES, Gst, Gtk, GstPbutils, GObject

import signal

outputFile = 'file:///home/bmonkey/workspace/ges/export/aacTest'

def handle_sigint(sig, frame):
    Gtk.main_quit()

def busMessageCb(bus, message):
    if message.type == Gst.MessageType.EOS:
        print("eos")
        Gtk.main_quit()
    elif message.type == Gst.MessageType.ERROR:
        print (message.parse_error())

def duration_querier(pipeline):
    print(pipeline.query_position(Gst.Format.TIME)[1] / Gst.SECOND)
    return True

def encoderProfile(container, video, audio):
  container_profile = GstPbutils.EncodingContainerProfile.new(
    "aac",
    "AAC Encoding Profile",
    Gst.Caps.new_empty_simple(container),
    None)

  video_profile = GstPbutils.EncodingVideoProfile.new(
    Gst.Caps.new_empty_simple(video),
    None,
    Gst.Caps.new_empty_simple("video/x-raw"),
    0)
  container_profile.add_profile(video_profile)

  audio_profile = GstPbutils.EncodingAudioProfile.new(
    Gst.Caps.new_empty_simple(audio),
    None,
    Gst.Caps.new_empty_simple("audio/x-raw"),
    0)
  container_profile.add_profile(audio_profile)

  return container_profile

if __name__ == "__main__":
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()
  layer = GES.Layer()
  timeline.add_layer(layer)
  asset = GES.Asset.request(GES.TestClip, None)

  layer.add_asset(asset, 0 * Gst.SECOND, 0, 10 * Gst.SECOND, GES.TrackType.UNKNOWN)

  timeline.commit()

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)

  # does not start, no error
  format = ["video/x-matroska", "video/x-h264", "audio/x-aac", "mkv"]

  # Works
  #format = ["video/x-matroska", "video/x-h264", "audio/x-vorbis", "mkv"]

  container_profile = encoderProfile(format[0], format[1], format[2])

  pipeline.set_render_settings(outputFile + "." + format[3], container_profile)
  pipeline.set_mode(GES.PipelineFlags.RENDER)
  pipeline.set_state(Gst.State.PLAYING)

  bus = pipeline.get_bus()
  bus.add_signal_watch()
  bus.connect("message", busMessageCb)
  GObject.timeout_add(300, duration_querier, pipeline)

  signal.signal(signal.SIGINT, handle_sigint)
  Gtk.main()

Handle the fact that we can use elementname::propertyname to control TrackElement child properties

The relevant code is in ges-track-element.c:

if (!ges_track_element_lookup_child (object, property_name, &element, &pspec)) {
  GST_WARNING ("You need to provide a valid and controllable property name");
  return FALSE;
}

/* TODO : update this according to new types of bindings */
if (!g_strcmp0 (binding_type, "direct")) {
  /* First remove existing binding */
  binding =
      (GstControlBinding *) g_hash_table_lookup (priv->bindings_hashtable,
      property_name);
  if (binding) {
    GST_LOG ("Removing old binding %p for property %s", binding,
        property_name);
    gst_object_remove_control_binding (GST_OBJECT (element), binding);
  }
  binding =
      gst_direct_control_binding_new (GST_OBJECT (element), property_name,
      source);
  gst_object_add_control_binding (GST_OBJECT (element), binding);
  g_hash_table_insert (priv->bindings_hashtable, g_strdup (property_name),
      binding);
  return TRUE;
}
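
For illustration, this is roughly what the caller side looks like when attaching a control source to a child property, and what the proposed elementname::propertyname form would add (untested sketch; the property name "alpha" and the child name "framepositionner" are only examples, and only the plain-name lookup shown above exists today):

#!/usr/bin/python
# Untested sketch of the caller side for control bindings on child properties.
# "alpha" / "framepositionner" are illustrative names, not guaranteed to exist.
import gi
gi.require_version("GstController", "1.0")
from gi.repository import GES, Gst, GstController

Gst.init(None)
GES.init()

timeline = GES.Timeline.new_audio_video()
layer = GES.Layer()
timeline.add_layer(layer)
asset = GES.Asset.request(GES.TestClip, None)
clip = layer.add_asset(asset, 0, 0, 5 * Gst.SECOND, GES.TrackType.VIDEO)

# Grab the video track element of the clip (current API; takes a 'recursive' flag).
track_element = clip.get_children(False)[0]

source = GstController.InterpolationControlSource.new()
source.props.mode = GstController.InterpolationMode.LINEAR
source.set(0, 0.0)
source.set(5 * Gst.SECOND, 1.0)

# Works today: plain property name.
track_element.set_control_source(source, "alpha", "direct")
# Proposed by this issue: "elementname::propertyname" to disambiguate children.
# track_element.set_control_source(source, "framepositionner::alpha", "direct")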

GES Project's "new" method says I can't change the URI afterwards

Quoting the docs: "If you then save the project to other locations, it will never be updated again and the first valid URI is the URI it will keep referring to."

This does not make any sense. The save method should update the URI at least.

Otherwise we can't replace Pitivi's self.current.uri variable with the one from GES.
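
To make the complaint concrete, here is roughly the scenario (untested sketch using the current GES.Project API; that save() on a freshly created project behaves exactly like this is an assumption):

#!/usr/bin/python
# Untested sketch of the scenario described above.
from gi.repository import GES, Gst

Gst.init(None)
GES.init()

project = GES.Project.new(None)   # empty project, no URI yet
timeline = project.extract()      # GESProject is an asset; extract() yields its timeline

# First save: the project latches onto this URI...
project.save(timeline, "file:///tmp/first.xges", None, True)

# ...second save to another location: according to the docs quoted above,
# the project keeps referring to the first URI, which is what this issue is about.
project.save(timeline, "file:///tmp/second.xges", None, True)
print(project.props.uri)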

Review layer_scaling handling branch

I have very few comments, I have to say; the next review will probably be right before merging, so please
just fix what I commented on here :)

Thanks, nice job ;)

commit 434e19fda6bcd6d78ae3646f46d912e1d5a04b63
Author: Simon Corsin <[email protected]>

    timelineedition test: Adds a test_scaling.

        + It will check that the clip updates its size correctly.
        + Will certainly move elsewhere.

Guess what... No tabs in the commit messages!

diff --git a/tests/check/ges/timelineedition.c b/tests/check/ges/timelineedition.c
index 540f2ee..56b2ca1 100644
--- a/tests/check/ges/timelineedition.c
+++ b/tests/check/ges/timelineedition.c
@@ -1254,6 +1254,122 @@ GST_START_TEST (test_snapping_groups)

 GST_END_TEST;

+static void
+_set_track_element_width_height (GESTrackElement * trksrc, gint wvalue,
+    gint hvalue)
+{
+  GValue width = { 0 };
+  GValue height = { 0 };
+
+  g_value_init (&width, G_TYPE_INT);
+  g_value_init (&height, G_TYPE_INT);
+  g_value_set_int (&width, wvalue);
+  g_value_set_int (&height, hvalue);
+  ges_track_element_set_child_property (trksrc, "width", &width);
+  ges_track_element_set_child_property (trksrc, "height", &height);
+}

Why don't you simply use ges_track_element_set_child_properties () ??

+
+static void
+check_frame_positionner_size (GESClip * clip, gint width, gint height)
+{
+  GESTrackElement *trksrc;
+  GValue val_width = { 0 };
+  GValue val_height = { 0 };
+  gint real_width, real_height;
+
+  trksrc = GES_CONTAINER_CHILDREN (clip)->data;
+  if (!GES_IS_VIDEO_SOURCE (trksrc))
+    fail_unless (FALSE);
+
+  g_value_init (&val_width, G_TYPE_INT);
+  g_value_init (&val_height, G_TYPE_INT);
+
+  ges_track_element_get_child_property (trksrc, "width", &val_width);
+  ges_track_element_get_child_property (trksrc, "height", &val_height);

Why don't you simply use ges_track_element_get_child_properties () ??

+GST_START_TEST (test_scaling)
+{
+  GESTimeline *timeline;
+  GESLayer *layer;
+  GESAsset *asset1, *asset2;
+  GESClip *clip;
+  GESTrack *trackv = GES_TRACK (ges_video_track_new ());
+  GstCaps *caps =
+      gst_caps_new_simple ("video/x-raw", "width", G_TYPE_INT, 1200, "height",
+      G_TYPE_INT, 1000, NULL);
+
+  ges_init ();
+  timeline = ges_timeline_new ();
+  ges_timeline_add_track (timeline, trackv);
+  layer = ges_layer_new ();
+  fail_unless (ges_timeline_add_layer (timeline, layer));
+
+  g_object_set (layer, "auto-transition", TRUE, NULL);
+
+  asset1 = GES_ASSET (ges_asset_request (GES_TYPE_TEST_CLIP, NULL, NULL));
+  asset2 = GES_ASSET (ges_asset_request (GES_TYPE_TEST_CLIP, NULL, NULL));
+
+  fail_unless (asset1 != NULL && asset2 != NULL);
+
+  GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS (GST_BIN (timeline),
+      GST_DEBUG_GRAPH_SHOW_ALL, "ges-integration-timeline");

Makes no sense here.

+
+  ges_track_set_restriction_caps (trackv, caps);
+
+  GST_ERROR ("caps got set");

Nop!

+
+  clip =
+      ges_layer_add_asset (layer, GES_ASSET (asset1), 0 * GST_SECOND,
+      0 * GST_SECOND, 4 * GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
+  gst_object_unref (asset1);
+  g_object_set (clip, "vpattern", (gint) GES_VIDEO_TEST_PATTERN_SMPTE75, NULL);
+
+    /**
+   * Our timeline
+   *                    

Trailing whitespace

+   * 0--------------0
+   * | width : 1200 |
+   * | height: 1000 |
+   * 0--------------2 
+   */
+
+  /* clip takes the size set on the track as a default */
+  check_frame_positionner_size (clip, 1200, 1000);
+
+  if (GES_IS_VIDEO_SOURCE (GES_CONTAINER_CHILDREN (clip)->data))
+    _set_track_element_width_height (GES_CONTAINER_CHILDREN (clip)->data, 1024,
+        768);
+
+  check_frame_positionner_size (clip, 1024, 768);
+
+  clip =
+      ges_layer_add_asset (layer, GES_ASSET (asset2), 2 * GST_SECOND,
+      0 * GST_SECOND, 4 * GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
+  gst_object_unref (asset2);
+
+  if (GES_IS_VIDEO_SOURCE (GES_CONTAINER_CHILDREN (clip)->data))
+    _set_track_element_width_height (GES_CONTAINER_CHILDREN (clip)->data, 200,
+        170);
+
+  play_timeline (timeline, FALSE);

Remove it in your commit

commit 360be4f12c81518e972c42989e9ac9e58df7a379
Author: Simon Corsin <[email protected]>

    test-utils: Adds a utility function to quicly check the timeline.

          + It takes a dump_to_dot argument as a free bonus.

No tabs

diff --git a/tests/check/ges/test-utils.c b/tests/check/ges/test-utils.c
index 7424f54..a2eb5a8 100644
--- a/tests/check/ges/test-utils.c
+++ b/tests/check/ges/test-utils.c
@@ -222,3 +222,61 @@ check_destroyed (GObject * object_to_unref, GObject * first_object, ...)
   g_list_free (objs);

 }
+gboolean
+play_timeline (GESTimeline * timeline, gboolean dump_to_dot)
+{

Forget that, the GST_DOT_= env variable should be enough

commit d3a7b8f9c4069c376421caab8d4de7cd90a439cb
Author: Simon Corsin <[email protected]>

    ges-track: Remove the set_caps from the API, keep it internally.

         + Add restriction caps instead.
         + gnonlin sets these caps on all its children, which
           is not really convenient.

Please no tabs, go buy a .vimrc :P

diff --git a/ges/ges-audio-track.c b/ges/ges-audio-track.c
index 87a60ab..ebb38f0 100644
--- a/ges/ges-audio-track.c
+++ b/ges/ges-audio-track.c

@@ -99,8 +100,11 @@ ges_audio_track_new (void)
   GESAudioTrack *ret;
   GstCaps *caps = gst_caps_from_string (DEFAULT_CAPS);

-  ret = g_object_new (GES_TYPE_AUDIO_TRACK, "caps", caps,
-      "track-type", GES_TRACK_TYPE_AUDIO, NULL);
+  ret =
+      g_object_new (GES_TYPE_AUDIO_TRACK, "track-type", GES_TRACK_TYPE_AUDIO,
+      NULL);
+
+  ges_track_set_caps (GES_TRACK (ret), caps);

Why do you do that? We should keep the caps property, set it as it is done here,
and make it a construct only prop, removing ges_track_set_caps.

diff --git a/ges/ges-internal.h b/ges/ges-internal.h
index 5dd7cd8..ef7cdab 100644
--- a/ges/ges-internal.h
+++ b/ges/ges-internal.h
@@ -291,4 +291,6 @@ G_GNUC_INTERNAL void ges_track_element_split_bindings (GESTrackElement *element,

 G_GNUC_INTERNAL GstElement *create_bin (const gchar * bin_name, GstElement * sub_element, ...);

+G_GNUC_INTERNAL void ges_track_set_caps (GESTrack *track, const GstCaps *caps);
+

Kill it once and for all instead

diff --git a/ges/ges-track.c b/ges/ges-track.c
index e95ddf3..c306a5d 100644
--- a/ges/ges-track.c
+++ b/ges/ges-track.c
@@ -61,6 +61,7 @@ struct _GESTrackPrivate
   guint64 duration;

   GstCaps *caps;
+  GstCaps *restriction_caps;

   GstElement *composition;      /* The composition associated with this track */
   GstPad *srcpad;               /* The source GhostPad */
@@ -69,6 +70,8 @@ struct _GESTrackPrivate

   gboolean mixing;
   GstElement *mixing_operation;
+  GstElement *videoscale;

No way we add a videoscale here, nope; it is not used anyway, it seems :)

@@ -364,15 +372,15 @@ ges_track_get_property (GObject * object, guint property_id,
   GESTrack *track = GES_TRACK (object);

   switch (property_id) {
-    case ARG_CAPS:
-      gst_value_set_caps (value, track->priv->caps);
-      break;

Nope, do not remove it... as mentioned before.

   self->priv->trackelements_iter =
@@ -571,6 +583,10 @@ ges_track_init (GESTrack * self)
   self->priv->create_element_for_gaps = NULL;
   self->priv->gaps = NULL;
   self->priv->mixing = TRUE;
+  if (self->type == GES_TRACK_TYPE_VIDEO)
+    self->priv->restriction_caps = gst_caps_new_empty_simple ("video/x-raw");
+  else
+    self->priv->restriction_caps = gst_caps_new_empty_simple ("audio/x-raw");

OK, it is a construct-only property, and you should set it in the subclasses' constructors.

  * @mixing: TRUE if the track should be mixing, FALSE otherwise.
diff --git a/ges/ges-track.h b/ges/ges-track.h
index 6a19c23..6670b11 100644
--- a/ges/ges-track.h
+++ b/ges/ges-track.h
@@ -85,13 +85,13 @@ const GstCaps*     ges_track_get_caps                        (GESTrack *track);

Keep it.

commit a290bf6c892c3ce24174c8408549b37d6d99aebc
Author: Simon Corsin <[email protected]>

    framepositionner: put width and height handling there.

Should be fixed up in the

@@ -86,11 +76,15 @@ ges_video_source_create_element (GESTrackElement * trksrc)
   /* That positionner will add metadata to buffers according to its
      properties, acting like a proxy for our smart-mixer dynamic pads. */
   positionner = gst_element_factory_make ("framepositionner", "frame_tagger");
+
   videoscale =
       gst_element_factory_make ("videoscale", "track-element-videoscale");
   capsfilter =
       gst_element_factory_make ("capsfilter", "track-element-capsfilter");

+  ges_frame_positionner_set_capsfilter (GST_FRAME_POSITIONNER (positionner),
+      capsfilter);
+

This sounds weird to me tbh, we already talked about it, and it is not fundamentally wrong
as far as we talk about frame geometry, but if we start setting framerate and stuff like
that it will be very weird; it is internal anyway so I think it is fine.

diff --git a/ges/gstframepositionner.c b/ges/gstframepositionner.c
index 97496da..e7932de 100644
--- a/ges/gstframepositionner.c
+++ b/ges/gstframepositionner.c
@@ -62,6 +64,39 @@ GST_STATIC_PAD_TEMPLATE ("sink",
 G_DEFINE_TYPE (GstFramePositionner, gst_frame_positionner,
     GST_TYPE_BASE_TRANSFORM);

+void
+ges_frame_positionner_set_capsfilter (GstFramePositionner * pos,
+    GstElement * capsfilter)
+{
+  pos->capsfilter = capsfilter;
+}
+
+static void
+gst_frame_positionner_update_size (GstFramePositionner * pos)
+{
+  GstCaps *size_caps;
+
+  if (pos->capsfilter == NULL)
+    return;
+
+  if (pos->width == 0 && pos->height == 0)
+    size_caps = gst_caps_new_empty_simple ("video/x-raw");
+  else if (pos->width == 0)
+    size_caps =
+        gst_caps_new_simple ("video/x-raw", "height", G_TYPE_INT, pos->height,
+        NULL);
+  else if (pos->height == 0)
+    size_caps =
+        gst_caps_new_simple ("video/x-raw", "width", G_TYPE_INT, pos->width,
+        NULL);
+  else
+    size_caps =
+        gst_caps_new_simple ("video/x-raw", "width", G_TYPE_INT,
+        pos->width, "height", G_TYPE_INT, pos->height, NULL);

I think you should check if something changed before setting new caps.

commit c25021773741dbc274397ca9b3e098e4f995df9a
Author: Simon Corsin <[email protected]>

    Install width and height properties in ges-video-source.

Remove that commit!

commit 191d70de70fe86827565ede4508b9954558de82b
Author: Simon Corsin <[email protected]>

    videotransition: No need to hard set width and height anymore.

This is not true at that time, but yes ;)

commit d8e9ae682412a27b6b2953023095a6e6e4eaaddc
Author: Simon Corsin <[email protected]>

    integration tests: adds a test for scaling video.

Implement it right or remove it!

commit 44dabc228c7041a97dd7fd8670871a8225b55182
Author: Simon Corsin <[email protected]>

    Port all sources to inheriting from audio / video base sources.

    That was painful.

Prefix with "ges:"

diff --git a/ges/ges-source.h b/ges/ges-source.h
index 73339c2..6a39568 100644
--- a/ges/ges-source.h
+++ b/ges/ges-source.h
@@ -74,7 +74,6 @@ struct _GESSourceClass {

   /*< private >*/
   /* Padding for API extension */
-  GstElement*  (*create_source)           (GESTrackElement * object);
   gpointer _ges_reserved[GES_PADDING];
 };

I would have left it here... anyway.

Looks good :)

commit 3b8baf66fe417b3af41bf01f4cb14b5de3db2b15
Author: Simon Corsin <[email protected]>

    Implement audio and video source.

Prefix the message with "ges:"

diff --git a/ges/ges-audio-source.c b/ges/ges-audio-source.c
new file mode 100644
index 0000000..5ef2f0d
--- /dev/null
+++ b/ges/ges-audio-source.c
@@ -0,0 +1,110 @@
+#include "ges/ges-meta-container.h"
+#include "ges-track-element.h"
+#include "ges-audio-source.h"
+#include "ges-layer.h"
+#include "gstframepositionner.h"

I removed it in a commit in gh/scaling, please pick them up! :)

+/**
+ * GESAudioSourceClass:
+ * @create_source: method to return the GstElement to put in the source topbin.
+ * Other elements will be queued, like a volume.
+ * In the case of a AudioUriSource, for example, the subclass will return a decodebin,
+ * and we will append a volume.
+ */

Really nice dude :)

diff --git a/ges/ges-video-source.c b/ges/ges-video-source.c
new file mode 100644
index 0000000..0f809bb
--- /dev/null
+++ b/ges/ges-video-source.c
@@ -0,0 +1,114 @@
+/* GStreamer Editing Services
+ * Copyright (C) 2009 Edward Hervey <[email protected]>
+ *               2009 Nokia Corporation

Make it yours, same everywhere :)

commit ecd4448ffca821544352a5f2118a99ba580e300d
Author: Simon Corsin <[email protected]>

    Documentation updates.

Could you fix it up with the previous "Remove ges-uri-source" commit please :)

commit 30720eba7c045f54cc31596d09bbac26c9f2c2da
Author: Simon Corsin <[email protected]>

    Removes old uri-source

Looks OK.

commit d17561d78373b15f5331f19a0c36da66f614f39b
Author: Simon Corsin <[email protected]>

    sources: Starts adding video-uri-source et al. WIP fix up on me.

Already quickly reviewed, this is good stuff

commit 191eda6a31324ab99f47184b6a873807e14c0508
Author: Simon Corsin <[email protected]>

    test sources: Modifies the way we handle properties such as pattern.

         + This makes make check pass.
         + This standardizes property handling.

Please do not put tabs before the '+'; they should be at most 2 spaces.
Category is testsource, not 'test sources'

Otherwise it is good.

commit 75ee13b46bb39495f3c5b0c208390c6cc221487b
Author: Simon Corsin <[email protected]>

    utils: Adds the create bin method there to be used internally.

This should be called ges_source_create_topbin and go into ges-source.c; keep it internal for
now, otherwise it is good

commit ac28d7bef9c70f0b89f5519343eb88ab5ae86044
Author: Simon Corsin <[email protected]>

    ges-source: Move sub element handling to the base class.

          + And port all the subclasses.

→ Make it possible to add standard elements directly in the baseclass

Volume in the case of audio and framepositionner in the case of video, for example.

=> Looks good
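
To illustrate the "topbin" idea discussed above at the GStreamer level (this is just the shape of it in Python, not the GES C code; the create_topbin helper below mirrors the proposed ges_source_create_topbin but is hypothetical here):

#!/usr/bin/python
# Rough illustration only: a bin that wraps the subclass-provided source element
# and appends standard elements (here: volume), exposing a single src ghost pad.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def create_topbin(name, source_element, *extra_elements):
    # Hypothetical helper mirroring what ges_source_create_topbin is meant to do.
    topbin = Gst.Bin.new(name)
    topbin.add(source_element)
    last = source_element
    for element in extra_elements:
        topbin.add(element)
        last.link(element)
        last = element
    # Expose the last element's src pad as the bin's own src pad.
    ghost = Gst.GhostPad.new("src", last.get_static_pad("src"))
    topbin.add_pad(ghost)
    return topbin

src = Gst.ElementFactory.make("audiotestsrc", None)
vol = Gst.ElementFactory.make("volume", None)
bin_ = create_topbin("audio-src-bin", src, vol)
print(bin_.get_name())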

review https://github.com/pitivi/gst-editing-services/commits/integration

    commit f07bb9d178b5509f52623c59ee8953c52d2fe85e
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Fri Aug 16 20:02:00 2013 +0200

        integration: add an argument allowing to test specific effects.


You should change the names here so we have different XML files generated, and we get more info when we get the results on stdout

    +    {"test-effects", 'e', 0, G_OPTION_ARG_STRING, &test_effects,
    +        "Test all available effects", "N"},

Sounds indeed like the thing to do :)


    commit 923a04dba6831bb043454e807f2874d984da0fb1
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Fri Aug 16 20:00:30 2013 +0200

        assets: add all effects assets at initialization.

    diff --git a/ges/ges-asset.c b/ges/ges-asset.c
    index 6be3e3f..82ed0a5 100644
    --- a/ges/ges-asset.c
    +++ b/ges/ges-asset.c
    @@ -87,6 +87,8 @@

     #include <gst/gst.h>

    +#include <string.h>
    +
     enum
     {
       PROP_0,
    @@ -558,6 +560,27 @@ ges_asset_cache_put (GESAsset * asset, GSimpleAsyncResult * res)
       UNLOCK_CACHE;
     }

    +static void
    +_init_effect_assets (void)
    +{
    +  GstRegistry *registry;
    +  GList *features, *tmp;
    +
    +  registry = gst_registry_get ();
    +  features = gst_registry_get_feature_list (registry, GST_TYPE_ELEMENT_FACTORY);
    +  for (tmp = features; tmp; tmp = tmp->next) {
    +    GstElementFactory *fact = tmp->data;
    +    const gchar *klass;
    +
    +    klass = gst_element_factory_get_metadata (fact, "klass");
    +    if (strstr (klass, "Effect")) {
    +      GESAsset *asset = ges_asset_request (GES_TYPE_EFFECT,
    +          gst_plugin_feature_get_name (fact), NULL);
    +      gst_object_unref (asset);
    +    }
    +  }
    +}
    +
     void
     ges_asset_cache_init (void)
     {
    @@ -567,6 +590,7 @@ ges_asset_cache_init (void)

       _init_formatter_assets ();
       _init_standard_transition_assets ();
    +  _init_effect_assets ();
     }

Looks good, but I do not like the idea that it takes so much time (you said ~1.6 sec iirc :/)

    commit 410bcc6f96e1ef63e5bb7ba2f3716469f880d8f1
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Tue Aug 13 18:05:55 2013 +0200

        trackelement: split bindings correctly.

    diff --git a/ges/ges-clip.c b/ges/ges-clip.c
    index 602cd38..801cb83 100644
    --- a/ges/ges-clip.c
    +++ b/ges/ges-clip.c
    @@ -1206,7 +1206,6 @@ ges_clip_split (GESClip * clip, guint64 position)
     {
       GList *tmp;
       GESClip *new_object;
    -
       GstClockTime start, inpoint, duration;

       g_return_val_if_fail (GES_IS_CLIP (clip), NULL);
    @@ -1243,7 +1242,6 @@ ges_clip_split (GESClip * clip, guint64 position)
       ges_layer_add_clip (clip->priv->layer, new_object);
       ges_clip_set_moving_from_layer (new_object, FALSE);

    -  _set_duration0 (GES_TIMELINE_ELEMENT (clip), position - _START (clip));
       for (tmp = GES_CONTAINER_CHILDREN (clip); tmp; tmp = tmp->next) {
         GESTrackElement *new_trackelement, *trackelement =
             GES_TRACK_ELEMENT (tmp->data);
    @@ -1267,8 +1265,12 @@ ges_clip_split (GESClip * clip, guint64 position)
             GES_TIMELINE_ELEMENT (new_trackelement));
         ges_track_element_copy_properties (GES_TIMELINE_ELEMENT (trackelement),
             GES_TIMELINE_ELEMENT (new_trackelement));
    +
    +    ges_track_element_split_bindings (trackelement, new_trackelement, position);
       }

    +  _set_duration0 (GES_TIMELINE_ELEMENT (clip), position - _START (clip));
    +
       return new_object;
     }

    diff --git a/ges/ges-internal.h b/ges/ges-internal.h
    index 9ce44f0..61a73a9 100644
    --- a/ges/ges-internal.h
    +++ b/ges/ges-internal.h
    @@ -285,4 +285,7 @@ G_GNUC_INTERNAL guint32   _ges_track_element_get_layer_priority (GESTrackElement
     G_GNUC_INTERNAL void ges_track_element_copy_properties          (GESTimelineElement * element,
                                                                      GESTimelineElement * elementcopy);

    +G_GNUC_INTERNAL void ges_track_element_split_bindings(GESTrackElement *element,

missing a space before '('

    +                 GESTrackElement *new_element,
    +                 guint64 position);
     #endif /* __GES_INTERNAL_H__ */
    diff --git a/ges/ges-track-element.c b/ges/ges-track-element.c
    index 8e81475..8260aa1 100644
    --- a/ges/ges-track-element.c
    +++ b/ges/ges-track-element.c
    @@ -1392,6 +1392,73 @@ ges_track_element_copy_properties (GESTimelineElement * element,
       g_free (specs);
     }

    +void
    +ges_track_element_split_bindings (GESTrackElement * element,
    +    GESTrackElement * new_element, guint64 position)
    +{
    +  GParamSpec **specs;
    +  guint n, n_specs;
    +  GstControlBinding *binding;
    +  GstTimedValueControlSource *source, *new_source;
    +
    +  specs =
    +      ges_track_element_list_children_properties (GES_TRACK_ELEMENT (element),
    +      &n_specs);
    +  for (n = 0; n < n_specs; ++n) {
    +    GList *values, *tmp;
    +    GstTimedValue *last_value = NULL;
    +    gboolean past_position = FALSE;
    +    GstInterpolationMode mode;
    +
    +    binding = ges_track_element_get_control_binding (element, specs[n]->name);
    +    if (!binding)
    +      continue;
    +
    +    g_object_get (binding, "control_source", &source, NULL);
    +
    +    if (!GST_IS_TIMED_VALUE_CONTROL_SOURCE (source))
    +      continue;

What should actually be done here? :) Add a FIXME please

    commit 8276bb95ecef20d3b101b7c8affa80eccf44eb72
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Tue Aug 13 17:57:33 2013 +0200

        trackelement: update control bindings correctly.

Please be more verbose here. I can't really understand why you need to do that.

    commit 650069f17dc6e5c1b386b67d4ceea8e1ceb5e4cd
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Mon Aug 12 21:25:31 2013 +0200

        container: resort children after prepending an element.

    diff --git a/ges/ges-container.c b/ges/ges-container.c
    index b9cdb69..ecf6a78 100644
    --- a/ges/ges-container.c
    +++ b/ges/ges-container.c
    @@ -495,6 +495,8 @@ ges_container_add (GESContainer * container, GESTimelineElement * child)

       container->children = g_list_prepend (container->children, child);

    +  _ges_container_sort_children (container);
    +
       /* Listen to all property changes */
       mapping->start_notifyid =
           g_signal_connect (G_OBJECT (child), "notify::start",

OK

    commit d941885ce921c3560e5c4f45408924bc351b9e33
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Mon Aug 12 16:13:40 2013 +0200

        timeline: when there are no objects anymore, set duration to 0.

    diff --git a/ges/ges-timeline.c b/ges/ges-timeline.c
    index b8839d4..c528775 100644
    --- a/ges/ges-timeline.c
    +++ b/ges/ges-timeline.c
    @@ -602,10 +602,15 @@ timeline_update_duration (GESTimeline * timeline)
       GSequenceIter *it = g_sequence_get_end_iter (timeline->priv->starts_ends);

       it = g_sequence_iter_prev (it);
    -  if (g_sequence_iter_is_end (it))
    +
    +  if (g_sequence_iter_is_end (it)) {
    +    timeline->priv->duration = 0;
    +    g_object_notify_by_pspec (G_OBJECT (timeline), properties[PROP_DURATION]);
         return;
    +  }

       cduration = g_sequence_get (it);
    +
       if (cduration && timeline->priv->duration != *cduration) {
         GST_DEBUG ("track duration : %" GST_TIME_FORMAT " current : %"
             GST_TIME_FORMAT, GST_TIME_ARGS (*cduration),

OK

    commit 0030444aef1bdd4dde39fd75b18e40c7bab3f0d2
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Mon Aug 12 15:01:53 2013 +0200

        ges-audio-track: Actually return an AudioTrack.

We were returning an audio track, just wrongly cast to a VideoTrack

    diff --git a/ges/ges-audio-track.c b/ges/ges-audio-track.c
    index c63d6df..87a60ab 100644
    --- a/ges/ges-audio-track.c
    +++ b/ges/ges-audio-track.c
    @@ -93,10 +93,10 @@ ges_audio_track_class_init (GESAudioTrackClass * klass)
      *
      * Returns: A new #GESTrack
      */
    -GESVideoTrack *
    +GESAudioTrack *
     ges_audio_track_new (void)
     {
    -  GESVideoTrack *ret;
    +  GESAudioTrack *ret;
       GstCaps *caps = gst_caps_from_string (DEFAULT_CAPS);

       ret = g_object_new (GES_TYPE_AUDIO_TRACK, "caps", caps,
    diff --git a/ges/ges-audio-track.h b/ges/ges-audio-track.h
    index 3b920de..ab9afb4 100644
    --- a/ges/ges-audio-track.h
    +++ b/ges/ges-audio-track.h
    @@ -55,7 +55,7 @@ struct _GESAudioTrack
     };

     GType          ges_audio_track_get_type (void) G_GNUC_CONST;
    -GESVideoTrack* ges_audio_track_new (void);
    +GESAudioTrack* ges_audio_track_new (void);

Oops, funny we did not see it before

     G_END_DECLS
     #endif /* _GES_AUDIO_TRACK_H_ */

    commit 4b3e9ba7bc164682d51cc7c3b59edab5f3105d72
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Sun Aug 11 20:06:49 2013 +0200

        pipeline: add a get_mode method.

    diff --git a/ges/ges-pipeline.c b/ges/ges-pipeline.c
    index 08aa752..9000264 100644
    --- a/ges/ges-pipeline.c
    +++ b/ges/ges-pipeline.c
    @@ -131,8 +131,8 @@ _overlay_set_render_rectangle (GstVideoOverlay * overlay, gint x,
     {
       GESPipeline *pipeline = GES_TIMELINE_PIPELINE (overlay);

    -  gst_video_overlay_set_render_rectangle (GST_VIDEO_OVERLAY (pipeline->
    -          priv->playsink), x, y, width, height);
    +  gst_video_overlay_set_render_rectangle (GST_VIDEO_OVERLAY (pipeline->priv->
    +          playsink), x, y, width, height);
     }

     static void
    @@ -140,8 +140,8 @@ _overlay_set_window_handle (GstVideoOverlay * overlay, guintptr handle)
     {
       GESPipeline *pipeline = GES_TIMELINE_PIPELINE (overlay);

    -  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (pipeline->priv->
    -          playsink), handle);
    +  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (pipeline->
    +          priv->playsink), handle);
     }

     static void
    @@ -448,8 +448,8 @@ ges_pipeline_change_state (GstElement * element, GstStateChange transition)
             ret = GST_STATE_CHANGE_FAILURE;
             goto done;
           }
    -      if (self->
    -          priv->mode & (TIMELINE_MODE_RENDER | TIMELINE_MODE_SMART_RENDER))
    +      if (self->priv->
    +          mode & (TIMELINE_MODE_RENDER | TIMELINE_MODE_SMART_RENDER))
             GST_DEBUG ("rendering => Updating pipeline caps");
           if (!ges_pipeline_update_caps (self)) {
             GST_ERROR_OBJECT (element, "Error setting the caps for rendering");
    @@ -922,6 +922,18 @@ ges_pipeline_set_render_settings (GESPipeline * pipeline,
     }

     /**
    + * ges_pipeline_get_mode:
    + * @pipeline: a #GESPipeline
    + *
    + * Returns: the #GESPipelineFlags currently in use.
    + **/
    +GESPipelineFlags
    +ges_pipeline_get_mode (GESPipeline * pipeline)
    +{
    +  return pipeline->priv->mode;
    +}
    +
    +/**
      * ges_pipeline_set_mode:
      * @pipeline: a #GESPipeline
      * @mode: the #GESPipelineFlags to use
    diff --git a/ges/ges-pipeline.h b/ges/ges-pipeline.h
    index 49a68c6..2f3cca5 100644
    --- a/ges/ges-pipeline.h
    +++ b/ges/ges-pipeline.h
    @@ -88,6 +88,8 @@ gboolean ges_pipeline_set_render_settings (GESPipeline *pipeline,
     gboolean ges_pipeline_set_mode (GESPipeline *pipeline,
               GESPipelineFlags mode);

    +GESPipelineFlags ges_pipeline_get_mode (GESPipeline *pipeline);
    +
     GstSample *
     ges_pipeline_get_thumbnail(GESPipeline *self, GstCaps *caps);

Please add to docs/libs/*section.txt

    commit f1b1f97f9de7c5bea8288df95c0ee9e351757906
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Sun Aug 11 18:59:26 2013 +0200

        integration: add manipulation tests.

We should not merge that until tests pass, please provide a branch without it

    diff --git a/tests/check/ges/integration.c b/tests/check/ges/integration.c
    index 1eb33a7..a57fa7e 100644
    --- a/tests/check/ges/integration.c
    +++ b/tests/check/ges/integration.c
    @@ -811,6 +811,57 @@ test_mixing (void)

     }

    +static void
    +test_adding (void)
    +{
    +  GESTimeline *timeline;
    +  GESLayer *layer;
    +  GESUriClipAsset *asset1;
    +  GESUriClipAsset *asset2;
    +  GESPipeline *pipeline;
    +  GESClip *clip;
    +
    +  timeline = ges_timeline_new_audio_video ();
    +
    +  get_asset (testfilename1, asset1);
    +  get_asset (testfilename2, asset2);
    +  layer = ges_layer_new ();
    +  fail_unless (ges_timeline_add_layer (timeline, layer));
    +
    +  ges_layer_add_asset (layer, GES_ASSET (asset1), 2 * GST_SECOND,
    +      0 * GST_SECOND, 1 * GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
    +  gst_object_unref (asset1);
    +
    +  ges_timeline_commit (timeline);
    +
    +  pipeline = ges_pipeline_new ();
    +  ges_pipeline_add_timeline (pipeline, timeline);
    +
    +  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PAUSED);
    +  gst_element_get_state (GST_ELEMENT (pipeline), NULL, NULL, -1);
    +
    +  gst_element_seek_simple (GST_ELEMENT (pipeline),
    +      GST_FORMAT_TIME,
    +      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE, 0.5 * GST_SECOND);
    +
    +  clip = ges_layer_add_asset (layer, GES_ASSET (asset2), 0 * GST_SECOND,
    +      0 * GST_SECOND, 1 * GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
    +
    +  ges_timeline_commit (timeline);
    +  gst_element_seek_simple (GST_ELEMENT (pipeline),
    +      GST_FORMAT_TIME,
    +      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE, 0.5 * GST_SECOND);
    +
    +  ges_layer_remove_clip (layer, clip);
    +
    +  ges_timeline_commit (timeline);
    +
    +  gst_element_seek_simple (GST_ELEMENT (pipeline),
    +      GST_FORMAT_TIME,
    +      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE, 0.5 * GST_SECOND);
    +
    +  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
    +}

     #define CREATE_TEST(name, func, profile)                                       \
     GST_START_TEST (test_##name##_raw_h264_mov)                                    \
    @@ -866,10 +917,27 @@ GST_END_TEST;
     #define CREATE_PLAYBACK_TEST(name)                                             \
       CREATE_TEST_FROM_NAMES(name, _playback, PROFILE_NONE)

    +#define CREATE_MANIPULATION_TEST(name)                                             \
    +  CREATE_TEST_FROM_NAMES(name, _manipulation, PROFILE_NONE)
    +
     #define CREATE_TEST_FULL(name)                                                 \
       CREATE_PLAYBACK_TEST(name)                                                   \
       CREATE_RENDERING_TEST(name, func)

    +#define ADD_MANIPULATION_TESTS(name)                    \
    +  tcase_add_test (tc_chain, test_##name##_manipulation_vorbis_vp8_webm);           \
    +  tcase_add_test (tc_chain, test_##name##_manipulation_vorbis_theora_ogv);         \
    +  tcase_add_test (tc_chain, test_##name##_manipulation_raw_h264_mov);              \
    +  tcase_add_test (tc_chain, test_##name##_manipulation_mp3_h264_mov);              \
    +  tests_names = g_list_prepend (tests_names, g_strdup_printf ("%s%s%s", "test_", #name,     \
    +                   "_manipulation_mp3_h264_mov")); \
    +  tests_names = g_list_prepend (tests_names, g_strdup_printf ("%s%s%s", "test_", #name,     \
    +        "_manipulation_raw_h264_mov"));                                             \
    +  tests_names = g_list_prepend (tests_names, g_strdup_printf ("%s%s%s", "test_", #name,     \
    +        "_manipulation_vorbis_theora_ogv"));                                        \
    +  tests_names = g_list_prepend (tests_names, g_strdup_printf ("%s%s%s", "test_", #name,     \
    +        "_manipulation_vorbis_vp8_webm"));
    +
     #define ADD_PLAYBACK_TESTS(name)                                               \
       tcase_add_test (tc_chain, test_##name##_playback_vorbis_vp8_webm);           \
       tcase_add_test (tc_chain, test_##name##_playback_vorbis_theora_ogv);         \
    @@ -957,6 +1025,8 @@ CREATE_PLAYBACK_TEST(seeking_paused_noplay)
     CREATE_PLAYBACK_TEST(seeking_paused_audio_noplay)
     CREATE_PLAYBACK_TEST(seeking_paused_video_noplay)
     CREATE_PLAYBACK_TEST(image)
    +
    +CREATE_MANIPULATION_TEST(adding)
     /* *INDENT-ON* */

     static Suite *
    @@ -990,6 +1060,8 @@ ges_suite (void)
       ADD_PLAYBACK_TESTS (seeking_paused_audio_noplay);
       ADD_PLAYBACK_TESTS (seeking_paused_video_noplay);

    +  ADD_MANIPULATION_TESTS (adding);
    +
       /* TODO : next test case : complex timeline created from project. */
       /* TODO : deep checking of rendered clips */
       /* TODO : might be interesting to try all profiles, and maintain a list of currently working profiles ? */

    commit 417fb5e355c48ba4e471dc14f4b47ea4c640a04a
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Sun Aug 11 18:57:46 2013 +0200

        integration: NULL terminate options.

Didn't I merge that??

    diff --git a/tests/check/ges/integration.c b/tests/check/ges/integration.c
    index e5f888e..1eb33a7 100644
    --- a/tests/check/ges/integration.c
    +++ b/tests/check/ges/integration.c
    @@ -1041,7 +1041,8 @@ main (int argc, char **argv)

       GOptionEntry options[] = {
         {"list-tests", 'l', 0.0, G_OPTION_ARG_NONE, &list_tests,
    -        "List all avalaible tests", "N"}
    +        "List all avalaible tests", "N"},
    +    {NULL}
       };

       ctx = g_option_context_new ("Run integration tests");

    commit 2ae74b426d0ccacc9b38bab723f0ba5ee3e8bdf3
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Wed Aug 7 19:37:49 2013 +0200

        basexmlformatter: Only set timeline auto transitions when done loading.

    diff --git a/ges/ges-base-xml-formatter.c b/ges/ges-base-xml-formatter.c
    index 2e854c1..2ab382f 100644
    --- a/ges/ges-base-xml-formatter.c
    +++ b/ges/ges-base-xml-formatter.c
    @@ -111,6 +111,8 @@ struct _GESBaseXmlFormatterPrivate

       GESClip *current_clip;
       PendingClip *current_pending_clip;
    +
    +  gboolean timeline_auto_transition;
     };

     static void
    @@ -221,6 +223,8 @@ _load_from_uri (GESFormatter * self, GESTimeline * timeline, const gchar * uri,
     {
       GESBaseXmlFormatterPrivate *priv = _GET_PRIV (self);

    +  ges_timeline_set_auto_transition (timeline, FALSE);
    +
       priv->parsecontext =
           create_parser_context (GES_BASE_XML_FORMATTER (self), uri, error);

    @@ -356,6 +360,7 @@ ges_base_xml_formatter_init (GESBaseXmlFormatter * self)
       priv->current_track_element = NULL;
       priv->current_clip = NULL;
       priv->current_pending_clip = NULL;
    +  priv->timeline_auto_transition = FALSE;
     }

     static void
    @@ -406,8 +411,12 @@ _loading_done (GESFormatter * self)
         g_markup_parse_context_free (priv->parsecontext);
       priv->parsecontext = NULL;

    +  ges_timeline_set_auto_transition (self->timeline,
    +      priv->timeline_auto_transition);
    +
       g_hash_table_foreach (priv->layers, (GHFunc) _set_auto_transition, NULL);
       ges_project_set_loaded (self->project, self);
    +  GST_ERROR ("loading done");
     }

     static void
    @@ -808,6 +817,35 @@ ges_base_xml_formatter_add_clip (GESBaseXmlFormatter * self,
     }

     void
    +ges_base_xml_formatter_set_timeline_properties (GESBaseXmlFormatter * self,
    +    GESTimeline * timeline, const gchar * properties, const gchar * metadatas)
    +{
    +  GESBaseXmlFormatterPrivate *priv = _GET_PRIV (self);
    +  gboolean auto_transition = FALSE;
    +
    +  if (properties) {
    +    GstStructure *props = gst_structure_from_string (properties, NULL);
    +
    +    if (props) {
    +      if (gst_structure_get_boolean (props, "auto-transition",
    +              &auto_transition))
    +        gst_structure_remove_field (props, "auto-transition");
    +
    +      gst_structure_foreach (props,
    +          (GstStructureForeachFunc) set_property_foreach, timeline);
    +      gst_structure_free (props);
    +    }
    +  }
    +
    +  if (metadatas) {
    +    ges_meta_container_add_metas_from_string (GES_META_CONTAINER (timeline),
    +        metadatas);
    +  };
    +
    +  priv->timeline_auto_transition = auto_transition;
    +}
    +
    +void
     ges_base_xml_formatter_add_layer (GESBaseXmlFormatter * self,
         GType extractable_type, guint priority, GstStructure * properties,
         const gchar * metadatas, GError ** error)
    diff --git a/ges/ges-internal.h b/ges/ges-internal.h
    index c4aea8f..9ce44f0 100644
    --- a/ges/ges-internal.h
    +++ b/ges/ges-internal.h
    @@ -253,6 +253,12 @@ G_GNUC_INTERNAL gint element_start_compare                (GESTimelineElement *
     G_GNUC_INTERNAL gint element_end_compare                  (GESTimelineElement * a,
                                                                GESTimelineElement * b);

    +void
    +ges_base_xml_formatter_set_timeline_properties(GESBaseXmlFormatter * self,
    +                GESTimeline *timeline,
    +                const gchar *properties,
    +                const gchar *metadatas);
    +
     /****************************************************
      *              GESContainer                        *
      ****************************************************/
    diff --git a/ges/ges-xml-formatter.c b/ges/ges-xml-formatter.c
    index 63b3bc4..d76abed 100644
    --- a/ges/ges-xml-formatter.c
    +++ b/ges/ges-xml-formatter.c
    @@ -244,19 +244,8 @@ _parse_timeline (GMarkupParseContext * context, const gchar * element_name,
       if (timeline == NULL)
         return;

    -  if (properties) {
    -    GstStructure *props = gst_structure_from_string (properties, NULL);
    -
    -    if (props) {
    -      gst_structure_foreach (props,
    -          (GstStructureForeachFunc) set_property_foreach, timeline);
    -      gst_structure_free (props);
    -    }
    -  }
    -  if (metadatas) {
    -    ges_meta_container_add_metas_from_string (GES_META_CONTAINER (timeline),
    -        metadatas);
    -  };
    +  ges_base_xml_formatter_set_timeline_properties (GES_BASE_XML_FORMATTER (self),
    +      timeline, properties, metadatas);
     }

     static inline void

+1

    commit dd8cb5c4637b87376d3458bf0e02bf4ec28b84cd
    Author: Mathieu Duponchelle <[email protected]>
    Date:   Wed Aug 7 16:12:27 2013 +0200

        integration: make test_basic be two concatenated clips.

    diff --git a/tests/check/ges/integration.c b/tests/check/ges/integration.c
    index 3fb7344..e5f888e 100644
    --- a/tests/check/ges/integration.c
    +++ b/tests/check/ges/integration.c
    @@ -627,8 +627,10 @@ run_basic (GESTimeline * timeline)
     {
       GESLayer *layer;
       GESUriClipAsset *asset1;
    +  GESUriClipAsset *asset2;

       get_asset (testfilename1, asset1);
    +  get_asset (testfilename2, asset2);
       layer = ges_layer_new ();
       fail_unless (ges_timeline_add_layer (timeline, layer));

    @@ -637,12 +639,16 @@ run_basic (GESTimeline * timeline)
       gst_object_unref (asset1);
       /* Test most simple case */

    +  ges_layer_add_asset (layer, GES_ASSET (asset2), 1 * GST_SECOND,
    +      0 * GST_SECOND, 1 * GST_SECOND, GES_TRACK_TYPE_UNKNOWN);
    +  gst_object_unref (asset2);
    +
         /**
        * Our timeline
        *
    -   * inpoints 0--------0
    -   *          |  clip  |
    -   * time     0--------1
    +   * inpoints 0--------0 0--------0
    +   *          |  clip  | |  clip2 |
    +   * time     0------- 1 1--------2
        */

       fail_unless (check_timeline (timeline));

+1

Interlacing artifacts with certain formats

This example shows that the video output is strangely interlaced when using timeline.commit(), but only with Ogg / Theora files. Other formats work.

http://i.imgur.com/9SWl1m9.png
http://i.imgur.com/FxYmymx.png

The example below does not show the artifacts when MP4 is used or when timeline.commit() is commented out.

#!/usr/bin/python

from gi.repository import GES, Gst, GLib

# OGG URL http://ftp.nluug.nl/ftp/graphics/blender/apricot/trailer/sintel_trailer-480p.ogv
# MP4 URL http://ftp.nluug.nl/ftp/graphics/blender/apricot/trailer/sintel_trailer-480p.mp4

videoFile = "file:///home/bmonkey/workspace/ges/data/sintel_trailer-480p.ogv"

def quit(bar):
  GLib.MainLoop.quit(bar)
  exit()

def simple():
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()

  video_layer = GES.Layer()

  timeline.add_layer(video_layer)

  video_asset = GES.UriClipAsset.request_sync(videoFile)
  video_layer.add_asset(video_asset, 0, 4 * Gst.SECOND, 3 * Gst.SECOND, video_asset.get_supported_formats())

  timeline.commit()

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)
  pipeline.set_state(Gst.State.PLAYING)

  mainLoop = GLib.MainLoop.new(None, False)
  GLib.timeout_add_seconds(3, quit, mainLoop)
  mainLoop.run()

simple()

GESLayer add_asset ignores in-point

The in-point is ignored in this example

#!/usr/bin/python

from gi.repository import GES, Gst, GLib

# File URL http://download.blender.org/peach/trailer/trailer_400p.ogg

videoFile = "file:///home/bmonkey/workspace/ges/data/trailer_400p.ogg"

def quit(bar):
  GLib.MainLoop.quit(bar)
  exit()

def simple():
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()

  asset = GES.UriClipAsset.request_sync(videoFile)

  layer = GES.Layer()
  timeline.add_layer(layer)
  layer.add_asset(asset, 0, 20 * Gst.SECOND, 3 * Gst.SECOND, asset.get_supported_formats())

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)
  pipeline.set_state(Gst.State.PLAYING)

  mainLoop = GLib.MainLoop.new(None, False)
  GLib.timeout_add_seconds(3, quit, mainLoop)
  mainLoop.run()

simple()

Audiomixer: Audio layer is not played when video clip uses own audio

When the commented-out lines are used instead, the audio layer is not played. Otherwise, the video's own audio is not played.

#!/usr/bin/python

from gi.repository import GES, Gst, GLib

# File URL http://download.blender.org/peach/trailer/trailer_400p.ogg

videoFile = "file:///home/bmonkey/workspace/ges/data/trailer_400p.ogg"

audioFile = "file:///home/bmonkey/workspace/ges/data/prof.ogg"

def quit(bar):
  GLib.MainLoop.quit(bar)
  exit()

def simple():
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()

  video_layer = GES.Layer()
  audio_layer = GES.Layer()

  timeline.add_layer(video_layer)
  timeline.add_layer(audio_layer)

  video_asset = GES.UriClipAsset.request_sync(videoFile)
  video_layer.add_asset(video_asset, 0, 0, 3 * Gst.SECOND, GES.TrackType.VIDEO)
  #video_layer.add_asset(video_asset, 0, 0, 3 * Gst.SECOND, video_asset.get_supported_formats())
  #video_layer.add_asset(video_asset, 0, 0, 3 * Gst.SECOND, GES.TrackType.UNKNOWN)

  audio_asset = GES.UriClipAsset.request_sync(audioFile)
  audio_layer.add_asset(audio_asset, 0, 0, timeline.get_duration(), GES.TrackType.AUDIO)

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)
  pipeline.set_state(Gst.State.PLAYING)

  mainLoop = GLib.MainLoop.new(None, False)
  GLib.timeout_add_seconds(3, quit, mainLoop)
  mainLoop.run()

simple()

VP8 encoder negotiation problem when using different video source files

I am trying to encode a timeline with 2 clips from different video files like this.

#!/usr/bin/python

from gi.repository import GES, Gst, Gtk, GstPbutils, GObject

import signal

videoFile1 = "file:///home/bmonkey/workspace/ges/data/trailer_480p.mov"
videoFile2 = "file:///home/bmonkey/workspace/ges/data/sintel_trailer-480p.mp4"

outputFile = 'file:///home/bmonkey/workspace/ges/data/GESEncode.webm'

def handle_sigint(sig, frame):
    Gtk.main_quit()

def busMessageCb(bus, message):
    if message.type == Gst.MessageType.EOS:
        print("eos")
        Gtk.main_quit()
    elif message.type == Gst.MessageType.ERROR:
        print (message.parse_error())
    else:
        pass

def duration_querier(pipeline):
    print(pipeline.query_position(Gst.Format.TIME)[1] / Gst.SECOND)
    return True

def simple():
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()

  layer = GES.Layer()

  timeline.add_layer(layer)

  asset = GES.UriClipAsset.request_sync(videoFile1)
  asset2 = GES.UriClipAsset.request_sync(videoFile2)

  layer.add_asset(asset, 0 * Gst.SECOND, 0, 3 * Gst.SECOND, GES.TrackType.UNKNOWN)
  layer.add_asset(asset2, 3 * Gst.SECOND, 0, 3 * Gst.SECOND, GES.TrackType.UNKNOWN)

  timeline.commit()

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)

  #encoding
  container_profile = GstPbutils.EncodingContainerProfile.new(
      "pitivi-profile",
      "Pitivi encoding profile",
      Gst.Caps.new_empty_simple("video/webm"),
      None)

  video_profile = GstPbutils.EncodingVideoProfile.new(
      Gst.Caps.new_empty_simple("video/x-vp8"),
      None,
      Gst.Caps.new_empty_simple("video/x-raw"),
      0)

  container_profile.add_profile(video_profile)

  audio_profile = GstPbutils.EncodingAudioProfile.new(
      Gst.Caps.new_empty_simple("audio/x-vorbis"),
      None,
      Gst.Caps.new_empty_simple("audio/x-raw"),
      0)

  container_profile.add_profile(audio_profile)

  pipeline.set_render_settings(outputFile, container_profile)
  pipeline.set_mode(GES.PipelineFlags.RENDER)
  pipeline.set_state(Gst.State.PLAYING)

  bus = pipeline.get_bus()
  bus.add_signal_watch()
  bus.connect("message", busMessageCb)
  GObject.timeout_add(1000, duration_querier, pipeline)

  signal.signal(signal.SIGINT, handle_sigint)
  Gtk.main()

simple()

It works when I use the same video file for both clips.

asset = GES.UriClipAsset.request_sync(videoFile2)
asset2 = GES.UriClipAsset.request_sync(videoFile2)

I get the following output:

1.185333333
1.803999999
2.230666667
2.635999999
3.929333333
4.590666667
5.102666667
5.507999999
eos

But when I use different video URLs, I get an error and it only encodes the first clip.

asset = GES.UriClipAsset.request_sync(videoFile1)
asset2 = GES.UriClipAsset.request_sync(videoFile2)

I get the following output:

0.883999999
1.971999999
(GError('GStreamer error: negotiation problem.',), 'gstvideoencoder.c(1363): gst_video_encoder_chain (): /GESTimelinePipeline:gestimelinepipeline0/GstEncodeBin:internal-encodebin/GstVP8Enc:vp8enc0:\nencoder not initialized')
(GError('GStreamer encountered a general stream error.',), 'qtdemux.c(4317): gst_qtdemux_loop (): /GESTimelinePipeline:gestimelinepipeline0/GESTimeline:gestimeline0/GESVideoTrack:gesvideotrack0/GnlComposition:gnlcomposition1/GnlSource:gnlsource3/GstBin:video-src-bin/GstURIDecodeBin:uridecodebin2/GstDecodeBin:decodebin5/GstQTDemux:qtdemux4:\nstreaming stopped, reason not-negotiated')
2.739999999
2.739999999
2.739999999
2.739999999
2.739999999
2.739999999
2.739999999
2.739999999
2.739999999
2.739999999
etc.

Pipeline does not start when a PNG overlaps with video at start

The following test does not open a window and prints out the following times:

0.0
0.0
0.0
0.0

It will start under any of the following conditions:

  • The image file is not PNG but JPG
  • The image layer and video layer do not overlap in time
  • The timeline is not started by the image
  • timeline.commit() is not called
#!/usr/bin/python

from gi.repository import GES, Gst, Gtk, GstPbutils, GObject

import signal

#http://download.blender.org/peach/trailer/trailer_400p.ogg
videoFile = "file:///home/bmonkey/workspace/ges/data/trailer_400p.ogg"

#https://upload.wikimedia.org/wikipedia/commons/4/47/PNG_transparency_demonstration_1.png
imageFile = "file:///home/bmonkey/workspace/ges/data/PNG_transparency_demonstration_1.png"

def handle_sigint(sig, frame):
    Gtk.main_quit()

def busMessageCb(bus, message):
    if message.type == Gst.MessageType.EOS:
        print("eos")
        Gtk.main_quit()
    elif message.type == Gst.MessageType.ERROR:
        print (message.parse_error())
    else:
        pass

def duration_querier(pipeline):
    print(pipeline.query_position(Gst.Format.TIME)[1] / Gst.SECOND)
    return True

def simple():
  Gst.init(None)
  GES.init()

  timeline = GES.Timeline.new_audio_video()

  imagelayer = GES.Layer()
  videolayer = GES.Layer()
  timeline.add_layer(imagelayer)
  timeline.add_layer(videolayer)

  asset = GES.UriClipAsset.request_sync(videoFile)
  imageasset = GES.UriClipAsset.request_sync(imageFile)

  imagelayer.add_asset(imageasset, 0 * Gst.SECOND, 0, 1 * Gst.SECOND, GES.TrackType.UNKNOWN)
  videolayer.add_asset(asset, 0 * Gst.SECOND, 0, 10 * Gst.SECOND, GES.TrackType.UNKNOWN)

  timeline.commit()

  pipeline = GES.TimelinePipeline()
  pipeline.add_timeline(timeline)

  pipeline.set_state(Gst.State.PLAYING)

  bus = pipeline.get_bus()
  bus.add_signal_watch()
  bus.connect("message", busMessageCb)
  GObject.timeout_add(300, duration_querier, pipeline)

  signal.signal(signal.SIGINT, handle_sigint)
  Gtk.main()

simple()
