
processing-glvideo's Introduction

OpenGL video playback for Processing

This library makes use of GStreamer and OpenGL hardware (or software) acceleration to display video in Processing's P2D or P3D renderers. It should run on Linux and Raspbian.

Installing

Linux

Install the GStreamer 1.x software from your distribution's repositories. The actual packages might be named differently between distributions, but you want the equivalents of:

gstreamer1.0, gstreamer1.0-plugins-base, gstreamer1.0-plugins-good, gstreamer1.0-plugins-bad, and either gstreamer1.0-ffmpeg or gstreamer1.0-libav

Additional plugins, such as omx or vaapi, can unlock hardware-accelerated decoding.

Install the library through Processing's Contribution Manager and try out the example sketches it comes with. You might receive a warning if your version of GStreamer does not match the version the library was compiled against, which is currently 1.10. If this is the case and you experience errors or crashes, you might want to try re-building the library from source against your particular GStreamer version.

Raspbian

Do not enable the KMS OpenGL driver; stick with the legacy one. Ironically, enabling KMS disables GPU video support.

Install the library through Processing's Contribution Manager and try out the example sketches it comes with. On Raspbian, a local copy of GStreamer 1.12 is bundled together with the library.
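
A minimal playback sketch along the lines of the bundled SingleVideo example can be used to verify the installation. This is only a sketch: it assumes a movie file (launch1.mp4 here, the file used by the bundled example) sits in the sketch's data folder.

import gohai.glvideo.*;

GLMovie video;

void setup() {
  size(560, 406, P2D);  // GLVideo requires the P2D or P3D renderer
  video = new GLMovie(this, "launch1.mp4");
  video.loop();
}

void draw() {
  background(0);
  if (video.available()) {
    video.read();  // pull the newest decoded frame into the texture
  }
  image(video, 0, 0, width, height);
}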

processing-glvideo's People

Contributors

codeanticode, galnissim, gohai


processing-glvideo's Issues

no sound

On the Raspberry Pi, video plays without sound.

mov.Framerate() not working

I get the error "The global variable frameRate does not exist" after porting the Frames example from the Processing video library.

Can anyone tell me why?

Audio playback for GLMovie

In the current master we switched from playbin to uridecodebin. The code needs to be adapted to support audio playback.

SimpleCapture example failure

When running the SimpleCapture example, these errors are thrown:

MESA-LOADER: failed to retrieve device information
libGL error: MESA-LOADER: failed to retrieve device information
GLVideo requires the P2D or P3D renderer.

(java:769): GLib-CRITICAL **: g_error_free: assertion 'error != NULL' failed
  • I am running this example on a Raspberry Pi, with a fresh install of Raspbian.
  • The NoIR camera is connected properly, as running raspistill -o /home/pi/Downloads/test.jpg works properly.
  • The entry bcm2835_v4l2 is present in /etc/modules.

Are there any dependencies that could be missing or need an update in order for this to work?


Not possible to open more than 4 video files on macOS

If I try to create more than four GLMovie objects, I get the following error message. With the old video library and GStreamer 0.10 it was no problem to load multiple movies.

/Library/Caches/com.apple.xbs/Sources/AppleGVA/AppleGVA-10.1.17/Sources/Slices/Driver/AVD_loader.cpp: failed to get a service for display 4 

If you google the error, you will see that it appears widely in several applications, so it could be that this is not a problem with your library but rather with the Mac environment.

GLVideo error eventually breaking the Raspberry Pi GPIO connection with the sketch

The sketch runs great; it acts like a zoetrope. We have a bike wheel set up that sends a pulse with each revolution, and with that pulse it scrubs to the next frame of the video. The problem is that it eventually stops reading after a few minutes. Any ideas why this is happening? The video is 1.9 GB and MPEG-4.

the errors are here:

GLVideo: omx264dec-omxh264dec0: Could not configure supporting library.
Debugging information: gstomxvideodec.c(1620): gst_omx_video_dec_loop () /GstPlayBin:playbin0/GstURIDecodeBin:uridecodeBin: Unable to reconfigure output port

GLVideo: omx264dec-omxh264dec0: GStreamer encountered a general supporting library error.
Debugging information: gstomxvideodec.c(1527)
OpenMAX component in error state None (0x00000000)

and the code:

import com.pi4j.io.gpio.GpioController;
import com.pi4j.io.gpio.GpioFactory;
import com.pi4j.io.gpio.GpioPinDigitalInput;
import com.pi4j.io.gpio.PinPullResistance;
import com.pi4j.io.gpio.RaspiPin;
import gohai.glvideo.*;

GLMovie video1;
GpioController gpio;
GpioPinDigitalInput button;
float interval = 0;

void setup() {
 size(500, 500, P2D);

 gpio = GpioFactory.getInstance();
 button = gpio.provisionDigitalInputPin(RaspiPin.GPIO_02, PinPullResistance.PULL_DOWN);

 video1 = new GLMovie(this, "wheel_1.mp4");

 video1.play();
 video1.jump(0);
 video1.pause();
}

void draw() {

 if (button.isHigh()) {
   println(interval);
   clear();


   interval = interval + 15 ;
   if (video1.available()) {
     video1.read();


     float moviePosition = interval;


     video1.jump(moviePosition);
     image(video1, 0, 0, width, height);
   }

   if  (interval >= 18000) {
     video1.jump(0);
     interval = 0;
   }
 }
}

Issues looping small, short video on Raspberry Pi

Hi,

I've just installed the glvideo library on a Raspberry Pi 2 running Processing 3.3
and tried to loop a small (48x96), short (~10 second) video, but after a few loops (about 3-4)
playback slows down and eventually freezes.

I'm running the exported SingleVideo example with the small video (instead of launch1.mp4).
All I see in the output is this:

(java:4688): GStreamer-CRITICAL **: gst_tag_list_get_scope: assertion 'GST_IS_TAG_LIST (list)' failed

(java:4688): GStreamer-CRITICAL **: gst_tag_list_insert: assertion 'GST_IS_TAG_LIST (into)' failed
Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)48, height=(int)96, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, texture-target=(string)2D

No other error messages.

I've also tried tweaking the generated launch script to use more memory (-Xmx128m), but nothing changes. There are still 170 MB of memory left while the app is running (but in a frozen state).

The video memory split is currently 128MB.

Any tips on how I could get looping to work?

Thank you,
George

Loops and short video files

Hi,
first of all, thank you for your great work for the community! I just installed your new Raspbian image with Processing and tested your glvideo library on a Pi02 and Pi03, and discovered the following problems:

  1. I used very short mp4 sequences and discovered that they stop approximately a second before they actually end. When looped, the sequences pause for that time until the loop starts again. Unfortunately the sequence is never played in its full length.
  2. If I try to create an executable, Processing informs me that it failed to create the Linux 32-bit and 64-bit versions. That is nothing to wonder about, but it also fails to create the ARM version!?

Cannot link shader program

Getting this error trying to use SimpleCapture with the Raspberry Pi camera v2, so I tried the SingleVideo example and got the same. It's on Raspbian, using the built-in Processing; I've run updates, checked all GStreamer packages, etc. The driver is loaded and I have /dev/video0. I've only seen one mention of this error anywhere else (posted a couple of weeks ago), so I'm drawing a blank on where to start. There's a big list of errors thrown by PShader, PGraphics, etc. after this too.

ERROR:PREPROCESSOR-1 (vertex shader, line 1) Preprocessor syntax error : garbage at end of preprocessor directive

1 frame per second on pi 3

Running Processing 3 on a Raspberry Pi 3 (most recent update), with the Pi camera v2.1 and the default SimpleCapture sketch.

Any idea what could be causing this? Any help greatly appreciated.

captureEvent

Hi

I'm porting a sketch to the Raspberry Pi and, as far as I understand, captureEvent() (or GLcaptureEvent()) doesn't seem to work when using processing-glvideo.
Is that the case? If so, is there a workaround?

Thanks :)
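
For reference, a polling pattern consistent with the other GLVideo examples in this document is to check available() and call read() inside draw() rather than relying on an event callback. The snippet below is only a sketch, not a confirmed substitute for captureEvent(); it assumes the default-device GLCapture constructor and the play() method used by the SimpleCapture example.

import gohai.glvideo.*;

GLCapture video;

void setup() {
  size(320, 240, P2D);
  video = new GLCapture(this);  // default capture device
  video.play();
}

void draw() {
  // poll for a new frame on every draw() call instead of waiting for captureEvent()
  if (video.available()) {
    video.read();
  }
  image(video, 0, 0, width, height);
}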

MacOS error on loading video

With version 1.1.2 your SingleVideo example is not running. There is a native error:

(<unknown>:51500): GLib-GObject-WARNING **: cannot register existing type 'gchar'
**
GLib-GObject:ERROR:gvaluetypes.c:457:_g_value_types_init: assertion failed: (type == G_TYPE_CHAR)
Could not run the sketch (Target VM failed to initialize).
For more information, read revisions.txt and Help → Troubleshooting.

I'm running macOS Sierra (10.12.5 (16F73)) on a 2016 MacBook Pro.

Video Playback on raspberry pi 3 - libEGL warning: DRI2: failed to authenticate

Hi!

I installed the processing-glvideo library via the Processing Library Manager, but I am not able to play any video. I get the following error:

libEGL warning: DRI2: failed to authenticate
/usr/local/lib/processing-3.1.1/java/bin/java: symbol lookup error: /tmp/jogamp_0000/file_cache/jln2465173736968517657/jln7418784907119433744/natives/linux-armv6hf/libnewt.so: undefined symbol: bcm_host_init
Could not run the sketch (Target VM failed to initialize).
For more information, read revisions.txt and Help → Troubleshooting.

I've tried your examples for playing a single video or two videos...
Other sketches without the lib are running well...

I've already googled and found that others have similar problems, but I wasn't able to fix it.
It would be nice if you could explain what to do to get your nice library running!

Cheers!

Implement loadPixels()

This is not really an issue, but I think it would be very helpful if loadPixels() could be applied directly to a GLVideo.

For example:

import gohai.glvideo.*;

GLVideo video;

void setup() {
    size(320, 240, P2D);  // GLVideo requires the P2D or P3D renderer
    video = new GLVideo(this, "test.mp4");
    video.loop();
}

void draw() {
    video.loadPixels();
    color c = video.pixels[0];
    // do something with the pixel data
}

Certain video files don't work on OS X

These two video files aren't currently working on OS X using the wip2 branch. They work from the command line on OS X, as well as on the Raspberry Pi using the 1.0.2 release.

waters.zip

What might be relevant: the line /Library/Caches/com.apple.xbs/Sources/AppleGVA/AppleGVA-9.1.12/Sources/Slices/Driver/AVD_loader.cpp: failed to get a service for display 3 appears in the output only after the sketch is closed, whereas it is printed immediately when using gst-launch. (Could be related to #9)

Video offset from sketch window and forced to "width" and "height"

I was able to run this on a Pi 3, but the video is offset from the sketch window and has a size of "width" and "height" that I could not change. This is a great first step! Finally able to use Processing on the Pi with video. Can any size of video other than the sketch size() be used? How?
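
For what it's worth, the last two arguments of image() control the drawn size independently of the sketch's size(). A minimal sketch (with a placeholder movie file name) that draws the video smaller than the window:

import gohai.glvideo.*;

GLMovie video;

void setup() {
  size(640, 480, P2D);
  video = new GLMovie(this, "launch1.mp4");  // placeholder file name
  video.loop();
}

void draw() {
  background(0);
  if (video.available()) {
    video.read();
  }
  // the last two arguments to image() set the drawn size,
  // independent of the sketch's size()
  image(video, 20, 20, 160, 120);
}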

Support non-default Capture video modes eventually

Dear gohai,

your library runs just fine on my Pi 3; e.g. sketches like SimpleCapture work fine.

I know the library is at a very early stage, but is there any possibility to change and use the different Picam sensor modes? If I interpret it correctly, the width and height of the video stream are currently fixed to 320x200.

Kind regards
pikalu

Processing sketch fails quietly

The P3 sketch below performs a continuous random scratch over two videos. The two videos are actually frame-reversed versions of each other, but as long as you make them the same length (perhaps just a simple duplicate) you should be able to run it.
After as few as 700-800, or maybe some tens of thousands of, draw() calls it halts.
I am running this on an RPi 3 and have set GPU memory to 750M.
I find nothing in the logs; if I run it under processing-java: nothing.
Can you suggest why this sketch fails?

import gohai.glvideo.*;

class GLmovie_ext {
  GLMovie mov;
  float length;
};

GLmovie_ext[] m = new GLmovie_ext[2];

int now_showing;
boolean loaded = false;

void setup() { 
  //frameRate(50);
  size(500, 500, P2D);
  //fullScreen(P2D);
  //noCursor();
  m[0] = new GLmovie_ext();
  m[0].mov = new GLMovie(this, "clock_red.m4v", GLVideo.MUTE);  
  m[1] = new GLmovie_ext();
  m[1].mov = new GLMovie(this, "clock_green.m4v", GLVideo.MUTE);

  thread("loadMovies");
}

void loadMovies() {
  m[0].mov.loop();
  m[1].mov.loop();
  m[0].mov.volume(0);
  m[1].mov.volume(0);
  while (! m[0].mov.playing() || ! m[1].mov.playing()) delay(10);
  m[0].length = m[0].mov.duration();
  m[1].length = m[1].mov.duration();
  while ( m[0].length < 32 || m[1].length < 32 || abs( m[0].length - m[1].length ) > 0.1 ) {
    m[0].length = max( m[0].length, m[0].mov.duration());
    m[1].length = max( m[1].length, m[1].mov.duration());
  } 
  now_showing = 1;
  runTime = (m[0].length + m[1].length) / 2.0;
  loaded = true;
}

float cumulError = 0;
int drawCnt = 0;
float runTime;

void draw () {
  background(0);
  clear();
  if (!loaded) {
    textSize(10);
    fill(255);
    text("loading ...", width/2, height/2);
  } else {
    if (m[now_showing].mov.available()) m[now_showing].mov.read() ;   
    image( m[now_showing].mov, (width-m[now_showing].mov.width)/2, (height-m[now_showing].mov.height)/2);   
    int alternate = (now_showing+1) %2;
    if ( runTime < m[now_showing].mov.time() + m[alternate].mov.time() ) {
      float cutPoint = m[now_showing].mov.time();
      alternate = now_showing;
      now_showing = (now_showing+1) %2;
      cutPoint = random(0.99)*save+(runTime-save); // in the range from here to other's end point
      // now get alternate where we need it
      float distance = abs(cutPoint - save);
      float gap = runTime - 2*distance; // -1 .. -ve --> overshoot +ve means time in hand .. +1
      float correction = save + gap;
      while (correction > runTime) correction -= runTime;
      m[alternate].mov.jump( correction);
    }
  }
  stroke(color(#FF0000));
  fill(color(#FF0000));
  line(0, 10, width * (m[0].mov.time()/runTime), 10);
  if (now_showing == 0) {
    ellipse(0, 10, 10, 10);
  }
  stroke(color(#00FF00));
  fill(color(#00FF00));
  line(width * (1-m[1].mov.time()/runTime), 20, width, 20);
  if (now_showing == 1) {
    ellipse(width, 20, 10, 10);
  }
  drawCnt++;
  text(drawCnt, 40, 40);
  if (drawCnt % 1000 == 0) System.gc();
}

String fc(float F) {
  return nf( floor( 100.0*F/runTime));
}

movieEvent() usage?

There are issues with this example when porting it for use on the Raspberry Pi with GLVideo.
Does this have to do with movieEvent() being in the processing.video library and not in GLVideo?

/**
 * Frames 
 * by Andres Colubri. 
 * 
 * Moves through the video one frame at the time by using the
 * arrow keys. It estimates the frame counts using the framerate
 * of the movie file, so it might not be exact in some cases.
 */
 
import processing.video.*;

Movie mov;
int newFrame = 0;

void setup() {
  size(640, 360);
  background(0);
  // Load and set the video to play. Setting the video 
  // in play mode is needed so at least one frame is read
  // and we can get duration, size and other information from
  // the video stream. 
  mov = new Movie(this, "transit.mov");  
  
  // Pausing the video at the first frame. 
  mov.play();
  mov.jump(0);
  mov.pause();
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  background(0);
  image(mov, 0, 0, width, height);
  fill(255);
  text(getFrame() + " / " + (getLength() - 1), 10, 30);
}

void keyPressed() {
  if (key == CODED) {
    if (keyCode == LEFT) {
      if (0 < newFrame) newFrame--; 
    } else if (keyCode == RIGHT) {
      if (newFrame < getLength() - 1) newFrame++;
    }
  } 
  setFrame(newFrame);  
}
  
int getFrame() {    
  return ceil(mov.time() * 30) - 1;
}

void setFrame(int n) {
  mov.play();
    
  // The duration of a single frame:
  float frameDuration = 1.0 / mov.frameRate;
    
  // We move to the middle of the frame by adding 0.5:
  float where = (n + 0.5) * frameDuration; 
    
  // Taking into account border effects:
  float diff = mov.duration() - where;
  if (diff < 0) {
    where += diff - 0.25 * frameDuration;
  }
    
  mov.jump(where);
  mov.pause();  
}  

int getLength() {
  return int(mov.duration() * mov.frameRate);
}
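
One possible GLVideo adaptation is shown below. It is only a sketch, not a tested port: it polls with available()/read() instead of movieEvent(), as in the other GLVideo examples here, and it assumes the clip's nominal frame rate is known ahead of time, since GLMovie does not appear to expose a frameRate field (see the report above).

import gohai.glvideo.*;

GLMovie mov;
float movieFrameRate = 30.0;  // assumed nominal frame rate of the clip
int newFrame = 0;

void setup() {
  size(640, 360, P2D);
  mov = new GLMovie(this, "transit.mov");
  mov.play();   // play briefly so duration and size become available
  mov.jump(0);
  mov.pause();
}

void draw() {
  background(0);
  // poll instead of movieEvent()
  if (mov.available()) {
    mov.read();
  }
  image(mov, 0, 0, width, height);
  fill(255);
  text(newFrame + " / " + (getLength() - 1), 10, 30);
}

void keyPressed() {
  if (key == CODED) {
    if (keyCode == LEFT && 0 < newFrame) newFrame--;
    else if (keyCode == RIGHT && newFrame < getLength() - 1) newFrame++;
    setFrame(newFrame);
  }
}

void setFrame(int n) {
  mov.play();
  float where = (n + 0.5) / movieFrameRate;  // middle of frame n, in seconds
  mov.jump(where);
  mov.pause();
}

int getLength() {
  return int(mov.duration() * movieFrameRate);
}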

OSX: no such element factory "avfvideosrc", Could not open capture device avfvideosrc device-index=0

Hello,

First, thanks for the work on that library !

I'm trying to set up a UDP server on a Raspberry Pi that would stream video images from the RPi camera module over the network. At the moment, I have a prototype using the processing-glvideo library, which I'm not yet running on the RPi but on macOS High Sierra; it works (although there is this red FIXME error) if I run it in Eclipse:

GLVideo: Device enumeration is currently not available on macOS. The device returned is a pure guesstimate.
0:00:00.150942000  1267 0x7fc92ec0b800 FIXME                default gstutils.c:3902:gchar *gst_pad_create_stream_id_internal(GstPad *, GstElement *, const gchar *):<avfvideosrc0:src> Creating random stream-id, consider implementing a deterministic way of creating a stream-id
Final caps: video/x-raw(memory:GLMemory), width=(int)320, height=(int)240, format=(string)RGBA, framerate=(fraction)30/1, texture-target=(string)2D

When I say it works, I mean that my mac camera lights on, and I can see the webcam image in the processing sketch.

However, when I make the program into a runnable JAR and execute it from my terminal, I get the following error:

0:00:00.013288000  1343 0x7fa53c875cc0 WARN                 default gstdevicemonitor.c:372:GList *gst_device_monitor_get_devices(GstDeviceMonitor *):<devicemonitor0> No providers match the current filters
GLVideo: Device enumeration is currently not available on macOS. The device returned is a pure guesstimate.
Number of devices: 1
First device: avfvideosrc device-index=0
0:00:00.015523000  1343 0x7fa53c875cc0 WARN     GST_ELEMENT_FACTORY gstelementfactory.c:456:GstElement *gst_element_factory_make(const gchar *, const gchar *): no such element factory "avfvideosrc"!
0:00:00.015595000  1343 0x7fa53c875cc0 ERROR           GST_PIPELINE grammar.y:816:int priv_gst_parse_yyparse(void *, graph_t *): geen element "avfvideosrc"
Could not parse source element avfvideosrc device-index=0: geen element "avfvideosrc"
Exception in thread "Thread-2" java.lang.RuntimeException: Could not open capture device avfvideosrc device-index=0
	at gohai.glvideo.GLCapture.<init>(Unknown Source)
	at gohai.glvideo.GLCapture.<init>(Unknown Source)
	at NetworkMotionTracker.setup(NetworkMotionTracker.java:76)
	at processing.core.PApplet.handleDraw(PApplet.java:2374)
	at processing.opengl.PSurfaceJOGL$DrawListener.display(PSurfaceJOGL.java:731)
	at jogamp.opengl.GLDrawableHelper.displayImpl(GLDrawableHelper.java:692)
	at jogamp.opengl.GLDrawableHelper.display(GLDrawableHelper.java:674)
	at jogamp.opengl.GLAutoDrawableBase$2.run(GLAutoDrawableBase.java:443)
	at jogamp.opengl.GLDrawableHelper.invokeGLImpl(GLDrawableHelper.java:1293)
	at jogamp.opengl.GLDrawableHelper.invokeGL(GLDrawableHelper.java:1147)
	at com.jogamp.newt.opengl.GLWindow.display(GLWindow.java:759)
	at com.jogamp.opengl.util.AWTAnimatorImpl.display(AWTAnimatorImpl.java:81)
	at com.jogamp.opengl.util.AnimatorBase.display(AnimatorBase.java:452)
	at com.jogamp.opengl.util.FPSAnimator$MainTask.run(FPSAnimator.java:178)
	at java.util.TimerThread.mainLoop(Timer.java:555)
	at java.util.TimerThread.run(Timer.java:505)

Do you have any clue why there would be a difference between running it inside Eclipse and running it in the terminal? Is this error tied to GLVideo or to GStreamer? Would this error be different on another OS?

error logs : RANDR missing on display and more

Hi

I just set up a brand-new Raspberry Pi 3 with its camera module V2
on the Processing/Raspbian OS.
In the SimpleCapture example, I had to replace video.start(); with video.play();
but then it works! :D

But it gives me all this in the console:

Xlib: extension "RANDR" missing on display ":10.0".
Devices:
[0] "mmal service 16.1"
Cannot parse capability: video/x-raw, format=(string)YUY2, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)UYVY, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)I420, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)YV12, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: image/jpeg, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ 1:4:7:1 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: image/jpeg, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Configs:
Cannot parse capability: video/x-raw, format=(string)BGRx, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ 1:1:5:4 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)BGR, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ 1:1:5:4 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)RGB, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ sRGB }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)NV21, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)NV12, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-raw, format=(string)YVYU, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Cannot parse capability: video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)[ 32, 3280, 2 ], height=(int)[ 32, 2464, 2 ], pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601 }, framerate=(fraction)[ 1/1, 90/1 ]
Final caps: video/x-raw(memory:GLMemory), width=(int)320, height=(int)200, framerate=(fraction)90/1, format=(string)RGBA, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, texture-target=(string)2D

just so you know

Render GLMovie in PGraphics buffer

Hi - first of all - thanks so much for all the work you've put into this project.

A quick overview of what I'm trying to accomplish:
Combine your GLVideo library with the Keystone Library to create some simple mapping utilities for the Pi.

Currently, I'm trying to render the GLMovie to an offscreen buffer using PGraphics and then assign that to a mapping surface using the Keystone library's surface.render(buffer) function, but I'm getting a lot of errors, starting with:

GLVideo requires the P2D or P3D renderer.

(<unknown>:8891): GLib-CRITICAL **: g_error_free: assertion 'error != NULL' failed
java.lang.reflect.InvocationTargetException

The offscreen buffer is initialized with createGraphics(w, h, P2D).
This all works fine with the standard Processing video library.
I'd definitely be interested to know if there's a relatively simple solution to this issue; otherwise, I know you've included a mapping example, so if all else fails I'll just build from that.

Thanks in advance,
Teddy
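
For reference, a stripped-down version of the setup described above (only a sketch: it reproduces the reported off-screen arrangement with a placeholder file name and leaves out the Keystone surface, so it is not a confirmed fix):

import gohai.glvideo.*;

GLMovie movie;
PGraphics buffer;

void setup() {
  size(640, 480, P2D);
  movie = new GLMovie(this, "launch1.mp4");  // placeholder file name
  movie.loop();
  // off-screen buffer initialized with the P2D renderer, as described above
  buffer = createGraphics(320, 240, P2D);
}

void draw() {
  if (movie.available()) {
    movie.read();
  }
  buffer.beginDraw();
  buffer.image(movie, 0, 0, buffer.width, buffer.height);
  buffer.endDraw();
  // the buffer would normally be handed to the Keystone surface here,
  // e.g. surface.render(buffer); drawn with image() to keep the sketch self-contained
  image(buffer, 0, 0, width, height);
}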

Ubuntu 17.04 x64 libglvideo.so: undefined symbol: gst_init_check

For Ubuntu 17.04 x64, the libglvideo released from Processing was compiled against 1.8 and the repository default is 1.10, so I had to compile this locally. One thing that I noticed, though, was that I had to add the link flags at the end, so I changed line 63 in src/native/Makefile from
$(CC) $(CFLAGS) $(LDFLAGS) $^ -o $@
to
$(CC) $(CFLAGS) $^ -o $@ $(LDFLAGS)
Otherwise, the linking wouldn't be done correctly and Processing would yield:
undefined symbol: gst_init_check
Looks like some other folks had this problem with the order of the flags in the past: https://stackoverflow.com/questions/8347676/issues-linking-against-gstreamer-libraries-ubuntu-11-10

I'm not sure whether this is an issue, or whether changing the order of the flags would affect compilation on other platforms, but I'm posting it in case somebody else runs into this problem.

Error using GLVideo in Eclipse (osx): no glvideo in java.library.path

I'm using the Processing core library in Eclipse (JDK 8), imported following this tutorial:
https://processing.org/tutorials/eclipse/

Then, I downloaded the library from this Git repository and imported the following classes, collecting them into the gohai.glvideo package:

GLCapture.java
GLMovie.java
GLVideo.java
PerspectiveTransform.java
WarpPerspective.java

But I still get this issue when trying to load an .mp4 video:

java.lang.UnsatisfiedLinkError: no glvideo in java.library.path
	at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
	at java.lang.Runtime.loadLibrary0(Runtime.java:870)
	at java.lang.System.loadLibrary(System.java:1122)
	at gohai.glvideo.GLVideo.loadGStreamer(GLVideo.java:109)
	at gohai.glvideo.GLVideo.<init>(GLVideo.java:66)
	at gohai.glvideo.GLMovie.<init>(GLMovie.java:36)
	at gohai.glvideo.GLMovie.<init>(GLMovie.java:57)
	at UsingProcessing.lambda$4(UsingProcessing.java:38)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.Iterator.forEachRemaining(Iterator.java:116)
	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
	at UsingProcessing.lambda$1(UsingProcessing.java:35)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
	at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.Iterator.forEachRemaining(Iterator.java:116)
	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
	at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
	at UsingProcessing.settings(UsingProcessing.java:29)
	at processing.core.PApplet.handleSettings(PApplet.java:973)
	at processing.core.PApplet.runSketch(PApplet.java:10739)
	at processing.core.PApplet.main(PApplet.java:10501)
	at processing.core.PApplet.main(PApplet.java:10483)
	at UsingProcessing.main(UsingProcessing.java:19)

error in comment

The modules file path is not correct; it should be:
/etc/modules

OpenGL error

I'm unable to execute any glvideo examples; simple P3D examples work.
Using Processing 3.3.7. More specs:
lsb_release -a
Description: Ubuntu 17.10
Release: 17.10
Codename: artful

java -version
openjdk version "1.8.0_151"
OpenJDK Runtime Environment (build 1.8.0_151-8u151-b12-0ubuntu0.17.10.2-b12)
OpenJDK 64-Bit Server VM (build 25.151-b12, mixed mode)

inxi -F
Graphics: Card: Advanced Micro Devices [AMD/ATI] Richland [Radeon HD 8470D]
Display Server: x11 (X.Org 1.19.5 ) driver: radeon
Resolution: [email protected]
OpenGL: renderer: AMD ARUBA (DRM 2.50.0 / 4.13.0-37-generic, LLVM 5.0.0)

Error when trying to run SingleVideo.pde, for example:

The version of GStreamer on your system (1.12.3) does not match the version this library was compiled against (1.8). This is very likely to cause issues. Consider compiling this library yourself, using the sources from https://github.com/gohai/processing-glvideo.git

** (java:4215): CRITICAL **: gst_gl_handle_context_query: assertion 'display == NULL || DtsGetHWFeatures: Create File Failed
DtsGetHWFeatures: Create File Failed
Running DIL (3.22.0) Version
DtsDeviceOpen: Opening HW in mode 0
DtsDeviceOpen: Create File Failed
Final caps: video/x-raw(memory:GLMemory), width=(int)560, height=(int)406, framerate=(fraction)30000/1001, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, format=(string)RGBA, texture-target=(string)2D

GStreamer simple test is working:
gst-launch-1.0 videotestsrc ! autovideosink

How can I compile processing-glvideo against the current GStreamer? I cloned the git repository, cd'd into src/native, and ran make, but got: fatal error: gst/gl/gl.h: No such file or directory

On another machine with Ubuntu 18.04 and AMDGPU driver I get:

 java.lang.RuntimeException: Profile GL3bc is not available on X11GraphicsDevice[type .x11, connection :1, unitID 0, handle 0x7fbb24018370, owner true, ResourceToolkitLock[obj 0x3cf9c4b4, isOwner true, <38c647f3, 7dc4ce8f>[count 1, qsz 0, owner <main-FPSAWTAnimator#00-Timer0>]]], but: [GLProfile[GLES1/GLES1.hw], GLProfile[GLES2/GLES3.hw], GLProfile[GL2ES1/GLES1.hw], GLProfile[GL4ES3/GL4.hw], GLProfile[GL2ES2/GL4.hw], GLProfile[GL4/GL4.hw], GLProfile[GLES3/GLES3.hw], GLProfile[GL4/GL4.hw], GLProfile[GL3/GL4.hw], GLProfile[GL2GL3/GL4.hw]]
        at processing.opengl.PSurfaceJOGL$2.run(PSurfaceJOGL.java:410)
        at java.lang.Thread.run(Thread.java:748)

uname -a
Linux r3 4.16.0-041600rc4-generic #201803041930 SMP Mon Mar 5 00:32:34 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux

Just for reference: PixelFlow examples work, including Shadertoy; video2 beta1 is also working with all examples.

playing in reverse, controlling speed and ogg videos

Hi, I am trying to use this library to control video speed and play it in reverse under some conditions.
I always get the message "Can not set speed to -1.0" if I try to reverse.
I read that not all codecs support this, so I tried with an OGG video (*.ogv),
but I get the message
Missing decoder: Ogg (video/ogg)

Any hints on how to gain fine control of the speed and play backwards?
Also, is there any way to add the missing decoder?

TIA
sergio

No luck on OS X...

Maybe I'm doing something dumb or my setup is a little too old?
OS X 10.9.5, Processing 3.3.4, MBP 13" 2014 (Intel Iris 1536 MB)

With any example, I'm getting a -5 error (here with the SingleVideo example):
Debugging information: gsttypefindelement.c(1228): void gst_type_find_element_loop(GstPad *) (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind: streaming stopped, reason error (-5)

Memory Leak when loading new video file

Hi there,

I have this simple test case:

// put at least two mp4 files in the data/ directory of this sketch

import gohai.glvideo.*;
import java.io.File;

GLMovie movie;

String movieNames[] = {};
int currentMovieIndex;
int time;

void settings() {
	size( 640, 480, P2D );
}

void setup() {

	// read all .mp4 files from data directory
	File dir = new File(dataPath(""));
	File[] files = dir.listFiles();
	for( int i=0; i < files.length; i++ ){
		String path = files[i].getAbsolutePath();
		if( path.toLowerCase().endsWith(".mp4") ) movieNames = append( movieNames, path );
	}

	// load the first movie
	println( "load first movie" );
	movie = new GLMovie( this, movieNames[0] );
	movie.loop();

	time = millis();

}

void draw() {

	// load new movie every 5 seconds
	if( time > 0 && millis() > time + 5000 ) {

		// try to free memory; nothing of this seems to work
		g.removeCache( movie );
		movie.close();
		movie = null;
		System.gc();

		// load next movie
		++currentMovieIndex;
		if( currentMovieIndex >= movieNames.length ) currentMovieIndex = 0;
		String newMovieName = movieNames[ currentMovieIndex ];
		println( "load movie '" + newMovieName + "'" );
		movie = new GLMovie( this, newMovieName );
		movie.loop();

		time = millis();
	}

	// display movie on screen
	if( movie.available() ) movie.read();
	background(0);
	image( movie, 0, 0, width, height );

}

A video is displayed, and every 5 seconds it is replaced with a new video. I'm running this on a Mac, and in Activity Monitor I can see an increase in memory every time a new video loads. After a few minutes, the frame rate drops and video playback gets choppy.

I'm running this on Processing 3.2.2 with GL Video 1.1.

I'm not that experienced in Processing or Java, so I don't know how to debug and fix this issue. Can someone help me out here?

Fix Hang on Mac

The TwoVideos example is currently hanging on Mac, due to calling jump() before play() on one of the files. This issue only occurs on Mac, and was determined to have something to do with CFRunLoop on this platform. Waiting for progress on GStreamer bug #768630.

Using geomerative and glvideo causing unresponsive program only on fullScreen

The following code runs fine in my OS X environment (using processing.video), but when I run it on a Raspberry Pi (with glvideo) and the Pi camera, the program frequently becomes unresponsive when I try to use fullScreen(). The problem is somehow related to the rendering of the SVG file, but I can't diagnose it further. When I hardcode the screen size to almost the monitor size it works fine, so I don't think it is a CPU problem. The frame rate is only 5.

SimpleExample.zip

Something failed in the SimpleCapture example

I did as you say and added bcm2835_v4l2 to the file /etc/modules, and /dev/video0 can also be found.
But in the Processing sketch it runs without warning or error, and the whole view is black.

Question about SimpleCapture example

When running the SimpleCapture example on a Raspberry Pi, I ran into an issue where I got no error, but only a black window. I am using the Raspberry Pi with RealVNC.

I tried connecting the Pi to my television, and then I managed to see the video output properly.

I don't know/understand much about GL, but am I understanding correctly that this type of output is something that would not be available in a VNC viewer?

Camera module - enable settings

Hi,
thank you for the program. I would like to configure camera module settings on capture, e.g. set exposure/gain etc. I tried executing "raspivid -ex off" to turn off automatic exposure via a Runtime.exec() shell command from the Processing sketch, but it did not bind to the ongoing video stream. Is there any way to interface with camera settings such as gain/exposure and other raspivid-style parameters? Thank you in advance for any hints.
