
gst-rpicamsrc's Introduction

gst-rpicamsrc is a GStreamer wrapper around the raspivid/raspistill
functionality of the Raspberry Pi, providing a GStreamer
source element that captures from the RPi camera.

** rpicamsrc has moved **
GStreamer 1.18 will include rpicamsrc directly. The GStreamer git master repository already
includes it, plus fixes beyond this tree. Future development will happen directly in the
GStreamer repositories, and this tree will be archived.

Old README:

To use it, you need GStreamer 1.0. As of the last time this README
was updated, Raspbian Jessie has GStreamer 1.4.4, which is recent enough.

Install the GStreamer 1.0 dev packages, then just build and install this module.

apt-get install autoconf automake libtool pkg-config libgstreamer1.0-dev \
 libgstreamer-plugins-base1.0-dev libraspberrypi-dev

For example, again on Raspbian:

./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
make
sudo make install

Then, you can try out the camera with a pipeline like:
gst-launch-1.0 rpicamsrc bitrate=1000000 ! filesink location=test.h264
should produce a file called test.h264, while showing a preview window
on the screen.

Run gst-inspect-1.0 rpicamsrc to get an idea of the properties that have been
implemented.
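
For reference, the same capture can also be driven from Python through the GObject introspection bindings. This is only a minimal sketch, assuming python3-gi and the GStreamer 1.0 typelibs are installed; it records roughly ten seconds to the same test.h264 file:

#!/usr/bin/env python3
import time
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
# Same pipeline as the gst-launch-1.0 example above.
pipeline = Gst.parse_launch(
    'rpicamsrc bitrate=1000000 ! filesink location=test.h264')
pipeline.set_state(Gst.State.PLAYING)
time.sleep(10)                      # capture for ~10 seconds
pipeline.send_event(Gst.Event.new_eos())
# Wait for the EOS to reach the sink before shutting down.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)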

See the REQUIREMENTS file for the full list of build requirements.

gst-rpicamsrc's People

Contributors

calvaris, fthiery, hsteinhaus, iljabauer, nh2, philn, staroselskii, thaytan, tp-m, vivia


gst-rpicamsrc's Issues

bug in gst video force key unit event

When I launch the following GStreamer pipeline:
gst-launch-1.0 -v rpicamsrc bitrate=3000000 rotation=180 ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! mpegtsmux ! hlssink

hlssink will emit an upstream force key unit event. However, the event handling by rpicamsrc fails:

(gst-launch-1.0:2462): GStreamer-CRITICAL **: gst_event_get_structure: assertion 'GST_IS_EVENT (event)' failed

(gst-launch-1.0:2462): GStreamer-CRITICAL **: gst_event_get_seqnum: assertion 'GST_IS_EVENT (event)' failed

(gst-launch-1.0:2462): GStreamer-CRITICAL **: gst_mini_object_unref: assertion 'mini_object->refcount > 0' failed
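
For context, this is roughly the event that hlssink generates and that the handler in rpicamsrc trips over. A minimal sketch of requesting a keyframe from application code, assuming a source named src; it only illustrates the upstream force-key-unit event, not a fix for the crash:

import time
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstVideo', '1.0')
from gi.repository import Gst, GstVideo

Gst.init(None)
pipeline = Gst.parse_launch(
    'rpicamsrc name=src bitrate=3000000 ! '
    'video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! fakesink')
pipeline.set_state(Gst.State.PLAYING)
time.sleep(2)

# Build the same kind of upstream force-key-unit event that hlssink sends:
# running-time unset, all_headers=True, count=0.
event = GstVideo.video_event_new_upstream_force_key_unit(
    Gst.CLOCK_TIME_NONE, True, 0)
pipeline.get_by_name('src').send_event(event)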

setting properties at runtime

I've tested some code which works with v4l2src, but it doesn't work with gst-rpicamsrc. Can anybody confirm whether this is an issue or...
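
For context, this is the kind of change being attempted; a minimal sketch, with the element name, property names and values chosen only for illustration. With v4l2src the equivalent set_property calls take effect while the pipeline is PLAYING; whether rpicamsrc honours them at runtime is what this issue asks about:

import time
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.parse_launch('rpicamsrc name=src preview=false ! fakesink')
pipeline.set_state(Gst.State.PLAYING)
time.sleep(2)

# Change properties on the running source element.
src = pipeline.get_by_name('src')
src.set_property('brightness', 60)
src.set_property('rotation', 180)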

failed to load

Hi,

I'm trying to run it on a uClibc-based system. It compiles fine, but when trying to load it, it reports an error:

# gst-inspect-1.0 ./libgstrpicamsrc.so

gst-inspect-1.0: symbol 'vcos_pthreads_logging_assert': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_log_register': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_abort': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_generic_mem_free': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_thread_join': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_global_unlock': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_generic_mem_alloc': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_thread_create': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_log_impl': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_pthreads_map_errno': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_thread_attr_init': can't resolve symbol

gst-inspect-1.0: symbol 'vcos_global_lock': can't resolve symbol

(gst-inspect-1.0:2689): GStreamer-WARNING **: Failed to load plugin './libgstrpicamsrc.so': unknown dl-error
Could not load plugin file: Opening module failed: unknown dl-error

Headers not installed when `make install`

The INSTALL readme says that the headers should be found in /usr/local/include but after cloning and running configure & sudo make install I cannot find any headers to develop against.

./autogen.sh stops with an error

Hello.
I would like to compile gst-rpicamsrc on RPi 3 Jessie.
I followed the instructions, but for a reason I can't figure out, I end up with an error when executing: ./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
Returns:
autoreconf: Entering directory `.'
autoreconf: configure.ac: not using Gettext
autoreconf: running: aclocal --force
autom4te: cannot create autom4te.cache: No such file or directory
aclocal: error: echo failed with exit status: 1
autoreconf: aclocal failed with exit status: 1
autogen.sh failed

And I'm kind of lost now, so can someone tell me what went wrong and how to fix it?
GStreamer is already installed with Jessie and working, and the compiler ships with Jessie.
Thank you for the help in advance.

Capturing in 1280x720 doesn't use the entire FoV

Hi. I've just switched from recording at ~500x500 to recording at 1280x720 and cropping, but it seems like it's not using the entire FoV as it should according to the documentation.

Is this known? Is there a workaround?

Changing bitrate while streaming

Hi,

Is there any chance of implementing dynamic bitrate, as x264enc supports? After setting the pipeline to the PLAYING state, even if I change the bitrate property of the rpicamsrc element, it doesn't change its output bitrate. I'm coding in C.

Thank you.

AWB settings are ignored

Setting awb-mode does not have any effect on the image. Setting awb-mode=off and manually setting awb-gain-blue=4.2 awb-gain-red=1.0 doesn't have any effect either; the image is always slightly too red.

Success Streaming to Client Machine & Commentary After Testing

Ok, I successfully got everything up and running and tested the test.h264 file. It was too much for my Raspberry Pi Model B to play using VLC, so I transferred it to my main computer, and the video in the file was quite nice, minimal artifacts and all.

My next step was to try streaming from the Raspberry Pi to a client machine. I looked at other gstreamer1.0 posts about streaming and created this command line for my Raspberry Pi (please be aware that I may have made typos, I'm typing, not cutting and pasting):

sudo gst-launch-1.0 -v rpicamsrc bitrate=1000000 ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=192.168.0.202 port=5000

192.168.0.202 is the fixed IP address of the Raspberry Pi.

On the client machine I ran this command line:

gst-launch-1.0 -v tcpclientsrc host=192.168.0.202 port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! autovideosink sync=false

This results in getting a gst-launch-1.0 video display window on the client. At first I was only getting a gray, artifact-only display and I was going to write that, but suddenly, while I was typing this, up popped a very nice stream from the Raspberry Pi, the best I've seen yet. With good lighting it's very nice -- better than probably 80% of the cameras I've worked with in a professional setting (mostly decent quality Panasonic analog cameras in the sub-$5000 range, as well as lots of Axis cameras and USB 2.0 cams). And that's without having tweaked anything beyond what you see here.

This is the best streaming result I've had yet with the Raspberry Pi. My Pi is set up as a DHCP server and the client is a Lubuntu 13.04(?) machine. It took a while yesterday to make sure I had all the libraries (various packages like gst-libav or gstreamer1.0-libav, and to sudo apt-get update/upgrade).

Some other (but not all) packages I installed in the process:

libgstreamer1.0-0-dbg
gstreamer1.0-tools
libgstreamer-plugins-base1.0-0
gstreamer1.0-plugins-good
gstreamer1.0-plugins-bad-dbg
gstreamer1.0-alsa
restricted extras

I tried to go through the same process on a Lubuntu 12.10 client machine (that receives the stream from the Raspberry Pi) and couldn't get it to work, but it worked after I upgraded the distribution on the machine from 12.10 to 13.04 (using the process at: http://lawrit.lawr.ucdavis.edu/it-help-center/how-to/upgrading-ubuntu-via-command-line)

Thanks again, Jan!

requesting an intra frame while recording

Hello thaytan,

As an alternative, I am trying to wrap the Python picamera module with a GStreamer pipeline using the appsrc element.

In the picamera module, I can redirect the stream to a custom output that must have a "write" method.

class CamStreamer(object):
    def __init__(self,videoFormat,resolution,framerate):
        self.size = 0
        self.capstring = 'video/x-'+videoFormat+',width='+str(resolution[0]) \
                         +',height='+str(resolution[1])+',framerate=' \
                         +str(framerate)+'/1'
        self.playing = False
        self.paused = False
        self.make_pipeline()
        self.play_pipeline()

    def make_pipeline(self):
        CLI = [
                  'appsrc name="source" ! ',
                  'h264parse ! video/x-h264,stream-format=avc ! ',
                  'h264parse ! video/x-h264,stream-format=byte-stream ! ',
                  'mpegtsmux ! filesink name="sink"'
                  ]
        gcmd = "".join(CLI)
        self.pipeline = Gst.parse_launch(gcmd)
        self.filesink = self.pipeline.get_by_name("sink")
        self.filesink.set_property("location","custom.ts")
        self.appsrc = self.pipeline.get_by_name("source")
        self.appsrc.set_property("is-live",True)
        self.gstcaps = Gst.Caps.from_string(self.capstring)
        self.appsrc.set_property("caps",self.gstcaps)

    def play_pipeline(self):
        self.pipeline.set_state(Gst.State.PLAYING)

    def stop_pipeline(self):
        self.pipeline.set_state(Gst.State.READY)

    def write(self,s):
        gstbuff = Gst.Buffer.new_wrapped(s)
        ret = self.appsrc.emit("push-buffer",gstbuff)

    def flush(self):
        self.stop_pipeline()

and then I use the camera object as below:

class RPiCam(object):
    def __init__(self):
        self.resolution = (640,480)
        self.bitrate = 1000000
        self.format = "h264"
        self.framerate = 30
        self.rotation = 180

        self.camera = picamera.PiCamera()
        self.camera.resolution = self.resolution
        self.camera.framerate = self.framerate

    def startCam(self):
        self.camera.start_recording(CamStreamer(self.format,self.resolution,
                                                     self.framerate),format=self.format, bitrate=self.bitrate)

    def recordCam(self,duration):
        self.camera.wait_recording(duration)

    def stopCam(self):
        self.camera.stop_recording()

    def closeCam(self):
        self.camera.close()

I wish to request an I-frame while the recording is in progress. I see the following lines of code in RaspiCapture.c which give me some hints on how to request a key frame in response to an external event. Can you help me figure out how to implement this using the Python picamera module?

gboolean raspi_capture_request_i_frame(RASPIVID_STATE *state)
{
   MMAL_PORT_T *encoder_output = NULL;
   MMAL_STATUS_T status;
   MMAL_PARAMETER_BOOLEAN_T param = {{  MMAL_PARAMETER_VIDEO_REQUEST_I_FRAME, sizeof(param)}, 1};
   encoder_output = state->encoder_component->output[0];
   status = mmal_port_parameter_set(encoder_output, &param.hdr);
   if (status != MMAL_SUCCESS)
   {
      vcos_log_error("Unable to request I-frame");
      return FALSE;
   }
   return TRUE;
}

Thanks!
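
For what it's worth, newer picamera releases expose this directly: assuming picamera 1.11 or later, PiCamera.request_key_frame() issues the same MMAL I-frame request as raspi_capture_request_i_frame() above. A minimal sketch (recording to a plain file rather than the CamStreamer output):

import picamera

# Sketch, assuming picamera >= 1.11, which provides request_key_frame().
camera = picamera.PiCamera(resolution=(640, 480), framerate=30)
camera.start_recording('custom.h264', format='h264', bitrate=1000000)
try:
    for _ in range(10):
        camera.wait_recording(2)
        # Ask the hardware encoder for an I-frame mid-recording,
        # the equivalent of raspi_capture_request_i_frame() above.
        camera.request_key_frame()
finally:
    camera.stop_recording()
    camera.close()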

Some missing properties

Works fine, thank you.

When I last used the camera I used v4l2src with the "collabora" v4l2 kernel driver. Now that stopped working (I get oops in dmesg and it got annoying) so I tried your approach and it does the job quite fine.

  1. The auto white balance of the camera is very bad. It wobbles around all the time like being high on LSD.

    They realized this and now offer fixing the AWB gains with --awbg <red>,<blue>

  2. Another thing, this guy implemented "Periodic Intra Refresh". I’m hoping this might get latency just a little lower.

  3. And finally, they have an explicit --mode switch for sensor operation modes. Modes are otherwise switched implicitly based on width and height thresholds; an explicit selection would be good.

Could these three things be added?

And if you don't mind, some questions: Are the timestamps the same as those the V4L2 driver delivers? Any hint what would be the best way to stream low-latency video to Android? Via WebRTC maybe?

autogen.sh fails on updated RPI

1366 ./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
1367 README
1368 more README
1369 more REQUIREMENTS
1370 sudo apt-get install autoreconf
1371 sudo apt-get install autoconf
1372 sudo apt-get install automake
1373 sudo apt-get install libtool
1374 sudo apt-get install libgstreamer1.0-dev
1375 sudo apt-get install libgstreamer-plugins-base1.0-dev
1376 more REQUIREMENTS
1377 sudo apt-get install libraspberrypi-dev
1378 nano configure.ac
1379 history
pi@picam001-f ~/gst-rpicamsrc $ !1366
./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
autoreconf: Entering directory `.'
autoreconf: configure.ac: not using Gettext
autoreconf: running: aclocal --force
autoreconf: configure.ac: tracing
autoreconf: configure.ac: not using Libtool
autoreconf: running: /usr/bin/autoconf --force
configure.ac:23: error: possibly undefined macro: AC_SUBST
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
configure.ac:34: error: possibly undefined macro: AC_MSG_ERROR
autoreconf: /usr/bin/autoconf failed with exit status: 1
autogen.sh failed

How to resize video

Hello, I'm not sure if this is the right place to ask my question, but using "gst-inspect-1.0 rpicamsrc" I didn't find a way to change the size of the video, as I did before using rpicamsrc: "raspivid -t 999999 -h 240 -w 320 -fps 25 -b 2000000 -o -".
So, I'd like to know if it's possible using rpicamsrc?
Thanks
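
rpicamsrc has no width/height properties of its own; the output size is negotiated through the caps downstream of the element. A minimal sketch of the equivalent of the raspivid command above, with the caps string doing the resizing (values copied from that command):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
# The capsfilter after rpicamsrc selects the frame size and rate,
# like "-w 320 -h 240 -fps 25" did for raspivid.
pipeline = Gst.parse_launch(
    'rpicamsrc bitrate=2000000 ! '
    'video/x-h264,width=320,height=240,framerate=25/1 ! '
    'h264parse ! filesink location=test.h264')
pipeline.set_state(Gst.State.PLAYING)

The same caps work on a gst-launch-1.0 command line, placed between rpicamsrc and the next element.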

Issues Compiling and Running

When I ran "make", I got error:

gstrpicam-enum-types.c:254:1: fatal error: opening dependency file .deps/libstrpicamsrc_la-gstrpicam-enum-types.Tpo: Permission denied

So, I ran "sudo make" and it seemed to run ok.

At the end of the process, I ran:

gst-launch-1.0 rpicamsrc bitrate=1000000 ! filesink location=test.h264

And got error:

ERROR: pipeline could not be constructed: no element "rpicamsrc"

Running gst-inspect-1.0 shows 161 plugins, 578 features.

Thanks for any feedback.

Implementing video/x-raw support?

Hi,
I'm a noob to the rpi + gstreamer and I had some questions:
Is there any plan to implement video/x-raw caps support in rpicamsrc?
I'm building a project where I need to extract several ROIs (up to 12 or so) from a single frame.
Since video/x-raw is needed, in your opinion, does it really matter performance-wise whether v4l2src or rpicamsrc is used?

Thank you,
Ben

Issues running MAKE command

Hey there!

First off, thanks so much for your work on this... I can't wait to get it fully functional. This will be the final step in my project.

Although I am having a few issues...

First off... I am using a fresh install of the newest Raspbian OS.

After figuring out and installing all of the individual dependencies of this project, I finally got autogen.sh to run. **Needed to install AUTOMAKE and LIBTOOL and a couple of others,
as well as finally getting the proper GStreamer developer version installed.

I finally got autogen.sh to complete successfully and got the echo telling me to now run the make command.

But when running the "make" command I get some errors...
The folder was cloned to /,
so the directory path was
/gst-rpicamsrc

First off, I was getting an error that no such file or directory existed:

gst/gst.h

Issue finding gst.h

root@raspberrypi /gst-rpicamsrc $ make
make all-recursive
make[1]: Entering directory '/gst-rpicamsrc'
Making all in src
make[2]: Entering directory '/gst-rpicamsrc/src'
  CC       libgstrpicamsrc_la-gstrpicamsrc.lo
gstrpicamsrc.c:61:21: fatal error: gst/gst.h: No such file or directory
compilation terminated.
make[2]: *** [libgstrpicamsrc_la-gstrpicamsrc.lo] Error 1
make[2]: Leaving directory '/gst-rpicamsrc/src'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory '/gst-rpicamsrc'
make: *** [all] Error 2


So I moved the folder called "gst" that I found in
/usr/include/gstreamer-1.0
into the /gst-rpicamsrc/src folder, and finally bypassed that issue.

But then, when make ran, it couldn't find /gst-rpicamsrc/src/gstrpicam-enum-types.h

Issues finding gstrpicam-enum-types.h

root@raspberrypi:/gst-rpicamsrc# make
make all-recursive
make[1]: Entering directory '/gst-rpicamsrc'
Making all in src
make[2]: Entering directory '/gst-rpicamsrc/src'
  CC       libgstrpicamsrc_la-gstrpicamsrc.lo
gstrpicamsrc.c:66:34: fatal error: gstrpicam-enum-types.h: No such file or directory
compilation terminated.
make[2]: *** [libgstrpicamsrc_la-gstrpicamsrc.lo] Error 1
make[2]: Leaving directory '/gst-rpicamsrc/src'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory '/gst-rpicamsrc'
make: *** [all] Error 2


--- I ASSUMED WRONG ---
I assumed that the files
gstrpicam-enums-templates.h
and the matching .c
were the files it was looking for, maybe got renamed...
I changed their names, but now I'm getting some more errors when running make... so I might have been wrong on renaming the last ones...

--EDIT---
After reading more into the code, I see that these two files are generated by /gst-rpicamsrc/src/Makefile.am,
but they are also included in the
/gst-rpicamsrc/.gitignore file...
Not sure why they were added to it recently, but they are not being built properly and added to the /src folder.

Any help would be great! I really would like to get this program functional!

I am a bit of a newbie to the whole *nix family of operating systems, including Raspbian... so hopefully I am just missing something minor...

Thanks again!!

invalid property id warnings for preview-* properties

  preview-x           : Start X coordinate of the preview window (in pixels)
** (gst-inspect-1.0:21092): WARNING **: gstrpicamsrc.c:1037: invalid property id 7 for "preview-x" of type 'GParamInt' in 'GstRpiCamSrc'
                        flags: readable, writable
                        Integer. Range: 0 - 2048 Default: 0 
  preview-y           : Start Y coordinate of the preview window (in pixels)
** (gst-inspect-1.0:21092): WARNING **: gstrpicamsrc.c:1037: invalid property id 8 for "preview-y" of type 'GParamInt' in 'GstRpiCamSrc'
                        flags: readable, writable
                        Integer. Range: 0 - 2048 Default: 0 
  preview-w           : Width of the preview window (in pixels)
** (gst-inspect-1.0:21092): WARNING **: gstrpicamsrc.c:1037: invalid property id 9 for "preview-w" of type 'GParamInt' in 'GstRpiCamSrc'
                        flags: readable, writable
                        Integer. Range: 0 - 2048 Default: 0 
  preview-h           : Height of the preview window (in pixels)
** (gst-inspect-1.0:21092): WARNING **: gstrpicamsrc.c:1037: invalid property id 10 for "preview-h" of type 'GParamInt' in 'GstRpiCamSrc'

Package doesn't build on Raspbian 8

I installed the deps specified in REQUIREMENTS; dpkg-builddeps returns nothing (OK). Fresh Raspbian 8 install.

pi@raspberrypi:~/gst/gst-rpicamsrc $ sudo apt-get install autoconf automake libtool pkg-config libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libraspberrypi-dev
pi@raspberrypi:~/gst/gst-rpicamsrc $ dpkg-buildpackage 
dpkg-buildpackage: source package gstreamer1.0-rpicamsrc
dpkg-buildpackage: source version 1.0.0
dpkg-buildpackage: source distribution wheezy
dpkg-buildpackage: source changed by hornetwl <[email protected]>
dpkg-buildpackage: host architecture armhf
 dpkg-source --before-build gst-rpicamsrc
 fakeroot debian/rules clean
dh clean  --with autotools-dev
   dh_testdir
   dh_auto_clean
   dh_autotools-dev_restoreconfig
   dh_clean
 dpkg-source -b gst-rpicamsrc
dpkg-source: info: using source format `3.0 (native)'
dpkg-source: info: building gstreamer1.0-rpicamsrc in gstreamer1.0-rpicamsrc_1.0.0.tar.xz
dpkg-source: info: building gstreamer1.0-rpicamsrc in gstreamer1.0-rpicamsrc_1.0.0.dsc
 debian/rules build
dh build  --with autotools-dev
   dh_testdir
   dh_autotools-dev_updateconfig
   dh_auto_configure
   dh_auto_build
   dh_auto_test
 fakeroot debian/rules binary
dh binary  --with autotools-dev
   dh_testroot
   dh_prep
   debian/rules override_dh_auto_install
make[1]: Entering directory '/home/pi/gst/gst-rpicamsrc'
dh_auto_install
find /home/pi/gst/gst-rpicamsrc/debian/gstreamer1.0-rpicamsrc -name '*.la' | xargs rm --verbose
rm: missing operand
Try 'rm --help' for more information.
debian/rules:28: recipe for target 'override_dh_auto_install' failed
make[1]: *** [override_dh_auto_install] Error 123
make[1]: Leaving directory '/home/pi/gst/gst-rpicamsrc'
debian/rules:25: recipe for target 'binary' failed
make: *** [binary] Error 2
dpkg-buildpackage: error: fakeroot debian/rules binary gave error exit status 2

Audio

Thaytan,
Would it be possible to expand this wrapper to pull in audio from a USB microphone source on the RPi?
Thanks!

Could you clarify whether this used HW encoding?

Sorry for the very basic question, but I'm not particularly familiar with video encoding in general. Does this module make use of the hardware video encoding on the Pi or does it do a manual encode?

Thanks a lot for this great module :)

Property `intra-refresh-type` set to `cyclic` or `both`

I’m kinda sure this worked before. But now when I set intra-refresh-type=cyclic or intra-refresh-type=both it bombs out.

mmal: mmal_vc_port_enable: failed to enable port vc.ril.video_encode:in:0(OPQV): EINVAL
mmal: mmal_port_enable: failed to enable connected port (vc.ril.video_encode:in:0(OPQV))0x72104240 (EINVAL)
mmal: mmal_connection_enable: output port couldn't be enabled
ERROR: from element /GstPipeline:pipeline0/GstRpiCamSrc:rpicamsrc0: Internal data flow error.

failing launch command

Hello all, I've been working my way through this project and I feel like I'm so close to getting it to work! I've followed all the instructions so far, but when I run:
gst-launch-1.0 rpicamsrc bitrate=1000000 ! filesink location=test.h264

I get:
Setting pipeline to PAUSED ...
In src_start()
mmal: mmal_vc_component_create: failed to create component 'vc.ril.camera' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.camera' (1)
mmal: Failed to create camera component
mmal: raspi_capture_setup: Failed to create camera component
ERROR: Pipeline doesn't want to pause.
Setting pipeline to NULL ...
Freeing pipeline ...

(all run as root, to avoid typing sudo constantly)
Thanks for any input on something I may have missed!

Info: This is running on Raspbian and I'm currently SSHing into the Pi.

Question about feasibility of an idea, re: motion detection

One other project I’m following on Github is https://github.com/dickontoo/omxmotion.

What it is doing is re-implementing the camera interface with OMX, not MMAL. The key point is that it accesses the motion vectors exported by the hardware H.264 encoder to detect motion. If enough vectors sum up to a threshold then file writing starts. In parallel, a continuous multicast network stream is sent out.

Now this works as a proof of concept, but I was wondering if it wouldn't be easier to write a GStreamer module that acts as a "frame valve". It would receive video frames and motion vector data from
rpicamsrc, analyze this data and decide whether to pass video on to its sink, like this:

[rpicamsrc]  → [stuff] → [network streaming]
             ↓
             [omxmotion as frame "valve"] → [movmux] →[multifilesink]

omxmotion uses ffmpeg to implement all the other boxes, and that seems quite hard.

So my idea is a plug-in motion detection module. Would that be feasible in the GStreamer architecture?

autogen failing

Hey everyone, any idea why I keep getting this on autogen?

pi@raspberrypi ~/github/gst-rpicamsrc $ ./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
autoreconf: Entering directory `.'
autoreconf: configure.ac: not using Gettext
autoreconf: running: aclocal --force
autoreconf: configure.ac: tracing
autoreconf: configure.ac: not using Libtool
autoreconf: running: /usr/bin/autoconf --force
autoreconf: running: /usr/bin/autoheader --force
autoreconf: running: automake --add-missing --copy --force-missing
src/Makefile.am:1: Libtool library used but 'LIBTOOL' is undefined
src/Makefile.am:1: The usual way to define 'LIBTOOL' is to add 'LT_INIT'
src/Makefile.am:1: to 'configure.ac' and run 'aclocal' and 'autoconf' again.
src/Makefile.am:1: If 'LT_INIT' is in 'configure.ac', make sure
src/Makefile.am:1: its definition is in aclocal's search path.
autoreconf: automake failed with exit status: 1
autogen.sh failed

Package does not contain libs if local compilation has not been done

After patching gst-rpicamsrc/debian/rules with || true so that the find | xargs doesn't fail, the following produces an empty package:

cd gst-rpicamsrc
dpkg-buildpackage
cd ../
sudo dpkg -i gstreamer1.0-rpicamsrc_1.0.0_armhf.deb

The package didn't install the libs at all:

dpkg -L  gstreamer1.0-rpicamsrc
/.
/usr
/usr/share
/usr/share/doc
/usr/share/doc/gstreamer1.0-rpicamsrc
/usr/share/doc/gstreamer1.0-rpicamsrc/copyright
/usr/share/doc/gstreamer1.0-rpicamsrc/changelog.gz

However, if I do:

cd gst-rpicamsrc
./autogen.sh
make
dpkg-buildpackage
cd ../
sudo dpkg -i gstreamer1.0-rpicamsrc_1.0.0_armhf.deb

This time, the package does contain the libs!

$ dpkg -L gstreamer1.0-rpicamsrc
/.
/usr
/usr/lib
/usr/lib/arm-linux-gnueabihf
/usr/lib/arm-linux-gnueabihf/gstreamer-1.0
/usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstrpicamsrc.so
/usr/share
/usr/share/doc
/usr/share/doc/gstreamer1.0-rpicamsrc
/usr/share/doc/gstreamer1.0-rpicamsrc/copyright
/usr/share/doc/gstreamer1.0-rpicamsrc/changelog.gz

Shouldn't dpkg-buildpackage compile on its own?

"time stamp together as an annotation"- want to show format correctly

Hi, thanks for your useful tools.

I'm trying to overlay a timestamp following the guidance in "#45".
I could insert the time & date on the video, but it appears like "22???48???22 2016???06???15".
I want to correct that to look like "22:48:42 2016/06/15".
Please give me some advice.

Here is the command I am using now:
gst-launch-1.0 -e mp4mux name="muxer" ! filesink location=/home/pi/Videos/$(date +%Y%m%d_%H%M%S).mp4 -v rpicamsrc annotation-mode=date+time bitrate=1000000 ! video/x-h264,width=1280,height=720,framerate=30/1,profile=high ! h264parse ! muxer. alsasrc device=hw:1 ! audioconvert ! lamemp3enc target=1 bitrate=128 cbr=true ! muxer.
Thank you.

annotation-text-colour Type Mismatch

These options are described as follows, but the information contradicts itself:

annotation-text-colour: Set the annotation text colour, as a VUY hex value eg #8080FF, -1 for default
flags: readable, writable
Integer. Range: -1 - 2147483647 Default: -1

annotation-text-bg-colour: Set the annotation text background colour, as a VUY hex value eg #8080FF, -1 for default
flags: readable, writable
Integer. Range: -1 - 2147483647 Default: -1

The description says to set the colour as a #HEX value, but the actual type below states Integer.

The latter appears to be correct, as it fails when using the value format the description suggests:

WARNING: erroneous pipeline: could not set property "annotation-text-colour" in element "rpicamsrc0" to "#FFFFFF"
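
As a workaround, the property does take the VUY value when it is written as a plain integer, which is what the type information asks for. A minimal sketch, using the #8080FF example from the description:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
src = Gst.ElementFactory.make('rpicamsrc', 'src')
# The property is an integer, so pass the VUY value numerically:
# 0x8080FF (the "#8080FF" of the description) == 8421631.
src.set_property('annotation-text-colour', 0x8080FF)
print(src.get_property('annotation-text-colour'))  # 8421631

On a gst-launch-1.0 command line the equivalent would be annotation-text-colour=8421631.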

need some help with gstreamer and python

I have two Raspberry Pis. On the first one I executed both of the commands below to stream video and audio.

For video:

raspivid -t 999999 -w 1080 -h 720 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=serverIp port=5000

For audio:

gst-launch-1.0 -v alsasrc device=plughw:Set ! mulawenc ! rtppcmupay ! udpsink host=clientIp port=5001



Then I wrote a Python program to capture the video and audio. My program worked fine on my laptop, which runs Ubuntu, but when I run it on my second Raspberry Pi I get the following warnings and a black screen shows up:


** (DO3.py:2872): WARNING **: Can't load fallback CSS resource: Failed to import: The resource at '/org/gnome/adwaita/gtk-fallback.css' does not exist

** (DO3.py:2872): WARNING **: Can't load fallback CSS resource: Failed to import: The resource at '/org/gnome/adwaita/gtk-fallback.css' does not exist
prepare-window-handle

(DO3.py:2872): GStreamer-WARNING **: gstpad.c:4506:store_sticky_event:gdepay:src Sticky event misordering, got 'segment' before 'caps'


And here is my Python program:

#!/usr/bin/python3

from os import path

import gi
gi.require_version('Gst', '1.0')
from gi.repository import GObject, Gst, Gtk

from gi.repository import GdkX11, GstVideo

GObject.threads_init()
Gst.init(None)

class Player(object):
    def __init__(self):

        self.window = Gtk.Window()
        self.window.connect('destroy', self.quit)
        self.window.set_default_size(800, 450)

        self.drawingarea = Gtk.DrawingArea()
        self.window.add(self.drawingarea)

        # Create GStreamer pipelines (one for video, one for audio)
        self.pipeline = Gst.Pipeline()
        self.pipeline_A = Gst.Pipeline()

        # Create bus to get events from GStreamer pipeline
        self.bus = self.pipeline.get_bus()
        self.bus.add_signal_watch()
        self.bus.connect('message::eos', self.on_eos)
        self.bus.connect('message::error', self.on_error)

        # This is needed to make the video output appear in our DrawingArea:
        self.bus.enable_sync_message_emission()
        self.bus.connect('sync-message::element', self.on_sync_message)

        # ----- VIDEO pipeline -----

        self.tcpsrc = Gst.ElementFactory.make('tcpclientsrc', 'tcpsrc')
        self.tcpsrc.set_property("host", '192.168.1.9')
        self.tcpsrc.set_property("port", 5000)

        self.gdepay = Gst.ElementFactory.make('gdpdepay', 'gdepay')

        self.rdepay = Gst.ElementFactory.make('rtph264depay', 'rdepay')

        self.avdec = Gst.ElementFactory.make('avdec_h264', 'avdec')

        self.vidconvert = Gst.ElementFactory.make('videoconvert', 'vidconvert')

        self.asink = Gst.ElementFactory.make('autovideosink', 'asink')
        self.asink.set_property('sync', False)
        #self.asink.set_property('emit-signals', True)
        #self.set_property('drop', True)

        self.pipeline.add(self.tcpsrc)
        self.pipeline.add(self.gdepay)
        self.pipeline.add(self.rdepay)
        self.pipeline.add(self.avdec)
        self.pipeline.add(self.vidconvert)
        self.pipeline.add(self.asink)

        self.tcpsrc.link(self.gdepay)
        self.gdepay.link(self.rdepay)
        self.rdepay.link(self.avdec)
        self.avdec.link(self.vidconvert)
        self.vidconvert.link(self.asink)

        # ----- end of VIDEO pipeline -----

        # ----- AUDIO pipeline -----

        self.udpsrc = Gst.ElementFactory.make('udpsrc', 'udpsrc')
        self.udpsrc.set_property("port", 5001)
        audioCaps = Gst.Caps.from_string("application/x-rtp")
        self.udpsrc.set_property("caps", audioCaps)

        self.queue = Gst.ElementFactory.make('queue', 'queue')

        self.rtppcmudepay = Gst.ElementFactory.make('rtppcmudepay', 'rtppcmudepay')

        self.mulawdec = Gst.ElementFactory.make('mulawdec', 'mulawdec')

        self.audioconvert = Gst.ElementFactory.make('audioconvert', 'audioconvert')

        self.autoaudiosink = Gst.ElementFactory.make('autoaudiosink', 'autoaudiosink')
        self.autoaudiosink.set_property('sync', False)

        self.pipeline_A.add(self.udpsrc)
        self.pipeline_A.add(self.queue)
        self.pipeline_A.add(self.rtppcmudepay)
        self.pipeline_A.add(self.mulawdec)
        self.pipeline_A.add(self.audioconvert)
        self.pipeline_A.add(self.autoaudiosink)

        self.udpsrc.link(self.queue)
        self.queue.link(self.rtppcmudepay)
        self.rtppcmudepay.link(self.mulawdec)
        self.mulawdec.link(self.audioconvert)
        self.audioconvert.link(self.autoaudiosink)

        # ----- end of AUDIO pipeline -----

    def run(self):
        self.window.show_all()
        # You need to get the XID after window.show_all().  You shouldn't get it
        # in the on_sync_message() handler because threading issues will cause
        # segfaults there.
        self.xid = self.drawingarea.get_property('window').get_xid()
        self.pipeline.set_state(Gst.State.PLAYING)
        self.pipeline_A.set_state(Gst.State.PLAYING)
        Gtk.main()

    def quit(self, window):
        self.pipeline.set_state(Gst.State.NULL)
        Gtk.main_quit()

    def on_sync_message(self, bus, msg):
        if msg.get_structure().get_name() == 'prepare-window-handle':
            print('prepare-window-handle')
            msg.src.set_window_handle(self.xid)

    def on_eos(self, bus, msg):
        print('on_eos(): seeking to start of video')
        self.pipeline.seek_simple(
            Gst.Format.TIME,
            Gst.SeekFlags.FLUSH | Gst.SeekFlags.KEY_UNIT,
            0
        )

    def on_error(self, bus, msg):
        print('on_error():', msg.parse_error())


p = Player()
p.run()

Please help me.

debian rules creates empty package

Are there any extra instructions for generating a Debian package? It seems it creates an empty package:

$ debuild -uc -us                                            
 dpkg-buildpackage -rfakeroot -D -us -uc
dpkg-buildpackage: source package gstreamer1.0-rpicamsrc
dpkg-buildpackage: source version 1.4.4.2+nmu1
dpkg-buildpackage: source changed by Arnaud Loonstra <[email protected]>
 dpkg-source --before-build gst-rpicamsrc
dpkg-buildpackage: host architecture armhf
 fakeroot debian/rules clean
dh clean  --with autotools-dev
   dh_testdir
   dh_auto_clean
   dh_autotools-dev_restoreconfig
   dh_clean
 dpkg-source -b gst-rpicamsrc
dpkg-source: warning: no source format specified in debian/source/format, see dpkg-source(1)
dpkg-source: warning: source directory 'gst-rpicamsrc' is not <sourcepackage>-<upstreamversion> 
'gstreamer1.0-rpicamsrc-1.4.4.2+nmu1'
dpkg-source: info: using source format `1.0'
dpkg-source: info: building gstreamer1.0-rpicamsrc in gstreamer1.0-rpicamsrc_1.4.4.2+nmu1.tar.gz
dpkg-source: info: building gstreamer1.0-rpicamsrc in gstreamer1.0-rpicamsrc_1.4.4.2+nmu1.dsc
 debian/rules build  
dh build  --with autotools-dev
   dh_testdir
   dh_autotools-dev_updateconfig
   dh_auto_configure 
   dh_auto_build
   dh_auto_test
 fakeroot debian/rules binary
dh binary  --with autotools-dev
   dh_testroot
   dh_prep
   dh_installdirs
   dh_auto_install   
   dh_install
   dh_installdocs
   dh_installchangelogs
   dh_installexamples
   dh_installman
   dh_installcatalogs
   dh_installcron
   dh_installdebconf 
   dh_installemacsen 
   dh_installifupdown
   dh_installinfo
   dh_installinit
   dh_installmenu
   dh_installmime
   dh_installmodules 
   dh_installlogcheck
   dh_installlogrotate
   dh_installpam
   dh_installppp
   dh_installudev
   dh_installwm
   dh_installxfonts  
   dh_installgsettings
   dh_bugfiles
   dh_ucf
   dh_lintian
   dh_gconf
   dh_icons
   dh_perl
   dh_usrlocal
   dh_link
   dh_compress
   dh_fixperms
   dh_strip
   dh_makeshlibs
   dh_shlibdeps
   dh_installdeb
   dh_gencontrol
dpkg-gencontrol: warning: Depends field of package gstreamer1.0-rpicamsrc: unknown substitution variable ${shlibs:Depends}
   dh_md5sums
   dh_builddeb
dpkg-deb: building package `gstreamer1.0-rpicamsrc' in `../gstreamer1.0-rpicamsrc_1.4.4.2+nmu1_armhf.deb'.
 dpkg-genchanges  >../gstreamer1.0-rpicamsrc_1.4.4.2+nmu1_armhf.changes
dpkg-genchanges: including full source code in upload
 dpkg-source --after-build gst-rpicamsrc
dpkg-buildpackage: full upload; Debian-native package (full source is included)
Now running lintian...
E: gstreamer1.0-rpicamsrc changes: bad-distribution-in-changes-file zix25
W: gstreamer1.0-rpicamsrc source: diff-contains-git-control-dir .git
E: gstreamer1.0-rpicamsrc source: missing-build-dependency dpkg-dev (>= 1.16.1~)
W: gstreamer1.0-rpicamsrc source: newer-standards-version 3.9.5 (current is 3.9.4)
E: gstreamer1.0-rpicamsrc: helper-templates-in-copyright
W: gstreamer1.0-rpicamsrc: copyright-has-url-from-dh_make-boilerplate
E: gstreamer1.0-rpicamsrc: copyright-contains-dh_make-todo-boilerplate
E: gstreamer1.0-rpicamsrc: extended-description-is-empty
W: gstreamer1.0-rpicamsrc: empty-binary-package

could not find component 'vc.ril.camera'

I just compiled rpicamsrc on RPi B+ running Arch and I get the following error when I try to use the rpicamsrc element in a gstreamer pipeline:

mmal: mmal_component_create_core: could not find component 'vc.ril.camera'

Pipelines using raspivid are working perfectly with my setup.

[Python] toggling preview while running?

Hi. I made a basic program that plays with some properties, but I noticed that I can't change rotation or preview once the pipeline starts playing. Is this a bug? Intended?

compile error

Hi,

After running ./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/ I get the following error:

CC libgstrpicamsrc_la-gstrpicam-enum-types.lo
CC libgstrpicamsrc_la-gstrpicamsrc.lo
CC libgstrpicamsrc_la-RaspiCapture.lo
CC libgstrpicamsrc_la-RaspiCamControl.lo
CC libgstrpicamsrc_la-RaspiPreview.lo
CCLD libgstrpicamsrc.la
/bin/sed: can't read /usr/local/lib/libgstbase-1.0.la: No such file or directory

Is it possible to remove the log?

Hi,

Is it possible to remove this log output?

In src_start()
encoder buffer size is 65536
Encoder component done
In decide_allocation
Width 1024, Height 768
............................................

I am using gst-launch-1.0 -q and the log persists.

Cheers.

Python Bindings

Hi

Is there a way to get this to work with the python 3 binding in gi.repository? I've tried just calling the pipeline normally in the code (and it works fine at the command line) but I get the following error:

Traceback (most recent call last):
  File "/home/pi/gstreamerserver.py", line 8, in
    self.player = gst.parse_launch("rpicamsrc bitrate=100000 ! tcpserversink host=192.168.0.13 port=24069")
  File "/usr/lib/python3/dist-packages/gi/types.py", line 43, in function
    return info.invoke(*args, **kwargs)
gi._glib.GError: no element "rpicamsrc"

So I'm assuming it's not being detected by the gst bindings, do you have any suggestions on how to fix this?
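
That error usually means the registry seen by the Python process does not contain the plugin. One way to check from the same interpreter is to initialise GStreamer and look the factory up directly; a minimal sketch, where the GST_PLUGIN_PATH value is only an example of where a locally built plugin might live:

import os
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

# If the plugin was installed to a non-default location, point GStreamer at it
# before Gst.init() (example path only).
os.environ.setdefault('GST_PLUGIN_PATH', '/usr/local/lib/gstreamer-1.0')

Gst.init(None)
factory = Gst.ElementFactory.find('rpicamsrc')
if factory is None:
    print('rpicamsrc is not in the registry this process is using')
else:
    print('rpicamsrc found:', factory.get_metadata('description'))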

Cannot compile on Raspberry PI 3

I got this error when running sudo make install.

pi@raspberrypi:~/p/gst-rpicamsrc-master $ sudo make install
Making install in src
make[1]: Entering directory '/home/pi/p/gst-rpicamsrc-master/src'
make install-am
make[2]: Entering directory '/home/pi/p/gst-rpicamsrc-master/src'
make[3]: Entering directory '/home/pi/p/gst-rpicamsrc-master/src'
make[3]: Nothing to be done for 'install-exec-am'.
/bin/mkdir -p '/usr/lib/arm-linux-gnueabihf/gstreamer-1.0'
/bin/bash ../libtool --mode=install /usr/bin/install -c libgstrpicamsrc.la '/usr/lib/arm-linux-gnueabihf/gstreamer-1.0'
libtool: install: /usr/bin/install -c .libs/libgstrpicamsrc.so /usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstrpicamsrc.so
libtool: install: /usr/bin/install -c .libs/libgstrpicamsrc.lai /usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstrpicamsrc.la

libtool: finish: PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/sbin" ldconfig -n /usr/lib/arm-linux-gnueabihf/gstreamer-1.0

Libraries have been installed in:
/usr/lib/arm-linux-gnueabihf/gstreamer-1.0

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the `-LLIBDIR'
flag during linking and do at least one of the following:

  • add LIBDIR to the `LD_LIBRARY_PATH' environment variable
    during execution
  • add LIBDIR to the `LD_RUN_PATH' environment variable
    during linking
  • use the `-Wl,-rpath -Wl,LIBDIR' linker flag
  • have your system administrator add LIBDIR to `/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.

make[3]: Leaving directory '/home/pi/p/gst-rpicamsrc-master/src'
make[2]: Leaving directory '/home/pi/p/gst-rpicamsrc-master/src'
make[1]: Leaving directory '/home/pi/p/gst-rpicamsrc-master/src'
Making install in examples
make[1]: Entering directory '/home/pi/p/gst-rpicamsrc-master/examples'
make[2]: Entering directory '/home/pi/p/gst-rpicamsrc-master/examples'
make[2]: Nothing to be done for 'install-exec-am'.
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/pi/p/gst-rpicamsrc-master/examples'
make[1]: Leaving directory '/home/pi/p/gst-rpicamsrc-master/examples'
make[1]: Entering directory '/home/pi/p/gst-rpicamsrc-master'
make[2]: Entering directory '/home/pi/p/gst-rpicamsrc-master'
make[2]: Nothing to be done for 'install-exec-am'.
make[2]: Nothing to be done for 'install-data-am'.
make[2]: Leaving directory '/home/pi/p/gst-rpicamsrc-master'
make[1]: Leaving directory '/home/pi/p/gst-rpicamsrc-master'

Dependencies

You need to put in your README that you require "autoconf" and "libtool". Just good practice to include this in the README.

Cheers,
Mike Seese

Autogen failing

I've compiled all gstreamer libraries from source without a hitch. I've tried to install this plugin and I can't get past running autogen. Here is my output:
./autogen.sh
autoreconf: Entering directory `.'
autoreconf: configure.ac: not using Gettext
autoreconf: running: aclocal --force
autoreconf: configure.ac: tracing
autoreconf: configure.ac: not using Libtool
autoreconf: running: /usr/bin/autoconf --force
autoreconf: running: /usr/bin/autoheader --force
autoreconf: running: automake --add-missing --copy --force-missing
src/Makefile.am:1: error: Libtool library used but 'LIBTOOL' is undefined
src/Makefile.am:1: The usual way to define 'LIBTOOL' is to add 'LT_INIT'
src/Makefile.am:1: to 'configure.ac' and run 'aclocal' and 'autoconf' again.
src/Makefile.am:1: If 'LT_INIT' is in 'configure.ac', make sure
src/Makefile.am:1: its definition is in aclocal's search path.
autoreconf: automake failed with exit status: 1
autogen.sh failed

Any help would be awesome!

edit: I wanted to add in that libtool has been installed from repo, and from source. Both times this has popped up.

QoS issues with live preview pipeline

With this pipeline, the sink starts dropping buffers quite early on:

gst-launch-1.0 rpicamsrc preview=0 fullscreen=0 ! h264parse ! queue ! omxh264dec ! queue ! glimagesink

Setting sync=0 on the sink works around the problem, but ideally I think we'd prefer it to stay synchronized to the clock.

Streaming over the network with TCP without gdppay

Hey,

this is not really an issue with rpicamsrc, rather a question.

rpicamsrc is by far the most CPU efficient way to get video out of the Pi that I've found, and I'd like to make it available over the network.

I can do it via UDP as described here:

pi: gst-launch-1.0 rpicamsrc bitrate=1000000 ! 'video/x-h264, width=1280, height=720, framerate=30/1' ! rtph264pay config-interval=1 ! udpsink host=mycomputer.local port=5000

computer: gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" ! rtph264depay ! decodebin ! autovideosink

Works all fine and with super low latency.

However, I would like to do this via TCP, mainly because I want the Pi to be a server that clients can connect to (the Pi does not know ahead of time the IPs of the clients it shall send the video to).

Could you make an example how to do this?

I've searched on this matter for around 10 hours so far, but the only way I've found to make it work was to use gdppay, which is a non-standard GStreamer protocol that other applications are unlikely to speak.

Thanks, and thanks for rpicamsrc!
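
Not speaking for the author, but one standard-protocol option is to wrap the H.264 stream in MPEG-TS and serve it with tcpserversink; any client that understands a TS byte stream can then connect without gdppay. A minimal sketch of both sides, with host names, port and caps as placeholder values:

import sys
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# Pi side: H.264 from rpicamsrc, muxed into MPEG-TS, served over plain TCP.
PI_SIDE = ('rpicamsrc bitrate=1000000 ! '
           'video/x-h264,width=1280,height=720,framerate=30/1 ! '
           'h264parse ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5000')

# Client side (run on the receiving machine instead): demux, decode and display.
CLIENT_SIDE = ('tcpclientsrc host=raspberrypi.local port=5000 ! tsdemux ! '
               'h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false')

side = sys.argv[1] if len(sys.argv) > 1 else 'pi'
pipeline = Gst.parse_launch(PI_SIDE if side == 'pi' else CLIENT_SIDE)
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()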

Variable bitrate support

raspivid supports a variable bitrate by setting the bitrate to 0 and passing a quantisation parameter, e.g. -qp 30:

https://github.com/raspberrypi/userland/blob/master/host_applications/linux/apps/raspicam/RaspiVid.c#L149

This allows for high-quality images on moving scenes and, at the same time, super low bandwidth use on still scenes.

For example, I get ~5KB/s pointing the camera into my room with VBR, and ~200KB/s when moving in front of it, using raspivid -t 0 -w 1280 -h 720 -fps 30 -g 300 -b 0 -qp 30 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=llw port=5000.

The related code where the quantisationParameter is passed to MMAL in raspivid seems to be here: https://github.com/raspberrypi/userland/blob/master/host_applications/linux/apps/raspicam/RaspiVid.c#L1514

Could we have variable bitrate support in rpicamsrc as well?

Latency

What kind of latency could one expect from this module, from camera to display? <100 ms? I am really looking for something around 720p with <10 ms latency, but I think this might be pi in the sky.

h264 constrained baseline profile

Hello,

Currently rpicamsrc supports three kinds of profiles: {baseline, main, high}.
However, the MMAL interface is also capable of generating a constrained-baseline profile. Could it be useful to have this option included in rpicamsrc?

Also, is there any way we can control the profile-level-id parameter of rpicamsrc, especially the level-idc?

If you give me some pointers, I can also try helping.
