
License: GNU General Public License v3.0


apstreamline's Introduction

APStreamline

Live Video Streaming Made Easy!

Introduction

Live-streaming video from aerial robots and other unmanned vehicles is useful for a number of applications. Most video streaming solutions use RTP to stream video over UDP. UDP is more efficient than TCP because it forgoes the overhead of TCP's reliable delivery and congestion-control mechanisms.

However, this introduces new problems when streaming video from robots. In most cases, we use the Companion Computer (CC) in Wi-Fi hotspot mode for streaming the video. Due to the limited range of 2.4GHz Wi-Fi, the streaming quality gets perceptibly worse when the robot moves further away from the receiving computer.

The APStreamline project aims to fix this problem by dynamically adjusting the video quality. Over UDP we can obtain estimates of QoS using RTCP packets received from the receiver. These RTCP packets provide helpful QoS information (such as RTT and packet loss) which can be used for automatically changing the bitrate and resolution of the video delivered from the sender.
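The adaptation logic can be pictured as a mapping from RTCP loss statistics to an encoder bitrate. The following is an illustrative sketch with made-up thresholds and bitrates, not APStreamline's actual constants:

```shell
# Hypothetical sketch: map an RTCP-reported packet-loss percentage
# to an encoder bitrate in kbps. Thresholds are illustrative only.
choose_bitrate() {
    loss=$1  # packet loss in percent, integer
    if [ "$loss" -gt 5 ]; then
        echo 500    # heavy loss: drop to a low bitrate
    elif [ "$loss" -gt 1 ]; then
        echo 1500   # moderate loss: mid bitrate
    else
        echo 4000   # clean link: full quality
    fi
}

choose_bitrate 0   # prints 4000
choose_bitrate 8   # prints 500
```

In the real server the bitrate is applied to the GStreamer encoder element on the fly; this sketch only shows the shape of the decision.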

The code makes use of GStreamer libraries for creating the streaming pipelines.

Recent Changes

APStreamline was first released in a beta image of APSync in September 2018. The latest release supports the following features and needs to be built from source:

v2 Features (Released in October 2020)

  • Made it much easier to add new cameras and GStreamer pipelines
  • Support for the ZED and e-Con AR0591 cameras

v1 Features (Released in September 2018)

  • Support for using the hardware encoder for Jetson TX1/TX2 CSI cameras
  • Automatic quality selection based on bandwidth and packet loss estimates
  • Selection of network interfaces to stream the video
  • Manual control over resolution and framerates
  • Multiple camera support using RTSP
  • Hardware-accelerated encoding for the Raspberry Pi camera on the Raspberry Pi
  • Camera settings configurable through the APWeb GUI

Running the Code

Hardware

All the following instructions are for installing APStreamline and APWeb on the CC. A Raspberry Pi 2/3/3B+ with the latest version of Raspbian or APSync is a good choice. Intel NUCs are good choices as well. As of April 2019, APStreamline also supports the Nvidia Jetson TX1/TX2 boards and their CSI cameras with the tegra-video drivers.

Do note that the Raspberry Pi 3 and 3B+ have very low-power Wi-Fi antennae which aren't great for video streaming. Using a portable Wi-Fi router like the TP-Link MR3020 can dramatically improve range. Wi-Fi USB dongles working in hotspot mode can help as well.

Installing APStreamline

The installation procedure is essentially the same across all Debian-based Linux platforms, including the Raspberry Pi, NVIDIA Jetson boards, and any x86 machine.

Install the gstreamer dependencies:

sudo apt-get install libgstreamer-plugins-base1.0* libgstreamer1.0-dev libgstrtspserver-1.0-dev gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly python3-pip python-pip gstreamer1.0-libav

Install other dependencies for building the code:

sudo apt install ninja-build cmake meson libconfig++-dev

Navigate to the cloned folder and run:

meson build
cd build
ninja # NOTE: on the Raspberry Pi, run ninja -j2 so the build completes without taking up all the available memory

On the Raspberry Pi, use sudo modprobe bcm2835-v4l2 to load the V4L2 driver for the Raspberry Pi camera. Add bcm2835-v4l2 to /etc/modules for automatically loading this module on boot.
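Concretely, assuming a stock Raspbian setup, loading the driver now and persisting it across reboots looks like this:

```shell
# Load the V4L2 driver for the Raspberry Pi camera immediately:
sudo modprobe bcm2835-v4l2
# Persist it across reboots by appending the module name to /etc/modules:
echo bcm2835-v4l2 | sudo tee -a /etc/modules
```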

Installing APWeb

The APWeb server project enables setting several flight controller parameters on the fly through the use of a Companion Computer (e.g. the Raspberry Pi). We use this APWeb server for configuring the video streams as well.

Clone the forked branch with APStreamline support here:

git clone -b video_streaming https://github.com/shortstheory/APWeb.git
cd APWeb

Install libtalloc-dev and get the MAVLink submodule:

sudo apt-get install libtalloc-dev
git submodule update --init --recursive

Build APWeb:

make
sudo ./web_server -p 80

Usage

Video livestreams are launched over RTSP. RTSP is recommended for streaming video as it supports multiple cameras, configuring the resolution on the fly, and recording the livestreamed video to a file.
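As a sketch of the recording use case, an H.264 RTSP stream can be captured to a file on the client side with gst-launch. The mount point below is a placeholder value; `-e` makes GStreamer finalize the MP4 cleanly on Ctrl-C:

```shell
# Sketch: record an H.264 RTSP stream to an MP4 on the client.
# rtsp://192.168.0.17:8554/cam0 is an example mount point.
gst-launch-1.0 -e rtspsrc location=rtsp://192.168.0.17:8554/cam0 ! \
    rtph264depay ! h264parse ! mp4mux ! filesink location=recording.mp4
```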

Start the Server

Standalone

Launch the RTSP stream server by going to the parent directory of the build directory. Camera configurations are loaded from the config/ directory of the project folder.

./build/stream_server <interface>

The list of available network interfaces can be found by running ifconfig.
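Note that ifconfig ships in the net-tools package, which newer distros may not install by default; either of the following also lists the interface names you can pass to stream_server:

```shell
# List network interface names without net-tools:
ls /sys/class/net
# Or parse them from iproute2 output:
ip -o link show | awk -F': ' '{print $2}'
```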

APWeb

Start the APWeb server. This will serve the configuration page for the RTSP stream server. Connect to the web server in your favourite web browser by going to the IP address of the Companion Computer.

On navigating to the new video/ page, you will be presented with a page to start the RTSP Server:

[Screenshot: APWeb page for starting the RTSP server]

Note: Starting the server from the webpage is currently broken in APStreamline v2. As a workaround, start APStreamline from the command line as described above.

Configure The Video Stream

On selecting the desired interface and starting the RTSP Server, the APWeb server will spawn the Stream Server process. The stream server will search for all the V4L2 cameras available in /dev/. It will query the capabilities of all these cameras and select hardware encoding or software encoding accordingly. The list of available cameras can be refreshed by simply stopping and starting the server.
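To check by hand what the stream server will discover, the same devices and capabilities can be inspected with v4l2-ctl (assuming the v4l-utils package is installed):

```shell
# Enumerate the V4L2 devices the stream server would discover:
v4l2-ctl --list-devices
# Query supported pixel formats and resolutions for one camera:
v4l2-ctl -d /dev/video0 --list-formats-ext
```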

From here, the APWeb page will display the list of available RTSP streams and their mount points:

[Screenshot: APWeb list of available RTSP streams and mount points]

The video quality can either be set automatically based on the available network bandwidth or set manually for more fine-grained control.

View The Video Stream

The RTSP streams can be viewed using any RTSP player. VLC is a good choice.

For example, this can be done in VLC by going to "Media > Open Network Stream" and pasting in the RTSP Mount Point for the camera displayed in the APWeb configuration page. However, VLC introduces about two seconds of latency for jitter reduction, making it unsuitable for latency-critical applications. To circumvent this, RTSP streams can also be viewed at lower latency by using the gst-launch command:

gst-launch-1.0 playbin uri=<RTSP-MOUNT-POINT> latency=100

An RTSP Mount Point looks like this: rtsp://192.168.0.17:8554/cam0. Refer to the APWeb page to see the mount points given for your camera.

To Do

APStreamline could use your help! Some of the tasks which I want to complete are:

  • Add support for the tegra-video driver back into the project. Currently this is only supported in APStreamline v1.0, available from the Releases section of the repository
  • Update documentation and add more detailed steps for adding a new camera
  • Update the APWeb interface to list the actual available resolutions of the camera. Currently it just shows 320x240, 640x480, 1280x720 although the actual camera resolutions may be different
  • Switch the APWeb and APStreamline IPC mechanism to using ZeroMQ or rpcgen
  • Improve the installation flow. Currently the user needs to run APStreamline from the same directory as its config files for them to be loaded properly. Maybe the configuration files should be moved to ~/.config/?
  • Improve the JavaScript in the APWeb interface
  • Document the code better!

Bugs? Questions, Comments, Concerns?

Feel free to make a GitHub issue in the repository to get in touch if you would like to chat about the project or file a bug report!

apstreamline's People

Contributors

shortstheory


apstreamline's Issues

video 180° reversed with 2.0 on raspberry pi CSI cam

When using v2, the video is rotated 180° (upside down).
When shutting down v2 and starting v1, it comes back the right way up.
When shutting down v1 and restarting v2, it stays the right way up too, until the next reboot.
It looks like v1 is doing something to the V4L2 driver for the RPi cam, but I don't know what.

Broken on Ras Pi

Attempting to run on a Ras Pi 3+, I get this error when trying to connect from VLC:

pi@raspberrypi:~/APStreamline/build $ ./stream_server eth0
Interface : eth0                                                                 
Address : 192.168.0.108                                                          
Found V4L2 camera device /dev/video0                                            
/dev/video0                                                                     
Name - mmal service 16.1 Driver - bm2835 mmal                                   
Recording not started
(stream_server:4734): GStreamer-CRITICAL **: gst_element_set_state: assertion 'GST_IS_ELEMENT (element)' failed
(stream_server:4734): GStreamer-CRITICAL **: gst_object_unref: assertion 'object != NULL' failed
(stream_server:4734): GStreamer-CRITICAL **: gst_element_set_state: assertion 'GST_IS_ELEMENT (element)' failed
(stream_server:4734): GStreamer-CRITICAL **: gst_object_unref: assertion 'object != NULL' failed
Stream disconnected!

I've isolated the issue down to commit 26caab4.

EDIT:
Fixed by putting in a missing break; after this line: https://github.com/shortstheory/adaptive-streaming/blob/ea44fb97d410a3eed88a4c7800b904bcca0f8f1b/src/RTSPStreamServer/RTSPAdaptiveStreaming.cpp#L80

Did you want me to put in a pull request for this?

Cannot compile source

I am using a Raspberry Pi Zero 2 W with Bullseye and get the following error during compilation:

pi@raspCam:~/APStreamline $ ls
build config LICENSE meson.build README.md src
pi@raspCam:~/APStreamline $ cd build
pi@raspCam:~/APStreamline/build $ ninja -j2
[9/14] Compiling C++ object stream_server.p/src_Camera_Camera.cpp.o
FAILED: stream_server.p/src_Camera_Camera.cpp.o
c++ -Istream_server.p -I. -I.. -I/usr/include/gstreamer-1.0 -I/usr/include/arm-linux-gnueabihf -I/usr/include/glib-2.0 -I/usr/lib/arm-linux-gnueabihf/glib-2.0/include -I/usr/include/libmount -I/usr/include/blkid -fdiagnostics-color=always -pipe -D_FILE_OFFSET_BITS=64 -Wall -Winvalid-pch -Wnon-virtual-dtor -std=c++14 -g -pthread -DGLIB_VERSION_MIN_REQUIRED=GLIB_VERSION_2_44 -DGLIB_VERSION_MAX_ALLOWED=GLIB_VERSION_2_60 -Wall -Werror -MD -MQ stream_server.p/src_Camera_Camera.cpp.o -MF stream_server.p/src_Camera_Camera.cpp.o.d -o stream_server.p/src_Camera_Camera.cpp.o -c ../src/Camera/Camera.cpp
{standard input}: Assembler messages:
{standard input}:144519: Warning: end of file not at end of a line; newline inserted
{standard input}:145103: Error: unknown pseudo-op: `.u'
c++: fatal error: Getötet (Killed): signal terminated program cc1plus
compilation terminated.
[10/14] Compiling C++ object stream_server.p/src_Camera_MJPGCamera.cpp.o
ninja: build stopped: subcommand failed.

Trying to build on Jetson TX2: errors during ninja build

../src/RTSP_Stream_Server/IPCMessageHandler.cpp: In member function ‘std::__cxx11::string IPCMessageHandler::serialise_device_props(std::pair<std::__cxx11::basic_string, v4l2_info>)’:
../src/RTSP_Stream_Server/IPCMessageHandler.cpp:62:50: error: format ‘%u’ expects argument of type ‘unsigned int’, but argument 9 has type ‘guint64 {aka long unsigned int}’ [-Werror=format=]
stream->file_recorder.get_recording());
How do I fix this?

error in device file reading

I've tried to follow the error, and it looks like it happens when creating the AdaptiveStream object.
I've set up a fresh Raspbian Buster install on an RPi 4.

$ meson build
The Meson build system
Version: 0.52.1
Source dir: /home/pi/APStreamline
Build dir: /home/pi/APStreamline/build
Build type: native build
Project name: adaptive-streaming
Project version: undefined
C compiler for the host machine: cc (gcc 8.3.0 "cc (Raspbian 8.3.0-6+rpi1) 8.3.0")
C linker for the host machine: GNU ld.bfd 2.31.1
C++ compiler for the host machine: c++ (gcc 8.3.0 "c++ (Raspbian 8.3.0-6+rpi1) 8.3.0")
C++ linker for the host machine: GNU ld.bfd 2.31.1
Host machine cpu family: arm
Host machine cpu: armv7l
Found pkg-config: /usr/bin/pkg-config (0.29)
Run-time dependency gstreamer-1.0 found: YES 1.14.4
Run-time dependency gstreamer-rtp-1.0 found: YES 1.14.4
Run-time dependency gstreamer-rtsp-1.0 found: YES 1.14.4
Run-time dependency gstreamer-rtsp-server-1.0 found: YES 1.14.4
Run-time dependency threads found: YES 
Run-time dependency libconfig++ found: YES 1.5
Build targets in project: 1
Found ninja-1.8.2 at /usr/bin/ninja
pi@raspberrypi:~/APStreamline $ cd build/
pi@raspberrypi:~/APStreamline/build $ ninja
[14/14] Linking target stream_server.
pi@raspberrypi:~/APStreamline/build $ ./stream_server wlan0
"/home/pi/APStreamline/build"
Interface : wlan0
Address : 192.168.0.20
***APStreamline***
Access the following video streams using VLC or gst-launch following the instructions here: https://github.com/shortstheory/adaptive-streaming#usage
==============================
Reading RPi config
I/O error while reading file.
terminate called after throwing an instance of 'std::out_of_range'
  what():  _Map_base::at
Aborted
pi@raspberrypi:~/APStreamline/build $ uname -a
Linux raspberrypi 5.4.72-v7l+ #1356 SMP Thu Oct 22 13:57:51 BST 2020 armv7l GNU/Linux

HDMI IN with tc358743

Hello,
we are trying to use the RPi with Debian Buster and the tc358743 (HDMI in on the CSI port).

The HDMI source (a Windows 10 computer) chimes when the stream starts because the RPi connects to it as an HDMI device, so it seems to start normally, but then we get a segmentation fault. Please take a look at the attachments.

rpi

w10_streaming_client

messages.txt

kind regards,
Martin

Ras Pi Quality Issue

Running on a Raspberry Pi, the automatic quality/resolution changes are not occurring. I attempted to force a resolution change via the APWeb interface instead, but got a system crash.

Using a previous commit (d6efd60), the quality changes (automatic and forced) work fine. So I'm thinking it must have broken somewhere between then and now.

Support ZED camera on TX2

I am unable to generate a feed from a Stereolabs ZED camera via RTSP.
When I start the RTSP server on the video tab in APWeb, the ZED is correctly displayed as cam1 in addition to the Jetson Dev board built in camera (cam0). However I cannot display the cam1 video stream on the client machine. RTSP streaming from the built in camera (cam0) works well.

By contrast, streaming from the ZED camera via UDP works perfectly well. I am using the following GStreamer settings to display the left camera view on a client PC:
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw, width=3840, height=1080 ! videocrop top=0 left=0 right=1920 bottom=0 ! tee name=t ! queue ! videoconvert ! omxh264enc ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.1.111 port=5600 t. ! queue ! videoconvert

I assume that the camera settings need to be tweaked somewhere in the APStreamline code - any ideas what I would need to adapt? Thanks!

Failing Ninja Install

With a manual build, I'm getting these errors on an APSync image. This is on a Raspberry Pi 4.

apsync@apsync:~/GitHub/adaptive-streaming/build$ ninja install
[0/1] Regenerating build files.
The Meson build system
Version: 0.54.2
Source dir: /home/apsync/GitHub/adaptive-streaming
Build dir: /home/apsync/GitHub/adaptive-streaming/build
Build type: native build
Project name: adaptive-streaming
Project version: undefined
C compiler for the host machine: cc (gcc 7.5.0 "cc (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 7.5.0")
C linker for the host machine: cc ld.bfd 2.30
C++ compiler for the host machine: c++ (gcc 7.5.0 "c++ (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 7.5.0")
C++ linker for the host machine: c++ ld.bfd 2.30
Host machine cpu family: aarch64
Host machine cpu: aarch64
Dependency gstreamer-1.0 found: YES 1.14.5 (cached)
Dependency gstreamer-rtp-1.0 found: YES 1.14.5 (cached)
Dependency gstreamer-rtsp-1.0 found: YES 1.14.5 (cached)
Dependency gstreamer-rtsp-server-1.0 found: YES 1.14.5 (cached)
Dependency threads found: YES unknown (cached)
Build targets in project: 1

Found ninja-1.8.2 at /usr/bin/ninja
[2/9] Compiling C++ object 'stream_server@exe/src_Common_GenericAdaptiveStreaming.cpp.o'
FAILED: stream_server@exe/src_Common_GenericAdaptiveStreaming.cpp.o
c++ -Istream_server@exe -I. -I.. -I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/aarch64-linux-gnu/glib-2.0/include -fdiagnostics-color=always -pipe -D_FILE_OFFSET_BITS=64 -Wall -Winvalid-pch -Wnon-virtual-dtor -std=c++14 -g -pthread -Werror -MD -MQ 'stream_server@exe/src_Common_GenericAdaptiveStreaming.cpp.o' -MF 'stream_server@exe/src_Common_GenericAdaptiveStreaming.cpp.o.d' -o 'stream_server@exe/src_Common_GenericAdaptiveStreaming.cpp.o' -c ../src/Common/GenericAdaptiveStreaming.cpp
../src/Common/GenericAdaptiveStreaming.cpp: In member function ‘void GenericAdaptiveStreaming::set_state_constants()’:
../src/Common/GenericAdaptiveStreaming.cpp:60:23: error: ‘MAX_STEADY_BITRATE’ was not declared in this scope
         MAX_BITRATE = MAX_STEADY_BITRATE;
                       ^~~~~~~~~~~~~~~~~~
../src/Common/GenericAdaptiveStreaming.cpp:60:23: note: suggested alternative: ‘MIN_STEADY_BITRATE’
         MAX_BITRATE = MAX_STEADY_BITRATE;
                       ^~~~~~~~~~~~~~~~~~
                       MIN_STEADY_BITRATE
[5/9] Compiling C++ object 'stream_server@exe/src_RTSP_Stream_Server_IPCMessageHandler.cpp.o'
FAILED: stream_server@exe/src_RTSP_Stream_Server_IPCMessageHandler.cpp.o
c++ -Istream_server@exe -I. -I.. -I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/aarch64-linux-gnu/glib-2.0/include -fdiagnostics-color=always -pipe -D_FILE_OFFSET_BITS=64 -Wall -Winvalid-pch -Wnon-virtual-dtor -std=c++14 -g -pthread -Werror -MD -MQ 'stream_server@exe/src_RTSP_Stream_Server_IPCMessageHandler.cpp.o' -MF 'stream_server@exe/src_RTSP_Stream_Server_IPCMessageHandler.cpp.o.d' -o 'stream_server@exe/src_RTSP_Stream_Server_IPCMessageHandler.cpp.o' -c ../src/RTSP_Stream_Server/IPCMessageHandler.cpp
../src/RTSP_Stream_Server/IPCMessageHandler.cpp: In member function ‘std::__cxx11::string IPCMessageHandler::serialise_device_props(std::pair<std::__cxx11::basic_string<char>, v4l2_info>)’:
../src/RTSP_Stream_Server/IPCMessageHandler.cpp:62:50: error: format ‘%u’ expects argument of type ‘unsigned int’, but argument 9 has type ‘guint64 {aka long unsigned int}’ [-Werror=format=]
../src/RTSP_Stream_Server/IPCMessageHandler.cpp:60:13:
             device_props.second.frame_property_bitmask,
             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../src/RTSP_Stream_Server/IPCMessageHandler.cpp:62:50:
             stream->file_recorder.get_recording());
                                                  ^
../src/RTSP_Stream_Server/IPCMessageHandler.cpp:62:50: error: too many arguments for format [-Werror=format-extra-args]
cc1plus: all warnings being treated as errors
[7/9] Compiling C++ object 'stream_server@exe/src_Common_FileRecorder.cpp.o'
ninja: build stopped: subcommand failed.

Raspberry Pi camera on HDMI output

I am using a Raspberry Pi with the Lite OS, without running X. It would be very nice if the camera output were also shown on the HDMI output of the Raspberry Pi during IP streaming. The tool raspivid seems to create a low-latency hardware connection between the CSI port and the HDMI output, and is also able to stream over IP at the same time. An example using VLC is:

raspivid -o - -t 0 -w 1920 -h 1080 -fps 25 | cvlc -vvv stream:///dev/stdin --sout '#rtp{access=udp,sdp=rtsp://:8554/stream}' :demux=h264

This of course has limitations. Would it be possible to use raspivid for feeding video data into APStreamline? What could be the proper way of doing this?

not working with usb camera

Before anything else I must say it's a great tool, and it works perfectly with the CSI2 RasPi cam, so a big thanks for this fantastic tool.
I'm trying to use it to also stream a second camera connected via USB. I'm on an RPi 4 with the standard Buster distribution.
lsusb says:

pi@raspberrypi:~ $ lsusb
Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 005: ID 0000:5001  
Bus 001 Device 002: ID 2109:3431 VIA Labs, Inc. Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

dmesg says:

[  287.042761] usb 1-1.1: new high-speed USB device number 5 using xhci_hcd
[  287.253426] usb 1-1.1: New USB device found, idVendor=0000, idProduct=5001, bcdDevice= 5.19
[  287.253444] usb 1-1.1: New USB device strings: Mfr=0, Product=2, SerialNumber=3
[  287.253460] usb 1-1.1: Product: USB Camera                      
[  287.253475] usb 1-1.1: SerialNumber: 20130208
[  287.392528] uvcvideo: Found UVC 1.00 device USB Camera                       (0000:5001)
[  287.424870] uvcvideo 1-1.1:1.0: Entity type for entity Extension 5 was not initialized!
[  287.424889] uvcvideo 1-1.1:1.0: Entity type for entity Extension 4 was not initialized!
[  287.424907] uvcvideo 1-1.1:1.0: Entity type for entity Processing 2 was not initialized!
[  287.424925] uvcvideo 1-1.1:1.0: Entity type for entity Camera 1 was not initialized!
[  287.425285] input: USB Camera                      as /devices/platform/scb/fd500000.pcie/pci0000:00/0000:00:00.0/0000:01:00.0/usb1/1-1/1-1.1/1-1.1:1.0/input/input2
[  287.459112] usb 1-1.1: Warning! Unlikely big volume range (=5760), cval->res is probably wrong.
[  287.459132] usb 1-1.1: [5] FU [Mic Capture Volume] ch = 1, val = 0/5760/1

APStreamline launch looks like :

pi@raspberrypi:~ $ ./APStreamline/bin/stream_server wlan0
Interface : wlan0
Address : 192.168.9.114
***APStreamline***
Access the following video streams using VLC or gst-launch following the instructions here: https://github.com/shortstheory/adaptive-streaming#usage
==============================
/dev/video0 (mmal service 16.1): rtsp://192.168.9.114:8554/cam7
/dev/video1 (USB Camera                     ): rtsp://192.168.9.114:8554/cam8
/dev/video10 (bcm2835-codec-decode): rtsp://192.168.9.114:8554/cam4
/dev/video11 (bcm2835-codec-encode): rtsp://192.168.9.114:8554/cam5
/dev/video12 (bcm2835-codec-isp): rtsp://192.168.9.114:8554/cam6
/dev/video13 (bcm2835-isp): rtsp://192.168.9.114:8554/cam0
/dev/video14 (bcm2835-isp): rtsp://192.168.9.114:8554/cam1
/dev/video15 (bcm2835-isp): rtsp://192.168.9.114:8554/cam2
/dev/video16 (bcm2835-isp): rtsp://192.168.9.114:8554/cam3
/dev/video2 (USB Camera                     ): rtsp://192.168.9.114:8554/cam9

If I try to access /dev/video0 through rtsp://192.168.9.114:8554/cam7, it works perfectly,
but if I try to access rtsp://192.168.9.114:8554/cam9, here's what I get:

pi@raspberrypi:~ $ ./APStreamline/bin/stream_server wlan0
Interface : wlan0
Address : 192.168.9.114
***APStreamline***
Access the following video streams using VLC or gst-launch following the instructions here: https://github.com/shortstheory/adaptive-streaming#usage
==============================
/dev/video0 (mmal service 16.1): rtsp://192.168.9.114:8554/cam7
/dev/video1 (USB Camera                     ): rtsp://192.168.9.114:8554/cam8
/dev/video10 (bcm2835-codec-decode): rtsp://192.168.9.114:8554/cam4
/dev/video11 (bcm2835-codec-encode): rtsp://192.168.9.114:8554/cam5
/dev/video12 (bcm2835-codec-isp): rtsp://192.168.9.114:8554/cam6
/dev/video13 (bcm2835-isp): rtsp://192.168.9.114:8554/cam0
/dev/video14 (bcm2835-isp): rtsp://192.168.9.114:8554/cam1
/dev/video15 (bcm2835-isp): rtsp://192.168.9.114:8554/cam2
/dev/video16 (bcm2835-isp): rtsp://192.168.9.114:8554/cam3
/dev/video2 (USB Camera                     ): rtsp://192.168.9.114:8554/cam9
Stream disconnected!

(stream_server:1025): GStreamer-CRITICAL **: 11:24:34.102: gst_element_set_state: assertion 'GST_IS_ELEMENT (element)' failed

(stream_server:1025): GStreamer-CRITICAL **: 11:24:34.103: gst_object_unref: assertion 'object != NULL' failed
Segmentation fault
pi@raspberrypi:~ $
