gscam2's People

Contributors

caelinsutch, clydemcqueen, mattsyoung, rfriedm-trimble


gscam2's Issues

gstreamer videorate doesn't work.

I am using a UVC camera streaming MJPEG. When using GStreamer directly, I can resize the image and change the framerate, but the same command in gscam doesn't work.

My config is:
gscam_config = v4l2src device=/dev/video0 do-timestamp=true ! queue ! videorate ! image/jpeg, width=1280, height=720, framerate=15/1

The pipeline runs, but I am getting 1920x1080 at 30 fps.
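
A hedged sketch of one thing to try, assuming the node is left at its default image_encoding=rgb8 (so it appends an RGB-converting sink of its own): put the image/jpeg caps filter directly after v4l2src so the device itself is asked for the 1280x720, 15 fps MJPEG mode, then decode before handing raw video to videorate and the node:

# device path and caps are copied from the config above; the jpegdec/videorate placement is the guess
gscam_config = v4l2src device=/dev/video0 do-timestamp=true ! image/jpeg,width=1280,height=720,framerate=15/1 ! jpegdec ! queue ! videorate ! video/x-raw,framerate=15/1 ! videoconvert

If the camera only exposes 1280x720 over MJPEG, the caps filter immediately after v4l2src is what forces the source to select that mode.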

Clean up & release

If there's interest I could release this for Dashing, Eloquent and Foxy.

Also:
-- remove dep on ptrmu/ros2_shared.git
-- pass ament_lint_common

camera.ini parameters

Hi!
I see the default camera.ini in ros2_ws/src/gscam2/cfg/my_camera.ini and want to plug in my own camera's values.
I got them from a Kalibr calibration, but I do not have the rectification and projection matrices.
I use a mono camera.
My camera matrix looks like:
camera matrix
351.774060 0.000000 290.713492
0.000000 468.945113 235.782768
0.000000 0.000000 1.000000

Maybe if I set them to zeros it will be OK?

rectification
0.0 0.0 0.0
0.0 0.0 0.0
0.0 0.0 0.0

projection
0.0 0.000000 0.0 0.000000
0.000000 0.0 0.0 0.000000
0.000000 0.000000 0.000000 0.000000
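
For reference, a hedged sketch of what a complete mono camera.ini could look like (the width/height, camera name, and distortion values below are placeholders; compare the section names against the my_camera.ini shipped in cfg/). The usual convention for a monocular camera is rectification = identity and projection = the camera matrix with a zero fourth column, rather than all zeros:

# placeholder calibration; only the camera matrix comes from the question above

[image]

width
640

height
480

[my_camera]

camera matrix
351.774060 0.000000 290.713492
0.000000 468.945113 235.782768
0.000000 0.000000 1.000000

distortion
0.0 0.0 0.0 0.0 0.0

rectification
1.000000 0.000000 0.000000
0.000000 1.000000 0.000000
0.000000 0.000000 1.000000

projection
351.774060 0.000000 290.713492 0.000000
0.000000 468.945113 235.782768 0.000000
0.000000 0.000000 1.000000 0.000000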

GSCAM2 with YUV output

I am using a USB camera with a YUV image format with GSCAM2. The gscam_main ROS 2 node converts the image format to rgb8 and publishes the GStreamer output to the /image_raw topic, which can then be visualized with the ROS 2 image_view package.

I was trying to remove the RGB8 conversion in the code by replacing all the instances of sensor_msgs::image_encodings::RGB8 in gscam_node.cpp with sensor_msgs::image_encodings::YUV422, as this is the only YUV format found in sensor_msgs::image_encodings.
After that, I changed the caps in gscam_node.cpp from

if (cxt_.image_encoding_ == sensor_msgs::image_encodings::RGB8) {
    caps = gst_caps_new_simple(
      "video/x-raw",
      "format", G_TYPE_STRING, "RGB",
      nullptr);

to

if (cxt_.image_encoding_ == sensor_msgs::image_encodings::YUV422) {
    caps = gst_caps_new_simple(
      "video/x-raw",
      "format", G_TYPE_STRING, "YUY2",
      nullptr);

I am providing the 'YUY2' format in the pipeline because this is the default output format found using gst-launch-1.0 for my pipeline:

gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-raw,width=640, height=480, framerate=30/1 ! videoconvert

But I am getting really weird outputs, as in the attached screenshot (image_yuv).

How can I keep the default camera format unchanged in the output? I want the node to publish in the YUV/YUY2 format itself, the same as the default camera input. In ROS 2 Humble, RViz can visualize YUV images directly without converting them to RGB, and I'd like the gscam2 output to be in YUV.

thanks
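
A guess at the cause, offered with low confidence: in sensor_msgs, the yuv422 encoding conventionally means UYVY byte order, while this camera delivers YUY2 (YUYV), so publishing YUY2 bytes under the yuv422 label will look scrambled in viewers. Two hedged options: request "UYVY" instead of "YUY2" in the modified caps, or keep the code asking for UYVY and let the pipeline reorder the bytes:

# hypothetical config: videoconvert only swaps the YUV byte order here, no RGB conversion
export GSCAM_CONFIG="v4l2src device=/dev/video2 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=UYVY"

Newer sensor_msgs releases also define a yuv422_yuy2 encoding that matches YUY2 directly, if it is available on your distro.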

rclcpp_components/register_node_macro.hpp: No such file or directory (Unable to build package)

When I follow the instructions and run 'colcon build' I receive this error:

/home/USER/ros2/gscam2_ws/src/gscam2/src/subscriber_node.cpp:37:10: fatal error: rclcpp_components/register_node_macro.hpp: No such file or directory
#include "rclcpp_components/register_node_macro.hpp"
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
make[2]: *** [CMakeFiles/subscriber_node.dir/src/subscriber_node.cpp.o] Error 1
make[1]: *** [CMakeFiles/subscriber_node.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
//opt/ros/eloquent/lib/libimage_transport.so: undefined reference to `message_filters::Connection::Connection(std::function<void ()> const&)'
collect2: error: ld returned 1 exit status
make[2]: *** [gscam_main] Error 1
make[1]: *** [CMakeFiles/gscam_main.dir/all] Error 2
make: *** [all] Error 2

Failed <<< gscam [0.63s, exited with code 2]
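
The missing header usually means the rclcpp_components development package isn't present in the environment used for the build; a minimal sketch, assuming a binary Eloquent install and the workspace path from the log:

cd /home/USER/ros2/gscam2_ws
source /opt/ros/eloquent/setup.bash
# pull in the dependencies declared in package.xml (rclcpp_components, image_transport, ...)
rosdep install --from-paths src --ignore-src -y
colcon build

The second error (the undefined message_filters::Connection reference coming from libimage_transport.so) looks like a mismatched or stale set of Eloquent binaries and may clear up after updating the ros-eloquent-* packages to matching versions.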

Cannot link outelement("typefind") -> sink

I'm using this package inside a docker container built using the provided Dockerfile.

I am new to GStreamer so please pardon my ignorance.

I'm trying to run the following pipeline and it works without any issue:
gst-launch-1.0 -v udpsrc port=5000 ! "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! decodebin ! videoconvert ! queue ! autovideosink sync=false

But when I run it using gscam2, I get the following output:

[INFO] [1696353856.273649779] [gscam_publisher]: use_intra_process_comms=0
[INFO] [1696353856.273781482] [gscam_publisher]: gst_plugin_path = 
[INFO] [1696353856.273788693] [gscam_publisher]: gscam_config = 
[INFO] [1696353856.273792469] [gscam_publisher]: sync_sink = true
[INFO] [1696353856.273795710] [gscam_publisher]: preroll = false
[INFO] [1696353856.273798899] [gscam_publisher]: use_gst_timestamps = false
[INFO] [1696353856.273802177] [gscam_publisher]: image_encoding = rgb8
[INFO] [1696353856.273805435] [gscam_publisher]: camera_info_url = 
[INFO] [1696353856.273808664] [gscam_publisher]: camera_name = 
[INFO] [1696353856.273811786] [gscam_publisher]: frame_id = camera_frame
[INFO] [1696353856.273816631] [gscam_publisher]: Using GSCAM_CONFIG env var: udpsrc port=5000 caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96 ! rtph264depay ! h264parse ! decodebin ! videoconvert !  queue ! autovideosink
[INFO] [1696353856.273830218] [gscam_publisher]: using default calibration URL
[INFO] [1696353856.273836806] [gscam_publisher]: camera calibration URL: file:///root/.ros/camera_info/camera.yaml
[ERROR] [1696353856.273866863] [camera_calibration_parsers]: Unable to open camera calibration file [/root/.ros/camera_info/camera.yaml]
[WARN] [1696353856.273872505] [gscam_publisher]: Camera calibration file /root/.ros/camera_info/camera.yaml not found
[INFO] [1696353856.273876720] [gscam_publisher]: Loaded camera calibration from 
[INFO] [1696353856.281511796] [gscam_publisher]: Gstreamer initialized
[INFO] [1696353856.281540444] [gscam_publisher]: Gstreamer version: GStreamer 1.20.3
[FATAL] [1696353856.286175862] [gscam_publisher]: Cannot link outelement("typefind") -> sink

I've tried running it using the parameter file as well as the environment variable; both lead to the same output.
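
For what it's worth, gscam2 builds the pipeline from the config string and then links its own appsink to the last element, so the config must end in an element with a free source pad; a description that already terminates in autovideosink leaves nothing to link, which is what the fatal "Cannot link outelement -> sink" message is reporting. A sketch of the same stream with the display sink dropped (shown in the env-var form, since that is what the log shows being used):

export GSCAM_CONFIG='udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" ! rtph264depay ! h264parse ! decodebin ! videoconvert'

If a local preview window is still wanted, a tee can feed autovideosink on one branch while the other branch ends in videoconvert for the node.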

h264 encoded stream through gscam

My current stream is 4k @ 60 fps and I intend to stream it over a network to a remote host. Consequently, I want to encode the stream to h264 before streaming it.

I was planning to do something like:
export gscam_config="camsrc name=cam ...... ! vaapih264enc"

However, the sink for gscam expects RGB format so the above command won't work. Neither will a videoconvert at the end because it is encoded data.

Is it possible to somehow allow the encoded data to masquerade as RGB data to be obtained from /image_raw topic and then I can decide how to decode it back to individual frames?

MP4 file stored in GSCAM can not be decoded normally

I ran into a problem when trying to reproduce the part of README.md that says "Here's an example that uses a GStreamer tee to split the stream, with one stream producing ROS images and the second stream writing to MP4 files". The following is what actually happens in my tests.

If I record MP4 files and push the stream with gst-launch-1.0, like this:

gst-launch-1.0 fpgasrc channel=0 do-timestamp="TRUE" ! 'video/x-raw(memory:NVMM), format=YUY2, width=(int)1280, height=(int)720, framerate=30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720' ! tee name = t t. ! omxh265enc ! 'video/x-h265' ! h265parse ! qtmux ! filesink location=test_h265.mp4 -e t. ! nvvidconv

The generated MP4 file can be decoded in gscam. But if I generate it in gscam via the config, like this:

export GSCAM_CONFIG="fpgasrc channel=0 do-timestamp="TRUE" ! queue ! video/x-raw(memory:NVMM), format=YUY2, width=(int)1280, height=(int)720, framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720 ! tee name = t t. ! queue ! omxh265enc ! video/x-h265 ! h265parse ! qtmux ! filesink location=test_h265.mp4 -e t. ! nvvidconv"

I also tried splitmuxsink and got the same result:

export GSCAM_CONFIG="fpgasrc channel=0 do-timestamp="TRUE" ! queue ! video/x-raw(memory:NVMM), format=YUY2, width=(int)1280, height=(int)720, framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM), format=(string)NV12, width=(int)1280, height=(int)720 ! tee name = t t. ! queue ! omxh265enc ! video/x-h265 ! h265parse ! splitmuxsink location=test_h265.mp4 t. ! nvvidconv"

The MP4 file is generated successfully, but it cannot be decoded by gscam or even by gst-launch-1.0. If I try to decode it with gst-launch-1.0:

gst-launch-1.0 filesrc location=test_h265.mp4 ! qtdemux name=demux demux.video_0 ! queue ! h265parse ! omxh265dec ! nvoverlaysink -e

and the error is:

qtdemux.c(701): gst_qtdemux_post_no_playable_stream_error (): /GstPipeline:pipeline0/GstQTDemux:demux: 
no known streams found
ERROR: pipeline doesn't want to preroll.

If I decode it in gscam, the GSCAM_CONFIG is:

export GSCAM_CONFIG="filesrc location=/home/administrator/luochangping/gstreamer_test/test_h265.mp4 ! qtdemux name=demux demux.video_0 ! queue ! h265parse ! omxh265dec ! videoconvert"

and the process blocks.

Can anybody help me? Thanks a lot.

gscam_config For AC-IMX390 Cameras

Hi,
I am using a GMSL2 camera (model AC-IMX390) in my system, and my GStreamer pipeline runs successfully with gst-launch-1.0. Here is my pipeline:

gst-launch-1.0 udpsrc buffer_size=9216000 port=10005 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8,  width=(string)1920, height=(string)1080, colorimetry=(string)BT601-5, payload=(int)96, a-framerate=(string)60" ! rtpvrawdepay ! videoconvert ! fpsdisplaysink name=fps video-sink=autovideosink sync=false

I am getting an error when entering these parameters into the YAML file I created for the package. Can you share an example config file for these parameters so that I can understand my problem?
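
A hedged sketch of a parameter file, with two assumptions: the trailing fpsdisplaysink/autovideosink stage is removed, because the node attaches its own sink after the last element, and the wildcard node name is used so it applies regardless of how the node is launched. The pipeline details are copied from the gst-launch command above; everything else is a placeholder to adapt:

# imx390_params.yaml (hypothetical file name and values)
/**:
  ros__parameters:
    gscam_config: 'udpsrc buffer_size=9216000 port=10005 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string)1920, height=(string)1080, colorimetry=(string)BT601-5, payload=(int)96, a-framerate=(string)60" ! rtpvrawdepay ! videoconvert'
    image_encoding: 'rgb8'
    camera_name: 'imx390'
    frame_id: 'imx390_frame'

It could then be launched with something like: ros2 run gscam2 gscam_main --ros-args --params-file ./imx390_params.yaml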

Error while building docker from dockerfile

I am encountering an issue while building a Docker image from the Dockerfile in gscam2:

[dev_container_auto_added_stage_label 10/10] RUN /bin/bash -c "source /opt/ros/foxy/setup.bash && colcon build":
#0 0.846 Starting >>> ros2_shared
#0 2.014 Finished <<< ros2_shared [1.17s]
#0 2.015 Starting >>> gscam2
#0 3.526 --- stderr: gscam2
#0 3.526 CMake Error at CMakeLists.txt:47 (add_library):
#0 3.526   Target "gscam_node" links to target
#0 3.526   "camera_calibration_parsers::camera_calibration_parsers" but the target was
#0 3.526   not found.  Perhaps a find_package() call is missing for an IMPORTED
#0 3.526   target, or an ALIAS target is missing?
#0 3.526 
#0 3.526 
#0 3.526 CMake Error at CMakeLists.txt:47 (add_library):
#0 3.526   Target "gscam_node" links to target
#0 3.526   "camera_info_manager::camera_info_manager" but the target was not found.
#0 3.526   Perhaps a find_package() call is missing for an IMPORTED target, or an
#0 3.526   ALIAS target is missing?
#0 3.526 
#0 3.526 
#0 3.526 CMake Error at CMakeLists.txt:47 (add_library):
#0 3.526   Target "gscam_node" links to target "rclcpp_components::component" but the
#0 3.526   target was not found.  Perhaps a find_package() call is missing for an
#0 3.526   IMPORTED target, or an ALIAS target is missing?
[2023-05-12T03:59:34.981Z] #0 3.526 
#0 3.526 
#0 3.526 CMake Error at CMakeLists.txt:47 (add_library):
#0 3.526   Target "gscam_node" links to target "sensor_msgs::sensor_msgs_library" but
#0 3.526   the target was not found.  Perhaps a find_package() call is missing for an
#0 3.526   IMPORTED target, or an ALIAS target is missing?
#0 3.526 
#0 3.526 
#0 3.526 CMake Error at CMakeLists.txt:78 (add_library):
#0 3.526   Target "subscriber_node" links to target "rclcpp_components::component" but
#0 3.526   the target was not found.  Perhaps a find_package() call is missing for an
#0 3.526   IMPORTED target, or an ALIAS target is missing?
#0 3.526 
#0 3.526 
#0 3.526 CMake Error at CMakeLists.txt:78 (add_library):
#0 3.526   Target "subscriber_node" links to target "sensor_msgs::sensor_msgs_library"
#0 3.526   but the target was not found.  Perhaps a find_package() call is missing for
#0 3.526   an IMPORTED target, or an ALIAS target is missing?
#0 3.526 
#0 3.526 
#0 3.526 CMake Error at CMakeLists.txt:115 (add_executable):
#0 3.526   Target "ipc_test_main" links to target "sensor_msgs::sensor_msgs_library"
#0 3.526   but the target was not found.  Perhaps a find_package() call is missing for
#0 3.526   an IMPORTED target, or an ALIAS target is missing?
#0 3.526 
#0 3.526 
#0 3.526 CMake Error at /opt/ros/foxy/share/ament_cmake_gtest/cmake/ament_add_gtest_executable.cmake:50 (add_executable):
#0 3.526   Target "smoke_test" links to target "sensor_msgs::sensor_msgs_library" but
#0 3.526   the target was not found.  Perhaps a find_package() call is missing for an
#0 3.526   IMPORTED target, or an ALIAS target is missing?
#0 3.526 Call Stack (most recent call first):
#0 3.526   /opt/ros/foxy/share/ament_cmake_gtest/cmake/ament_add_gtest_executable.cmake:37 (_ament_add_gtest_executable)
#0 3.526   /opt/ros/foxy/share/ament_cmake_gtest/cmake/ament_add_gtest.cmake:68 (ament_add_gtest_executable)
#0 3.526   CMakeLists.txt:134 (ament_add_gtest)

What should I do to resolve this problem? Thank you in advance!

gstreamer search paths for custom elements

I'd like to use this package with aravissrc for GenICam industrial cameras.

Aravis installs libgstaravis.0.8.so to /usr/local/lib/x86_64-linux-gnu/, and I can load it with:
gst-launch-1.0 --gst-plugin-path="/usr/local/lib/x86_64-linux-gnu" aravissrc camera-name="Aravis-Fake-GV01" ! videoconvert ! ximagesink

How do I pass a gst-plugin-path to gscam2? I'd like to avoid relying on environment variables.
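
The node exposes a gst_plugin_path parameter (it shows up as "gst_plugin_path = " in the startup logs quoted in other issues here), so the plugin directory can be passed as an ordinary ROS parameter rather than via GST_PLUGIN_PATH. A sketch, assuming the config should end in videoconvert rather than ximagesink because the node appends its own sink:

ros2 run gscam2 gscam_main --ros-args \
  -p gst_plugin_path:=/usr/local/lib/x86_64-linux-gnu \
  -p gscam_config:='aravissrc camera-name="Aravis-Fake-GV01" ! videoconvert'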

Set up CI

Use GitHub Actions to build on Foxy, Galactic, and Rolling.

Unreliable connection: Restart pipeline on EOS?

I am using this driver on a Jetson Orin NX and am experiencing issues where the camera stream seems to be interrupted if a broken frame occurs (similar to the issue described here). The consequence is that the gstreamer pipeline needs to be restarted in that case.

Right now, if this happens, the driver is stuck on this line. I noticed that gst_app_sink_is_eos() returns true when this happens. I am no GStreamer expert, so I am wondering if there is an option to automatically restart the pipeline in that case. Otherwise, would it make sense to implement this in the driver, or at least add an option to restart the pipeline when this error occurs?

This is the output I get if the failure occurs:

[INFO] [1701451092.397209225] [robomaster_1.camera_0.gscam_publisher]: use_intra_process_comms=0
[INFO] [1701451092.398496295] [robomaster_1.camera_0.gscam_publisher]: gst_plugin_path = 
[INFO] [1701451092.398561705] [robomaster_1.camera_0.gscam_publisher]: gscam_config = nvarguscamerasrc sensor-id=0 sensor-mode=1 ! video/x-raw(memory:NVMM),width=1920,height=1080,framerate=20/1' ! nvvidconv ! videoflip method=vertical-flip ! jpegenc quality=95
[INFO] [1701451092.398617418] [robomaster_1.camera_0.gscam_publisher]: sync_sink = true
[INFO] [1701451092.398651627] [robomaster_1.camera_0.gscam_publisher]: preroll = false
[INFO] [1701451092.398673739] [robomaster_1.camera_0.gscam_publisher]: use_gst_timestamps = true
[INFO] [1701451092.398696620] [robomaster_1.camera_0.gscam_publisher]: image_encoding = jpeg
[INFO] [1701451092.398731757] [robomaster_1.camera_0.gscam_publisher]: camera_info_url = file:///opt/robomaster/front_camera.yaml
[INFO] [1701451092.398753485] [robomaster_1.camera_0.gscam_publisher]: camera_name = front_camera
[INFO] [1701451092.398776046] [robomaster_1.camera_0.gscam_publisher]: frame_id = front_camera_frame
[INFO] [1701451092.398852880] [robomaster_1.camera_0.gscam_publisher]: camera calibration URL: file:///opt/robomaster/front_camera.yaml
[INFO] [1701451092.399823751] [robomaster_1.camera_0.gscam_publisher]: Loaded camera calibration from file:///opt/robomaster/front_camera.yaml
[INFO] [1701451092.428750356] [robomaster_1.camera_0.gscam_publisher]: Gstreamer initialized
[INFO] [1701451092.428905048] [robomaster_1.camera_0.gscam_publisher]: Gstreamer version: GStreamer 1.16.3
[INFO] [1701451092.483984944] [robomaster_1.camera_0.gscam_publisher]: Time offset: 1701356423320895203
[INFO] [1701451092.484465244] [robomaster_1.camera_0.gscam_publisher]: Stream is paused
[INFO] [1701451092.487751786] [robomaster_1.camera_0.gscam_publisher]: Pipeline running
[INFO] [1701451092.488373400] [robomaster_1.camera_0.gscam_publisher]: Thread exit
[INFO] [1701451092.489834459] [robomaster_1.camera_0.gscam_publisher]: Thread running
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3840 x 2160 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 22.250000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 1 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 59.999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
CONSUMER: ERROR OCCURRED
[ERROR] [1701451133.382104885] [robomaster_1.camera_0.gscam_publisher]: Could not get sample, pause for 1s

Perhaps there is a way to read the exact error that the consumer throws for further debugging or error handling.

Thanks for clarifying, and thanks for creating this driver!

tee and filesink do not work if the stream is interrupted

It would be nice to add a tee to the configuration and write the stream to a file while also generating ROS images. When tested with live streams (camera or udpsrc), the ROS images are generated correctly, but the resulting MP4 file will not open. I suspect the problem is that the file is not closed correctly when the node is shut down.

This gst-launch-1.0 command works correctly -- note the "-e" option, which generates an EOS event before shutting down the pipeline:

gst-launch-1.0 -e -v v4l2src device=/dev/video1 do-timestamp=true ! queue ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! tee name=fork ! queue ! mp4mux ! filesink location=save.mp4 fork. ! queue ! avdec_h264 ! autovideosink

This similar gscam2 configuration doesn't generate a usable file:

v4l2src device=/dev/video1 do-timestamp=true ! queue ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! tee name=fork ! queue ! mp4mux ! filesink location=save.mp4 fork. ! avdec_h264 ! videoconvert

Related to #3
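
One hedged workaround until the node sends EOS on shutdown: mp4mux only writes a playable file after it receives EOS and can emit the moov atom, whereas Matroska stays readable in most tools even when the pipeline is killed mid-stream, so muxing to .mkv (or using mp4mux with a fragment-duration so the file is written incrementally) sidesteps the unplayable-file problem:

# same pipeline as above, with matroskamux/filesink on the recording branch and a queue on the display branch
v4l2src device=/dev/video1 do-timestamp=true ! queue ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! tee name=fork ! queue ! matroskamux ! filesink location=save.mkv fork. ! queue ! avdec_h264 ! videoconvert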

Autovideosink "assertion: outpad failed"

Hello,

I am extremely new to GStreamer pipelines, but have a good background in image processing. I am doing some initial testing on a USB video camera before incorporating gscam2 into my platform. I'm having some issues getting the camera to run with the package.

For example, running GStreamer directly pulls up a nice window streaming the image from the camera:

gst-launch-1.0 v4l2src device=/dev/video0 ! autovideosink

However, when using that pipeline string in the gscam2 package, I get this output:

ros2 launch gscam2 composition_launch.py 
[INFO] [launch]: All log files can be found below /home/mike/.ros/log/2021-11-18-14-13-06-220931-mrawding-razor-302343
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [component_container-1]: process started with pid [302355]
[component_container-1] [INFO] [1637262786.604877588] [my_container]: Load Library: /home/mike/Desktop/latest/ffmpeg/sitcore/mav_platform_r2/install/gscam2/lib/libgscam_node.so
[component_container-1] [INFO] [1637262786.610115733] [my_container]: Found class: rclcpp_components::NodeFactoryTemplate<gscam2::GSCamNode>
[component_container-1] [INFO] [1637262786.610131686] [my_container]: Instantiate class: rclcpp_components::NodeFactoryTemplate<gscam2::GSCamNode>
[component_container-1] [INFO] [1637262786.613842975] [image_publisher]: use_intra_process_comms=1
[component_container-1] [INFO] [1637262786.614108234] [image_publisher]: gscam_config = v4l2src device=/dev/video0 ! queue ! autovideosink
[component_container-1] [INFO] [1637262786.614122460] [image_publisher]: sync_sink = false
[component_container-1] [INFO] [1637262786.614128142] [image_publisher]: preroll = false
[component_container-1] [INFO] [1637262786.614134373] [image_publisher]: use_gst_timestamps = true
[component_container-1] [INFO] [1637262786.614138891] [image_publisher]: image_encoding = rgb8
[component_container-1] [INFO] [1637262786.614143296] [image_publisher]: camera_info_url = 
[component_container-1] [INFO] [1637262786.614147713] [image_publisher]: camera_name = 
[component_container-1] [INFO] [1637262786.614153305] [image_publisher]: frame_id = camera_frame
[component_container-1] [INFO] [1637262786.614172899] [image_publisher]: using default calibration URL
[component_container-1] [INFO] [1637262786.614189877] [image_publisher]: camera calibration URL: file:///home/mike/.ros/camera_info/camera.yaml
[component_container-1] [ERROR] [1637262786.614219126] [camera_calibration_parsers]: Unable to open camera calibration file [/home/mike/.ros/camera_info/camera.yaml]
[component_container-1] [WARN] [1637262786.614224830] [image_publisher]: Camera calibration file /home/mike/.ros/camera_info/camera.yaml not found
[component_container-1] [INFO] [1637262786.614238170] [image_publisher]: Loaded camera calibration from 
[component_container-1] [INFO] [1637262786.622444346] [image_publisher]: Gstreamer initialized
[component_container-1] [INFO] [1637262786.622468422] [image_publisher]: Gstreamer version: GStreamer 1.16.2
[component_container-1] **
[component_container-1] ERROR:/home/mike/Desktop/latest/ffmpeg/sitcore/mav_platform_r2/src/drivers/rb5/gscam2/src/gscam_node.cpp:168:bool gscam2::GSCamNode::impl::create_pipeline(): assertion failed: (outpad)
[component_container-1] Bail out! ERROR:/home/mike/Desktop/latest/ffmpeg/sitcore/mav_platform_r2/src/drivers/rb5/gscam2/src/gscam_node.cpp:168:bool gscam2::GSCamNode::impl::create_pipeline(): assertion failed: (outpad)
[ERROR] [component_container-1]: process has died [pid 302355, exit code -6, cmd '/opt/ros/galactic/lib/rclcpp_components/component_container --ros-args -r __node:=my_container -r __ns:=/'].
^C[WARNING] [launch]: user interrupted with ctrl-c (SIGINT)
[WARNING] [launch_ros.actions.load_composable_nodes]: Abandoning wait for the '/my_container/_container/load_node' service response, due to shutdown

If I take away autovideosink, I get an endless loop of "Could not get sample, pause for 1s".

I don't think this is an issue with the code base at all; it's definitely a user issue. The camera is a generic webcam, and it seems to work fine with the gst-launch command. Is there something I can add to the gscam2 config string to get this to run?

I would appreciate any and all help. Thanks!

Issue with linking when building in Yocto

I tried to include this package in meta-ros2-foxy by adding Yocto recipes and compiling it. However, it fails with the following:
CMakeFiles/gscam_main.dir/src/gscam_node.cpp.o: undefined reference to symbol 'gst_base_sink_get_type'
| ..../recipe-sysroot/usr/lib/libgstbase-1.0.so.0: error adding symbols: DSO missing from command line

I notice in CMakeCache.txt that under gscam_node_LIB_DEPENDS I see gstapp-1.0.so, but gstbase-1.0 and gstreamer-1.0 are missing, which is causing the above error. However, gstbase-1.0 and gstreamer-1.0 are installed to the same location as gstapp-1.0, and the library files contain the required symbol.

I am at a complete loss as to why this failure is happening and how to go about debugging it. I am neither a Yocto build expert nor a CMake expert, but I have done all the due diligence I know how to in order to ensure that all the libraries are available and the paths are set properly.

Can someone please help?!

use_gst_timestamps doesn't work

I'm seeing huge timestamps... it appears that my ROS 2 port is producing bogus timestamps when use_gst_timestamps is true.

assertion failed

Hi, I am trying to use the gscam2 package, but I receive the following error:

[INFO] [launch]: Default logging verbosity is set to INFO
~/ros2_gscam/install/gscam2/share/gscam2/cfg
~/ros2_gscam/install/gscam2/share/gscam2/cfg/params.yaml
file://~/ros2_gscam/install/gscam2/share/gscam2/cfg/my_camera.ini
[INFO] [gscam_main-1]: process started with pid [5640]
[gscam_main-1] [INFO] [1621338577.303952503] [my_camera.gscam_publisher]: use_intra_process_comms=0
[gscam_main-1] [INFO] [1621338577.304175845] [my_camera.gscam_publisher]: gscam_config = tcambin ! videoconvert ! autovideosink
[gscam_main-1] [INFO] [1621338577.304188258] [my_camera.gscam_publisher]: sync_sink = true
[gscam_main-1] [INFO] [1621338577.304195071] [my_camera.gscam_publisher]: preroll = false
[gscam_main-1] [INFO] [1621338577.304201163] [my_camera.gscam_publisher]: use_gst_timestamps = false
[gscam_main-1] [INFO] [1621338577.304207014] [my_camera.gscam_publisher]: image_encoding = rgb8
[gscam_main-1] [INFO] [1621338577.304212985] [my_camera.gscam_publisher]: camera_info_url = file://~/ros2_gscam/install/gscam2/share/gscam2/cfg/my_camera.ini
[gscam_main-1] [INFO] [1621338577.304219037] [my_camera.gscam_publisher]: camera_name = my_camera
[gscam_main-1] [INFO] [1621338577.304224837] [my_camera.gscam_publisher]: frame_id = my_camera_frame
[gscam_main-1] [INFO] [1621338577.304244154] [my_camera.gscam_publisher]: camera calibration URL: file://~/ros2_gscam/install/gscam2/share/gscam2/cfg/my_camera.ini
[gscam_main-1] [INFO] [1621338577.304408895] [my_camera.gscam_publisher]: Loaded camera calibration from file://~/ros2_gscam/install/gscam2/share/gscam2/cfg/my_camera.ini
[gscam_main-1] [INFO] [1621338577.323889448] [my_camera.gscam_publisher]: Gstreamer initialized
[gscam_main-1] [INFO] [1621338577.323942308] [my_camera.gscam_publisher]: Gstreamer version: GStreamer 1.16.2
[gscam_main-1] **
[gscam_main-1] ERROR:~/ros2_gscam/src/gscam2/src/gscam_node.cpp:168:bool gscam2::GSCamNode::impl::create_pipeline(): assertion failed: (outpad)
[gscam_main-1] Bail out! ERROR:~/ros2_gscam/src/gscam2/src/gscam_node.cpp:168:bool gscam2::GSCamNode::impl::create_pipeline(): assertion failed: (outpad)
[ERROR] [gscam_main-1]: process has died [pid 5640, exit code -6, cmd '~/ros2_gscam/install/gscam2/lib/gscam2/gscam_main --ros-args -r __node:=gscam_publisher -r __ns:=/my_camera --params-file ~/ros2_gscam/install/gscam2/share/gscam2/cfg/params.yaml --params-file /tmp/launch_params_1x_of_nr -r /image_raw:=/my_camera/image_raw -r /camera_info:=/my_camera/camera_info'].

The video stream can be visualized without any problems with:

gst-launch-1.0 tcambin ! videoconvert ! autovideosink

I also tried it with another network camera and received the same error. In this case the video stream can also be visualized without any problems with GStreamer:

gst-launch-1.0 udpsrc port=53260 ! application/x-rtp, media=video, encoding-name=JPEG  ! rtpjpegdepay ! jpegparse ! jpegdec ! autovideosink sync=false

Can the node also be used without the calibration file? I only need the raw image data in ROS.

Thank you for your answer.
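
Both failing configs end in autovideosink; the node tries to link its own sink to the last element of the config, and a display sink has no source pad, which is what the (outpad) assertion is checking. Hedged sketches of the same configs with the display sink dropped:

gscam_config = tcambin ! videoconvert

# or, for the network camera:
gscam_config = udpsrc port=53260 ! application/x-rtp, media=video, encoding-name=JPEG ! rtpjpegdepay ! jpegparse ! jpegdec ! videoconvert

On the calibration question: judging by the logs in other issues in this tracker, a missing calibration file only produces a warning and image_raw is still published.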

no image being published with gscam2

I am using ROS 2 Galactic built on Yocto dunfell through meta-ros with the Poky distro. ROS 2 works well for me.
I built the gscam2 package successfully and was able to run the gscam_main node.
But when I run the gscam_main node, I do not get any output stream. It shows just one image and that's all. The camera is on at all times, though.

my gst pipeline is "v4l2src device=/dev/video2 ! 'video/x-raw, width=640,height=480,framerate=30/1' ! waylandsink ! videoconvert"
I export the pipeline to GSCAM_CONFIG, and run: ros2 run gscam2 gscam_main

commands:
export GSCAM_CONFIG="v4l2src device=/dev/video2 ! 'video/x-raw, width=640,height=480' ! waylandsink ! videoconvert"
ros2 run gscam2 gscam_main

root@user:~# ros2 run gscam2 gscam_main
1633723466.886373 [0] gscam_main: selected interface "lo" is not multicast-capable: disabling multicast
[INFO] [1633723466.895408561] [gscam_publisher]: use_intra_process_comms=0
[INFO] [1633723466.895880869] [gscam_publisher]: gscam_config =
[INFO] [1633723466.895920177] [gscam_publisher]: sync_sink = true
[INFO] [1633723466.895949715] [gscam_publisher]: preroll = false
[INFO] [1633723466.895976254] [gscam_publisher]: use_gst_timestamps = false
[INFO] [1633723466.896002715] [gscam_publisher]: image_encoding = rgb8
[INFO] [1633723466.896028177] [gscam_publisher]: camera_info_url =
[INFO] [1633723466.896057484] [gscam_publisher]: camera_name =
[INFO] [1633723466.896083561] [gscam_publisher]: frame_id = camera_frame
[INFO] [1633723466.896111023] [gscam_publisher]: Using GSCAM_CONFIG env var: v4l2src device=/dev/video3 ! video/x-raw,width=640,height=480,framerate=30/1 ! waylandsink ! videoconvert
[INFO] [1633723466.896145177] [gscam_publisher]: using default calibration URL
[INFO] [1633723466.896180484] [gscam_publisher]: camera calibration URL: file:///home/root/.ros/camera_info/camera.yaml
[ERROR] [1633723466.896272484] [camera_calibration_parsers]: Unable to open camera calibration file [/home/root/.ros/camera_info/camera.yaml]
[WARN] [1633723466.896385484] [gscam_publisher]: Camera calibration file /home/root/.ros/camera_info/camera.yaml not found
[INFO] [1633723466.896421792] [gscam_publisher]: Loaded camera calibration from
[INFO] [1633723466.915097792] [gscam_publisher]: Gstreamer initialized
[INFO] [1633723466.915240792] [gscam_publisher]: Gstreamer version: GStreamer 1.16.3
[INFO] [1633723467.004341715] [gscam_publisher]: Time offset: 1633723071460949386
[INFO] [1633723467.006471638] [gscam_publisher]: Stream is paused
[INFO] [1633723467.008835946] [gscam_publisher]: Pipeline running
[INFO] [1633723467.009220561] [gscam_publisher]: Thread running

The camera remains on until I stop the node, but the video stream pauses after displaying just the very first frame. If I remove "waylandsink" from my config and export simply "v4l2src device=/dev/video3 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert", I do not get any video window, but image_raw data is published, which I can subscribe to with other nodes such as the usb_cam package's show_image.py, and that way I do get a video stream.
But I want a video display using gscam itself.
Any inputs would be nice
thank you
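
A hedged reading of this: the node links its appsink after the last element of the config, so with waylandsink sitting in the middle of the string the buffers never reach the videoconvert/appsink stage properly, which is why the waylandsink-free variant publishes image_raw correctly. To get both a local Wayland window and ROS images from one node, a tee with a queue on each branch is one option to try (the node's sink then attaches after the final videoconvert):

export GSCAM_CONFIG="v4l2src device=/dev/video2 ! video/x-raw,width=640,height=480,framerate=30/1 ! tee name=t t. ! queue ! waylandsink t. ! queue ! videoconvert"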

gstreamer assertion in gst_buffer_resize_range

I am getting the following assertion when running the example pipeline:
gst_buffer_resize_range: assertion 'gst_buffer_is_writable (buffer)' failed

The pipeline:
export GSCAM_CONFIG='videotestsrc pattern=snow ! video/x-raw,width=1280,height=720 ! videoconvert'

cmd:
ros2 run gscam gscam_main

My system:
Ubuntu 20.04 64Bit
gstreamer 1.16.2
ROS 2 Foxy Fitzroy

I also tested the original gscam from ROS, but got the same assertion.

Node parameters?

I have some ROS 1 Melodic code using the original gscam and I am in the process of updating it to ROS 2. The original gscam takes a number of params to manage things such as sync_sink and frame_id. My overall ROS 2 experience is a bit limited, so I might be missing something, but I don't see arguments getting passed into this version of the code, and it looks like the other params are just hard-coded. Is this the case, or am I missing where the args are passed to the node? If these params are still available, can they be added to the README so we know what we can and cannot pass to the node?
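
For reference, the parameters do exist in the ROS 2 port as ordinary node parameters; the startup logs quoted in other issues here list them (gscam_config, sync_sink, preroll, use_gst_timestamps, image_encoding, camera_info_url, camera_name, frame_id, and on newer versions gst_plugin_path). A sketch of overriding a few of them from the command line:

ros2 run gscam2 gscam_main --ros-args \
  -p gscam_config:='videotestsrc ! video/x-raw,width=640,height=480 ! videoconvert' \
  -p sync_sink:=false \
  -p frame_id:=my_camera_frame

They can also go in a --params-file YAML, as the cfg/params.yaml used by the launch files does.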

GScam doesn't support Tee

ROS gscam doesn't seem to support tee: I never get an image stream, nor do I get any error output. Here is the tee that I use; from my research, this seems to be a common problem. A workaround is to launch the tee pipeline with a udpsink end in Python and use a udpsrc as my gscam config.

Here is my example gscam config with a tee:

nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=15/1 ! nvvidconv flip-method=2 ! tee name=t1 t1. ! queue silent=true ! nvv4l2h264enc bitrate=8000000 ! h264parse config-interval=1 ! queue silent=true leaky=downstream ! rndbuffersize max=65000 ! udpsink host=127.0.0.1 port=5001 sync=false t1. ! video/x-raw, format=BGRx ! videoconvert'

Is this a known issue? If so, is there any plan on resolving it?
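
Two things worth ruling out before concluding that tee is unsupported (both hedged guesses): every tee branch normally needs its own queue, or the branches can stall silently, which would match "no images, no errors"; and all tee pads carry the same caps, so converting out of NVMM memory has to happen per branch, after the tee, rather than one branch asking for plain video/x-raw while the other stays in NVMM. A sketch along those lines, with the stray trailing quote dropped:

export GSCAM_CONFIG="nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM),width=1280,height=720,framerate=15/1 ! nvvidconv flip-method=2 ! tee name=t1 t1. ! queue silent=true ! nvv4l2h264enc bitrate=8000000 ! h264parse config-interval=1 ! queue silent=true leaky=downstream ! rndbuffersize max=65000 ! udpsink host=127.0.0.1 port=5001 sync=false t1. ! queue ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert"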

Publishing ROS along with streaming on UDP port

@clydemcqueen I am trying to run ROS on a BlueROV (publish a camera topic) and also pass the video stream to UDP (similar to the original BlueROV setup). The idea is to interrupt the BlueROV manually if it does something crazy. We are trying to run some computer vision algorithms for object detection.

Do you think this is even possible? Do you have any ideas about that?

Thanks for the great repo. I will be following your work.

gscam2 pipeline with udp source on an Nvidia Jetson

I need a bit of help with the pipeline string. I have a UDP camera connected, and I can visualize the video with this gst-launch-1.0 command:

gst-launch-1.0 udpsrc port=50008 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! fpsdisplaysink

I'm trying to use that as an input to gscam by setting the following env variable:

export GSCAM_CONFIG='udpsrc port=50008 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv"

However I get the following error:

[FATAL] [1715812318.234746933] [gscam_publisher]: Cannot link outelement("nvvconv0") -> sink

I tried a simple pipeline with 'videotestsrc ! video/x-raw, format=BGRx ! videoconvert' and that works, but not with the above settings. Any ideas on what is wrong? The following is the full output:

$ export GSCAM_CONFIG='udpsrc port=50008 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv'

$ ros2 run gscam2 gscam_main --ros-args -p camera_info_url:=file://$PWD/my_camera.ini
[INFO] [1715812631.997915183] [gscam_publisher]: use_intra_process_comms=0
[INFO] [1715812631.998675774] [gscam_publisher]: gst_plugin_path = 
[INFO] [1715812631.998721311] [gscam_publisher]: gscam_config = 
[INFO] [1715812631.998743583] [gscam_publisher]: sync_sink = true
[INFO] [1715812631.998759232] [gscam_publisher]: preroll = false
[INFO] [1715812631.998773184] [gscam_publisher]: use_gst_timestamps = false
[INFO] [1715812631.998790176] [gscam_publisher]: image_encoding = rgb8
[INFO] [1715812631.998806081] [gscam_publisher]: camera_info_url = file:///home/jaime/kaya_ws/src/gscam2/cfg/my_camera.ini
[INFO] [1715812631.998822785] [gscam_publisher]: camera_name = 
[INFO] [1715812631.998838017] [gscam_publisher]: frame_id = camera_frame
[INFO] [1715812631.998857410] [gscam_publisher]: Using GSCAM_CONFIG env var: udpsrc port=50008 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv
[INFO] [1715812631.998914723] [gscam_publisher]: camera calibration URL: file:///home/jaime/kaya_ws/src/gscam2/cfg/my_camera.ini
[INFO] [1715812631.999182792] [gscam_publisher]: Loaded camera calibration from file:///home/jaime/kaya_ws/src/gscam2/cfg/my_camera.ini
[INFO] [1715812632.011161754] [gscam_publisher]: Gstreamer initialized
[INFO] [1715812632.011244668] [gscam_publisher]: Gstreamer version: GStreamer 1.16.3
[FATAL] [1715812632.069449652] [gscam_publisher]: Cannot link outelement("nvvconv0") -> sink

[INFO] [1715812632.069807291] [gscam_publisher]: Pipeline deleted
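
A guess at the cause: nvvidconv left without output caps tends to negotiate NVMM (device) memory on its source pad, which the RGB appsink the node appends cannot link against, hence the failed link on nvvconv0. A sketch that forces system-memory BGRx out of nvvidconv and ends in videoconvert, mirroring the videotestsrc config that already works:

export GSCAM_CONFIG='udpsrc port=50008 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! queue ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert'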

Pipeline Launches but No Data Published

Hello there,

I am having an issue where my pipeline comes up totally fine when launching the node_params launch file, but no data is being published to the topics. I have tried multiple versions of my pipeline, but none of them fix the problem. I would love any help or insight on what I might be overlooking or what issues there are in my setup. Thanks!

nvarguscamersrc ! video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 ! nvvidconv flip-method=0 ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR

I have tried many combinations with videoconvert placed elsewhere, but even the simplest pipeline I could make with nvarguscamerasrc does not work, so I am wondering if this is an NVIDIA bug.
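
One way to split the problem (a debugging sketch, not a fix): run the node with the synthetic test source used elsewhere in this tracker and check the topic rate; if images flow with this config but not with nvarguscamerasrc, the problem is in the Argus/NVMM stages rather than in gscam2 itself:

# known-good software source, no camera or NVMM involved
export GSCAM_CONFIG="videotestsrc ! video/x-raw,width=1280,height=720 ! videoconvert"
ros2 run gscam2 gscam_main &
ros2 topic hz /image_raw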
