
depthai-ros-examples

depthai-ros-examples's People

Contributors

charles-maheu, danilo-pejovic, daxoft, h-wata, jdavidberger, nindanaoto, roni-kreinin, saching13, sfalexrog, skammi


depthai-ros-examples's Issues

Unable to run yolo or mobilenet on Jetson NX

Unable to run yolo or mobilenet on a Jetson NX. The process is killed with this error: Object "/lib/aarch64-linux-gnu/libc.so.6", at 0x7f9ffc04f8, in raise, followed by Aborted (Signal sent by tkill() 30198 1000)

Host Device: Nvidia Jetson NX
OAK Model: OAK-D-POE

What's working

  • can run depthai_demo.py
  • can run roslaunch depthai_examples stereo_inertial_node.launch

Unable to run

  • roslaunch depthai_examples yolov4_publisher.launch camera_model:=OAK-D-POE
  • roslaunch depthai_examples mobile_publisher.launch
roslaunch depthai_examples yolov4_publisher.launch camera_model:=OAK-D-POE
... logging to /home/nvidia/.ros/log/aa7cf384-dbf0-11ec-8a96-48b02d2f7086/roslaunch-forOak-30154.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://192.168.1.66:41457/

SUMMARY
========

PARAMETERS
 * /robot_description: <?xml version="1....
 * /rosdistro: melodic
 * /rosversion: 1.14.13
 * /yolov4_publisher/LRchecktresh: 5
 * /yolov4_publisher/camera_param_uri: package://depthai...
 * /yolov4_publisher/confidence: 200
 * /yolov4_publisher/monoResolution: 400p
 * /yolov4_publisher/nnName: 
 * /yolov4_publisher/resourceBaseFolder: /home/nvidia/catk...
 * /yolov4_publisher/subpixel: True
 * /yolov4_publisher/sync_nn: True
 * /yolov4_publisher/tf_prefix: oak

NODES
  /
    oak_state_publisher (robot_state_publisher/robot_state_publisher)
    yolov4_publisher (depthai_examples/yolov4_spatial_node)

auto-starting new master
process[master]: started with pid [30167]
ROS_MASTER_URI=http://192.168.1.66:11311/

setting /run_id to aa7cf384-dbf0-11ec-8a96-48b02d2f7086
process[rosout-1]: started with pid [30180]
started core service [/rosout]
process[oak_state_publisher-2]: started with pid [30191]
process[yolov4_publisher-3]: started with pid [30198]
terminate called after throwing an instance of 'std::__ios_failure'
  what():  basic_filebuf::underflow error reading the file: iostream error
Stack trace (most recent call last):
#0    Object "/lib/aarch64-linux-gnu/libc.so.6", at 0x7f9ffc04f8, in raise
Aborted (Signal sent by tkill() 30198 1000)
================================================================================REQUIRED process [yolov4_publisher-3] has died!
process has died [pid 30198, exit code -6, cmd /home/nvidia/catkin_ws/devel/lib/depthai_examples/yolov4_spatial_node __name:=yolov4_publisher __log:=/home/nvidia/.ros/log/aa7cf384-dbf0-11ec-8a96-48b02d2f7086/yolov4_publisher-3.log].
log file: /home/nvidia/.ros/log/aa7cf384-dbf0-11ec-8a96-48b02d2f7086/yolov4_publisher-3*.log
Initiating shutdown!
================================================================================
[yolov4_publisher-3] killing on exit
[oak_state_publisher-2] killing on exit
[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done

Jetson NX: unable to read color image

Hi, I'm trying to use the ROS nodes for the OAK-D-LITE camera on a Jetson NX. The examples from depthai-python work correctly, but with rgb_stereo_node I get an error.

Information

  • platform: Jetson NX
  • os: # R34 (release), REVISION: 1.1, GCID: 30414990, BOARD: t186ref, EABI: aarch64, DATE: Tue May 17 04:20:55 UTC 2022
  • camera: OAK-D-LITE
  • ROS: ros-noetic 1.15.14
  • code version: main

How to reproduce

  • run roslaunch depthai_examples rgb_stereo_node.launch camera_model:=OAK-D-LITE
  • in a second terminal:
    • rostopic echo /rgb_stereo_publisher/stereo/depth --noarr (works)
    • rostopic echo /rgb_stereo_publisher/color/image --noarr (generates the error below)

Error

Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://akl-put:37413/

SUMMARY
========

PARAMETERS
 * /rgb_stereo_publisher/LRchecktresh: 5
 * /rgb_stereo_publisher/camera_param_uri: package://depthai...
 * /rgb_stereo_publisher/confidence: 200
 * /rgb_stereo_publisher/extended: False
 * /rgb_stereo_publisher/lrcheck: True
 * /rgb_stereo_publisher/subpixel: False
 * /rgb_stereo_publisher/tf_prefix: oak
 * /rosdistro: noetic
 * /rosversion: 1.15.14

NODES
  /
    rgb_stereo_publisher (depthai_examples/rgb_stereo_node)

auto-starting new master
process[master]: started with pid [9944]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to 6bceee72-e004-11ec-a249-48b02d2f6e98
process[rosout-1]: started with pid [9955]
started core service [/rosout]
process[rgb_stereo_publisher-2]: started with pid [9962]
Stack trace (most recent call last) in thread 9990:
#25   Object "/usr/lib/aarch64-linux-gnu/ld-2.31.so", at 0xffffffffffffffff, in 
#24   Source "../sysdeps/unix/sysv/linux/aarch64/clone.S", line 78, in thread_start [0xffff9c51722b]
#23   Source "/build/glibc-70LW9c/glibc-2.31/nptl/pthread_create.c", line 477, in start_thread [0xffff9dd0251b]
#22   Object "/usr/lib/aarch64-linux-gnu/libstdc++.so.6.0.28", at 0xffff9c6a9fab, in 
#21   Source "/usr/include/c++/9/thread", line 195, in _M_run [0xffff9cef7ec3]
        192: 	{ }
        193: 
        194: 	void
      > 195: 	_M_run() { _M_func(); }
        196:       };
        197: 
        198:     void
#20   Source "/usr/include/c++/9/thread", line 251, in operator() [0xffff9cef7f47]
        248: 	{
        249: 	  using _Indices
        250: 	    = typename _Build_index_tuple<tuple_size<_Tuple>::value>::__type;
      > 251: 	  return _M_invoke(_Indices());
        252: 	}
        253:       };
#19   Source "/usr/include/c++/9/thread", line 244, in _M_invoke<0> [0xffff9cef7fbb]
        241: 	template<size_t... _Ind>
        242: 	  typename __result<_Tuple>::type
        243: 	  _M_invoke(_Index_tuple<_Ind...>)
      > 244: 	  { return std::__invoke(std::get<_Ind>(std::move(_M_t))...); }
        245: 
        246: 	typename __result<_Tuple>::type
        247: 	operator()()
#18   Source "/usr/include/c++/9/bits/invoke.h", line 95, in __invoke<dai::DataOutputQueue::DataOutputQueue(std::shared_ptr<dai::XLinkConnection>, const string&, unsigned int, bool)::<lambda()> > [0xffff9cef80cf]
         92:       using __result = __invoke_result<_Callable, _Args...>;
         93:       using __type = typename __result::type;
         94:       using __tag = typename __result::__invoke_type;
      >  95:       return std::__invoke_impl<__type>(__tag{}, std::forward<_Callable>(__fn),
         96: 					std::forward<_Args>(__args)...);
         97:     }
#17   Source "/usr/include/c++/9/bits/invoke.h", line 60, in __invoke_impl<void, dai::DataOutputQueue::DataOutputQueue(std::shared_ptr<dai::XLinkConnection>, const string&, unsigned int, bool)::<lambda()> > [0xffff9cef81a3]
         57:   template<typename _Res, typename _Fn, typename... _Args>
         58:     constexpr _Res
         59:     __invoke_impl(__invoke_other, _Fn&& __f, _Args&&... __args)
      >  60:     { return std::forward<_Fn>(__f)(std::forward<_Args>(__args)...); }
         61: 
         62:   template<typename _Res, typename _MemFun, typename _Tp, typename... _Args>
         63:     constexpr _Res
#16   Source "/tmp/depthai-core/src/device/DataQueue.cpp", line 67, in operator() [0xffff9cef4b6b]
#15   Source "/usr/include/c++/9/bits/std_function.h", line 688, in operator() [0xffff9cef8cc7]
        685:     {
        686:       if (_M_empty())
        687: 	__throw_bad_function_call();
      > 688:       return _M_invoker(_M_functor, std::forward<_ArgTypes>(__args)...);
        689:     }
        690: 
        691: #if __cpp_rtti
#14   Source "/usr/include/c++/9/bits/std_function.h", line 300, in _M_invoke [0xaaaace68badf]
        297:       static void
        298:       _M_invoke(const _Any_data& __functor, _ArgTypes&&... __args)
        299:       {
      > 300: 	(*_Base::_M_get_pointer(__functor))(
        301: 	    std::forward<_ArgTypes>(__args)...);
        302:       }
        303:     };
#13   Source "/usr/include/c++/9/functional", line 484, in operator()<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::shared_ptr<dai::ADatatype> > [0xaaaace68e653]
        481: 	{
        482: 	  return this->__call<_Result>(
        483: 	      std::forward_as_tuple(std::forward<_Args>(__args)...),
      > 484: 	      _Bound_indexes());
        485: 	}
        486: 
        487:       // Call as const
#12   Source "/usr/include/c++/9/functional", line 400, in __call<void, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >&&, std::shared_ptr<dai::ADatatype>&&, 0, 1, 2> [0xaaaace6903f3]
        397: 	_Result
        398: 	__call(tuple<_Args...>&& __args, _Index_tuple<_Indexes...>)
        399: 	{
      > 400: 	  return std::__invoke(_M_f,
        401: 	      _Mu<_Bound_args>()(std::get<_Indexes>(_M_bound_args), __args)...
        402: 	      );
        403: 	}
#11   Source "/usr/include/c++/9/bits/invoke.h", line 95, in __invoke<void (dai::ros::BridgePublisher<sensor_msgs::Image_<std::allocator<void> >, dai::ImgFrame>::*&)(std::__cxx11::basic_string<char>, std::shared_ptr<dai::ADatatype>), dai::ros::BridgePublisher<sensor_msgs::Image_<std::allocator<void> >, dai::ImgFrame>*&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::shared_ptr<dai::ADatatype> > [0xaaaace6921ef]
         92:       using __result = __invoke_result<_Callable, _Args...>;
         93:       using __type = typename __result::type;
         94:       using __tag = typename __result::__invoke_type;
      >  95:       return std::__invoke_impl<__type>(__tag{}, std::forward<_Callable>(__fn),
         96: 					std::forward<_Args>(__args)...);
         97:     }
#10   Source "/usr/include/c++/9/bits/invoke.h", line 73, in __invoke_impl<void, void (dai::ros::BridgePublisher<sensor_msgs::Image_<std::allocator<void> >, dai::ImgFrame>::*&)(std::__cxx11::basic_string<char>, std::shared_ptr<dai::ADatatype>), dai::ros::BridgePublisher<sensor_msgs::Image_<std::allocator<void> >, dai::ImgFrame>*&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::shared_ptr<dai::ADatatype> > [0xaaaace693d93]
         70:     __invoke_impl(__invoke_memfun_deref, _MemFun&& __f, _Tp&& __t,
         71: 		  _Args&&... __args)
         72:     {
      >  73:       return ((*std::forward<_Tp>(__t)).*__f)(std::forward<_Args>(__args)...);
         74:     }
         75: 
         76:   template<typename _Res, typename _MemPtr, typename _Tp>
#9    Source "/home/put/catkin_ws/src/luxonis/depthai-ros/depthai_bridge/include/depthai_bridge/BridgePublisher.hpp", line 309, in daiCallback [0xaaaace689323]
        306: void BridgePublisher<RosMsg, SimMsg>::daiCallback(std::string name, std::shared_ptr<ADatatype> data) {
        307:     // std::cout << "In callback " << name << std::endl;
        308:     auto daiDataPtr = std::dynamic_pointer_cast<SimMsg>(data);
      > 309:     publishHelper(daiDataPtr);
        310: }
        311: 
        312: template <class RosMsg, class SimMsg>
#8    Source "/home/put/catkin_ws/src/luxonis/depthai-ros/depthai_bridge/include/depthai_bridge/BridgePublisher.hpp", line 368, in publishHelper [0xaaaace68b757]
        365: #endif
        366: 
        367:     if(mainSubCount > 0 || infoSubCount > 0) {
      > 368:         _converter(inDataPtr, opMsgs);
        369: 
        370:         while(opMsgs.size()) {
        371:             RosMsg currMsg = opMsgs.front();
#7    Source "/usr/include/c++/9/bits/std_function.h", line 688, in operator() [0xaaaace68d6ff]
        685:     {
        686:       if (_M_empty())
        687: 	__throw_bad_function_call();
      > 688:       return _M_invoker(_M_functor, std::forward<_ArgTypes>(__args)...);
        689:     }
        690: 
        691: #if __cpp_rtti
#6    Source "/usr/include/c++/9/bits/std_function.h", line 300, in _M_invoke [0xaaaace688aab]
        297:       static void
        298:       _M_invoke(const _Any_data& __functor, _ArgTypes&&... __args)
        299:       {
      > 300: 	(*_Base::_M_get_pointer(__functor))(
        301: 	    std::forward<_ArgTypes>(__args)...);
        302:       }
        303:     };
#5    Source "/usr/include/c++/9/functional", line 484, in operator()<std::shared_ptr<dai::ImgFrame>, std::deque<sensor_msgs::Image_<std::allocator<void> >, std::allocator<sensor_msgs::Image_<std::allocator<void> > > >&> [0xaaaace68ac2b]
        481: 	{
        482: 	  return this->__call<_Result>(
        483: 	      std::forward_as_tuple(std::forward<_Args>(__args)...),
      > 484: 	      _Bound_indexes());
        485: 	}
        486: 
        487:       // Call as const
#4    Source "/usr/include/c++/9/functional", line 400, in __call<void, std::shared_ptr<dai::ImgFrame>&&, std::deque<sensor_msgs::Image_<std::allocator<void> >, std::allocator<sensor_msgs::Image_<std::allocator<void> > > >&, 0, 1, 2> [0xaaaace68cf37]
        397: 	_Result
        398: 	__call(tuple<_Args...>&& __args, _Index_tuple<_Indexes...>)
        399: 	{
      > 400: 	  return std::__invoke(_M_f,
        401: 	      _Mu<_Bound_args>()(std::get<_Indexes>(_M_bound_args), __args)...
        402: 	      );
        403: 	}
#3    Source "/usr/include/c++/9/bits/invoke.h", line 95, in __invoke<void (dai::ros::ImageConverter::*&)(std::shared_ptr<dai::ImgFrame>, std::deque<sensor_msgs::Image_<std::allocator<void> > >&), dai::ros::ImageConverter*&, std::shared_ptr<dai::ImgFrame>, std::deque<sensor_msgs::Image_<std::allocator<void> >, std::allocator<sensor_msgs::Image_<std::allocator<void> > > >&> [0xaaaace68f73f]
         92:       using __result = __invoke_result<_Callable, _Args...>;
         93:       using __type = typename __result::type;
         94:       using __tag = typename __result::__invoke_type;
      >  95:       return std::__invoke_impl<__type>(__tag{}, std::forward<_Callable>(__fn),
         96: 					std::forward<_Args>(__args)...);
         97:     }
#2    Source "/usr/include/c++/9/bits/invoke.h", line 73, in __invoke_impl<void, void (dai::ros::ImageConverter::*&)(std::shared_ptr<dai::ImgFrame>, std::deque<sensor_msgs::Image_<std::allocator<void> > >&), dai::ros::ImageConverter*&, std::shared_ptr<dai::ImgFrame>, std::deque<sensor_msgs::Image_<std::allocator<void> >, std::allocator<sensor_msgs::Image_<std::allocator<void> > > >&> [0xaaaace6914f7]
         70:     __invoke_impl(__invoke_memfun_deref, _MemFun&& __f, _Tp&& __t,
         71: 		  _Args&&... __args)
         72:     {
      >  73:       return ((*std::forward<_Tp>(__t)).*__f)(std::forward<_Args>(__args)...);
         74:     }
         75: 
         76:   template<typename _Res, typename _MemPtr, typename _Tp>
#1    Object "/home/put/catkin_ws/devel/.private/depthai_bridge/lib/libdepthai_bridge.so", at 0xffff9e39a193, in dai::ros::ImageConverter::toRosMsg(std::shared_ptr<dai::ImgFrame>, std::deque<sensor_msgs::Image_<std::allocator<void> >, std::allocator<sensor_msgs::Image_<std::allocator<void> > > >&)
#0    Object "/usr/lib/aarch64-linux-gnu/libopencv_optflow.so.4.2.0", at 0xffff9deee0ac, in cv::Mat::Mat(cv::Size_<int>, int, void*, unsigned long)
Segmentation fault (Address not mapped to object [0x65400000780])
================================================================================REQUIRED process [rgb_stereo_publisher-2] has died!
process has died [pid 9962, exit code -11, cmd /home/put/catkin_ws/devel/lib/depthai_examples/rgb_stereo_node __name:=rgb_stereo_publisher __log:=/home/put/.ros/log/6bceee72-e004-11ec-a249-48b02d2f6e98/rgb_stereo_publisher-2.log].
log file: /home/put/.ros/log/6bceee72-e004-11ec-a249-48b02d2f6e98/rgb_stereo_publisher-2*.log
Initiating shutdown!
================================================================================
[rgb_stereo_publisher-2] killing on exit
[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done

OAK-D Pro W Dev: ROS and ROS2 support, plus the same basic options in every example

Please add support for the OAK-D Pro W Dev to all ROS/ROS2 examples.
I received the camera three weeks ago, but so far I have only been able to stream the stereo topics without depth (the depth topic is published but carries no data).

A separate problem is that different examples expose different options, even basic ones that should be common to all of them, such as resolution and rectification.

I can see the camera advertised on the ROS Discourse, but in my opinion it cannot yet be considered supported.

Is there a timeline for adding this?

Publisher with blocking dataqueue

Hello,
The mobilenet_publisher.cpp example says that the addPublisherCallback call "works only when the dataqueue is non-blocking". I have checked the source code for the BridgePublisher, but my C++ is rusty, so I failed to work out where this limitation comes from.
Say I need to work with blocking data queues (to match passthrough frames with their original frames): could you elaborate on what the issue would be?
Thank you in advance!
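A minimal sketch (plain C++, not the actual dai::DataOutputQueue implementation) of why the callback path wants a non-blocking queue: with a non-blocking queue the producer drops the oldest frame when the consumer falls behind, whereas a blocking queue makes the device-side producer wait, so a slow publisher callback could back-pressure the whole pipeline. The class below is hypothetical:

```cpp
#include <cstddef>
#include <deque>

// Hypothetical bounded frame queue: a non-blocking push never waits, it
// discards the oldest entry when full. A blocking push would instead stall
// the producer until the consumer catches up, which is unsafe when the
// consumer is an arbitrary user callback.
template <typename T>
class BoundedQueue {
public:
    explicit BoundedQueue(std::size_t maxSize) : maxSize_(maxSize) {}

    // Non-blocking push: drop the oldest frame rather than wait.
    void pushNonBlocking(const T& item) {
        if (queue_.size() >= maxSize_) queue_.pop_front();  // drop, don't block
        queue_.push_back(item);
    }

    std::size_t size() const { return queue_.size(); }
    const T& front() const { return queue_.front(); }

private:
    std::deque<T> queue_;
    std::size_t maxSize_;
};
```

For matching passthrough frames with their originals, the usual pattern is to buffer both streams host-side and pair them by sequence number rather than relying on blocking queues.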

OAK-D POE - Setting ColorCameraProperties Sensor Resolution to 12MP

Hi,

I have an OAK-D POE camera and would like to test the color image output at 12 MP resolution.

This is what I have done:

In stereo_inertial_publisher.cpp, I have set

camRgb->setResolution(dai::ColorCameraProperties::SensorResolution::THE_12_MP)

However, I got the following error:

OpenCV Error: Assertion failed (sz.width % 2 == 0 && sz.height % 3 == 0 && depth == CV_8U) in cvtColor, file /build/opencv-L2vuMj/opencv-3.2.0+dfsg/modules/imgproc/src/color.cpp, line 9878
[2022-06-20 13:49:33.505] [error] Callback with id: 1 throwed an exception: /build/opencv-L2vuMj/opencv-3.2.0+dfsg/modules/imgproc/src/color.cpp:9878: error: (-215) sz.width % 2 == 0 && sz.height % 3 == 0 && depth == CV_8U in function cvtColor

THE_4_K works fine. May I know what the problem is?

Thanks.
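For reference, the failed assertion comes from how OpenCV wraps an NV12 buffer for cvtColor: the input Mat is width x (3/2 * image height), and the checked conditions are exactly the ones in the error text. A small sketch of that constraint (the function name is mine); note that the nominal 12 MP size of 4056x3040 actually satisfies it, which hints the mismatch is in the buffer actually handed to the converter (size or stride) rather than in the nominal resolution:

```cpp
// Sketch of the size constraint OpenCV enforces for NV12/YUV420sp input to
// cvtColor: the Mat wrapping the raw buffer has height = 3/2 * image height
// (Y plane plus an interleaved UV plane of half height), so the buffer's
// width must be even and its total height divisible by 3. This mirrors the
// assertion "sz.width % 2 == 0 && sz.height % 3 == 0" in the error above.
bool nv12BufferConvertible(int width, int imageHeight) {
    const int bufferHeight = imageHeight * 3 / 2;
    return (width % 2 == 0) && (bufferHeight % 3 == 0);
}
```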

Weird disparity values being published in stereo_node

Hello,
I am trying to retrieve a disparity image from the stereo_node publisher with mode != depth, and the values observed, whether from a subscriber to .../stereo/disparity or with a rostopic echo .../stereo/disparity, seem off:
most are 0, with some integer values scattered everywhere. I checked that the disparity image obtained directly from the device with a getFrame call does not produce these values, but "proper values" ranging between 1 and 255.

Also, I noticed with rostopic echo that the delta_d attribute of the published DisparityImage is 0.0, whereas the source code for the DisparityConverter seems to produce values that are either 1 or 1/32.

Could these two observations be related?
Thanks in advance
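For context, a sketch of the relationship the converter implements, with illustrative focal length and baseline values (not taken from any calibration file): depth = f * B / d, and delta_d is the smallest representable disparity step, 1 px in standard mode and 1/32 px when subpixel mode packs 5 fractional bits. A published delta_d of 0.0 would tell consumers the image has no quantisation at all, so the two observations plausibly share one root cause in the converter:

```cpp
// depth (m) from disparity (px), pinhole stereo model: Z = f * B / d.
// A disparity of 0 means "no match", so we return 0 as an invalid depth.
float disparityToDepth(float focalPx, float baselineM, float disparityPx) {
    if (disparityPx <= 0.0f) return 0.0f;  // invalid / unmatched pixel
    return focalPx * baselineM / disparityPx;
}

// Expected delta_d of stereo_msgs/DisparityImage: 1 px normally,
// 1/32 px with subpixel (5 fractional bits); never 0.0.
float disparityStep(bool subpixel) {
    return subpixel ? 1.0f / 32.0f : 1.0f;
}
```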

/depthai-ros-examples/depthai_examples/ros1_src/stereo_nodelet.cpp bug!

Hi!
Could you update the following code?

/depthai-ros-examples/depthai_examples/ros1_src/stereo_nodelet.cpp
...
line 148
std::tuple<dai::Pipeline, int, int> createPipeline(bool withDepth, bool lrcheck, bool extended, bool subpixel, int confidence, int LRchecktresh, std::string resolution){
dai::Pipeline pipeline;
...
// MonoCamera
//monoLeft->setResolution(dai::MonoCameraProperties::SensorResolution::THE_720_P);
monoLeft->setResolution(monoResolution); // here
monoLeft->setBoardSocket(dai::CameraBoardSocket::LEFT);
//monoRight->setResolution(dai::MonoCameraProperties::SensorResolution::THE_720_P);
monoRight->setResolution(monoResolution); // here
monoRight->setBoardSocket(dai::CameraBoardSocket::RIGHT);
...
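The function receives the resolution as a string (the `resolution` parameter) while the body applies `monoResolution`, so the string presumably needs parsing into the sensor enum (plus its dimensions) first. A hypothetical sketch of that mapping; the enum below only mirrors the real dai::MonoCameraProperties::SensorResolution names and is not the actual type:

```cpp
#include <stdexcept>
#include <string>
#include <tuple>

// Illustrative stand-in for dai::MonoCameraProperties::SensorResolution.
enum class MonoRes { THE_400_P, THE_480_P, THE_720_P, THE_800_P };

// Map the launch-file resolution string to {enum, width, height} so the
// nodelet can both call setResolution and report the image dimensions.
std::tuple<MonoRes, int, int> parseMonoResolution(const std::string& res) {
    if (res == "400p") return {MonoRes::THE_400_P, 640, 400};
    if (res == "480p") return {MonoRes::THE_480_P, 640, 480};
    if (res == "720p") return {MonoRes::THE_720_P, 1280, 720};
    if (res == "800p") return {MonoRes::THE_800_P, 1280, 800};
    throw std::invalid_argument("unsupported monoResolution: " + res);
}
```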

[yolov4_publisher.launch] Depth image size and camera info don't match

Thanks for the great product.

When I run yolov4_publisher.launch and try to see DepthCloud in rviz, I get the following error.

Message
Error updating depth cloud: Depth image size and camera info don't match: 640 x 400 vs 1280 x 720(binning: 1 x 1, ROI size: 1280 x 720)


Hardware: OAK-D

If you change THE_400_P to THE_720_P in the following lines, DepthCloud is displayed without problems.

monoLeft->setResolution(dai::MonoCameraProperties::SensorResolution::THE_400_P);
monoLeft->setBoardSocket(dai::CameraBoardSocket::LEFT);
monoRight->setResolution(dai::MonoCameraProperties::SensorResolution::THE_400_P);


May I send a PR?
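The check rviz's DepthCloud display performs, and the alternative to switching the sensor to THE_720_P (scaling the advertised intrinsics down to the depth resolution instead), can be sketched as follows. The field names and numbers here are illustrative, not taken from the actual calibration file:

```cpp
// rviz requires the depth image dimensions to equal the CameraInfo
// dimensions; 640x400 depth against a 1280x720 calibration reproduces
// the error quoted above.
bool depthMatchesCameraInfo(int depthW, int depthH, int infoW, int infoH) {
    return depthW == infoW && depthH == infoH;
}

struct Intrinsics { double fx, fy, cx, cy; int width, height; };

// If 400p depth is preferred, the pinhole intrinsics can be rescaled to
// the depth resolution instead of changing the sensor resolution.
Intrinsics scaleIntrinsics(const Intrinsics& in, int newW, int newH) {
    const double sx = static_cast<double>(newW) / in.width;
    const double sy = static_cast<double>(newH) / in.height;
    return {in.fx * sx, in.fy * sy, in.cx * sx, in.cy * sy, newW, newH};
}
```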

Noisy Pointcloud after launching the stereo_node.launch

Hi,

I got a new OAK-D two days ago and have been tinkering with it. I noticed that the camera gives a very noisy pointcloud, and I couldn't find a solution online. I've uploaded a bag file in case you want to take a look at the pointcloud and the images.

Kindly look into this.

Thanks,
Harshal

SpatialLocationCalculator distortion with setDepthAlign()

Hi team!
I want to visualize detections with Markers on a Depth Cloud using the depthai_examples yolov4_publisher, so I configured the stereo node with setDepthAlign(dai::CameraBoardSocket::RGB).

Although we can visualize the depth cloud, this setting appears to distort the spatial location calculation. Specifically, most estimates are near (x=-2, y=0, z=4) or undefined (0, 0, 0) [see the control video vs. the variant video attached].

Investigating further, I tried the dai::StereoDepthConfig::AlgorithmControl::DepthAlign::CENTER enum found here, which appears to restore the location calculation. However, the resolutions of the image frame and the depth frame do not align with this enumerator, so we get no depth cloud. Thank you for your attention.

discord discussion

yolov4_control.mp4
yolov4_variant.mp4

OAK-D Factory sensor misalignment affects depth

My new OAK-D arrived with the sensors misaligned. On Discord I was told this is normal.
After testing two different branches, with and without a mesh in the calibration, depth is affected in both. Even when I aim the camera at a scene whose nearest surface is 3 m away, the misaligned regions report distances of 0 or close to 0, and at best still much closer than 3 m; they appear red. Because these regions carry a (wrong) value rather than no depth at all (no depth would be much better), they will prevent my drone or rovers from being armed, leave them stuck, or cause undetermined behaviour, since they will detect a phantom obstacle in front of them and at best keep replanning continuously.

The camera simply cannot be used this way. Having shown this video, I would like Luxonis to confirm whether it is normal. Perhaps the ROS developers understand better what issue this generates and how it will affect a robot.

https://youtu.be/HyfL8lXYvUo

OAK-D Pro W POE - Launch

Hi,

May I know which launch file to use for OAK-D Pro W POE?

I tried launching with "stereo_inertial_node.launch", but I got the following errors:

terminate called after throwing an instance of 'std::runtime_error'
what(): Failed to find device after booting, error message: X_LINK_DEVICE_NOT_FOUND

Do I need to change any param?

FYI, I have tried to run the examples in depthai-python. The examples are working.

Why not merge this repository into depthai-ros?

It might be good to merge this repository into the depthai-ros repository: the repositories are related, form a cohesive set of ROS packages, the README instructions currently refer across repositories, and you need both anyway. Users who don't want the examples can use colcon to exclude the depthai_ros_examples package.

The folder structure in the depthai-ros repository could then become:

  • README.md with instructions on usage
  • depthai_ros_msgs as is
  • depthai_bridge as is
  • depthai_examples merged from depthai-ros-examples repo

This would make the ROS packages a bit easier to use.

Different fps for video and preview output

Is it possible to modify the code so that the frame rates for the video and preview outputs differ? I'm using the camera on a resource-constrained platform and want to lower the computational burden. As an example, I would like 30 fps for the preview output and 2 fps for the video output.

Alternatively, could there be a ROS service to capture still images at the full sensor resolution?
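Until per-output frame rates are supported in the example, one cheap host-side workaround is to decimate frames on arrival: the sensor still runs at one rate, but only every Nth video frame is forwarded to the publisher. A sketch (keeping 1 frame in 15 approximates 2 fps from a 30 fps stream; the class is illustrative, not part of the depthai API):

```cpp
// Host-side frame decimator: accept() returns true for every Nth call,
// so the publisher only forwards a fraction of the incoming frames.
class FrameDecimator {
public:
    explicit FrameDecimator(int keepEvery) : keepEvery_(keepEvery) {}

    // True when this frame should be forwarded/published.
    bool accept() { return (count_++ % keepEvery_) == 0; }

private:
    int keepEvery_;
    int count_ = 0;
};
```

This saves ROS serialization and network bandwidth, though not the device-side encoding cost; lowering the sensor fps itself remains the cheaper option when one rate suffices.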

IMU errors and fluctuating readings when there is no movement

Hi, I can see unusual fluctuations in the IMU readings while the camera is at rest. It happens with two different OAK-Ds on two different branches, main and devel. This makes the IMU unusable, not only as a main IMU but even for fusing with other sensors.
I also noticed that there is apparently no magnetometer data and no quaternion output in any of the three modes since the linearization update (I hadn't checked the IMU much before).
I'm not sure what the advantage of losing quaternions/AHRS and magnetometer output is compared with the three new modes, especially given the observed behaviour, which someone on Discord told me is normal, even with errors such as "gyro data not found" or "acc data not found".
I'm sure it is not, and before making permanent changes or giving quick generic answers, the changes and documented issues should be checked properly, so that users don't become beta testers.
Other issues derive from these decisions: even to obtain these unreliable readings, I'm forced to run an extra fusion node on a CPU/OS that is not real-time and adds latency, something the IMU itself does much better (with many filters in its built-in Cortex-M0 MCU) and at no CPU cost.

Could Luxonis please check and test the IMU behaviour again and give an official position on this? If this really is normal IMU behaviour, and the best we have after a year, why would I buy another OAK-D with an IMU? Why an S2 instead of an OAK Lite?
Worse, future customers will start asking me questions about the IMU if I include an OAK-D in my products. I'm sure they won't accept "it is normal" as an answer, and the same goes for other unresolved issues I'm having.
Sorry for being a PITA, I know, I just do my thing.

https://youtube.com/shorts/YkSzmqhkz1Y?feature=share

https://youtu.be/jjtVldqyEhU

IMG_20220516_141215
20220516_142250

fatal error: No such file or directory (C++ header files cannot be found)

I'm getting a "No such file or directory" error when I try to build my catkin_ws; the C++ header files cannot be found.

Errors << depthai_examples:make /home/ubuntu/catkin_ws/logs/depthai_examples/build.make.002.log
/home/ubuntu/catkin_ws/src/depthai-ros-examples/depthai_examples/lib/rgb_pipeline.cpp:1:10: fatal error: depthai_examples/rgb_pipeline.hpp: No such file or directory
 #include <depthai_examples/rgb_pipeline.hpp>
          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
make[2]: *** [CMakeFiles/depthai_examples.dir/lib/rgb_pipeline.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
/home/ubuntu/catkin_ws/src/depthai-ros-examples/depthai_examples/lib/rgb_stereo_pipeline.cpp:1:10: fatal error: depthai_examples/rgb_stereo_pipeline.hpp: No such file or directory
 #include <depthai_examples/rgb_stereo_pipeline.hpp>
          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
/home/ubuntu/catkin_ws/src/depthai-ros-examples/depthai_examples/lib/stereo_from_host.cpp:1:10: fatal error: depthai_examples/stereo_from_host.hpp: No such file or directory
 #include <depthai_examples/stereo_from_host.hpp>
          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
make[2]: *** [CMakeFiles/depthai_examples.dir/lib/rgb_stereo_pipeline.cpp.o] Error 1
make[2]: *** [CMakeFiles/depthai_examples.dir/lib/stereo_from_host.cpp.o] Error 1
make[1]: *** [CMakeFiles/depthai_examples.dir/all] Error 2
make: *** [all] Error 2

I tried sourcing /opt/ros/melodic/setup.bash but that didn't solve it.

mobilenet_publisher Acquisition of distance coordinates

Hi,
Thank you for the wonderful repository.
Will acquisition of the distance coordinates of mobilenet_publisher detections be implemented in the future?

roslaunch depthai_examples mobile_publisher.launch 
rostopic echo /mobilenet_publisher/color/mobilenet_detections
        id: 11
        score: 0.7041015625
        pose: 
          pose: 
            position: 
              x: 0.0
              y: 0.0
              z: 0.0
            orientation: 
              x: 0.0
              y: 0.0
              z: 0.0
              w: 0.0
          covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    bbox: 
      center: 
        x: 38.0
        y: 137.0
        theta: 0.0
      size_x: 76.0
      size_y: 262.0
    source_img: 
      header: 
        seq: 0
        stamp: 
          secs: 0
          nsecs:         0
        frame_id: ''
      height: 0
      width: 0
      encoding: ''
      is_bigendian: 0
      step: 0
      data: []
---
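The zero poses above are expected with the current example: mobile_publisher runs the plain 2D MobileNetDetectionNetwork, while spatial coordinates come from the spatial detection variants (as in yolov4_publisher). If an aligned depth value at the bbox center is available, the 3D point can also be recovered on the host with the pinhole model; fx, fy, cx, cy below are assumed intrinsics, not values read from the camera:

```cpp
struct Point3 { double x, y, z; };

// Back-project a pixel (u, v) with known metric depth through an assumed
// pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, Z = depth.
Point3 backProject(double u, double v, double depthM,
                   double fx, double fy, double cx, double cy) {
    return { (u - cx) * depthM / fx,
             (v - cy) * depthM / fy,
             depthM };
}
```

For example, a detection centered exactly on the principal point maps to a point straight ahead of the camera at the measured depth.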

Getting center coordinates as zero (0) in detections

  • Launch file: DEPTHAI_LEVEL=debug roslaunch depthai_examples yolov4_publisher.launch camera_model:=OAK-D-POE
  • The same happens with stereo_inertial_node: roslaunch depthai_examples stereo_inertial_node.launch
      center: 
        x: 0.0
        y: 0.0
        theta: 0.0
 DEPTHAI_LEVEL=debug roslaunch depthai_examples yolov4_publisher.launch camera_model:=OAK-D-POE
... logging to /home/nvidia/.ros/log/d0391bce-e006-11ec-ae8a-48b02d2f7086/roslaunch-forOak-11014.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://192.168.1.66:41251/

SUMMARY
========

PARAMETERS
 * /robot_description: <?xml version="1....
 * /rosdistro: melodic
 * /rosversion: 1.14.13
 * /yolov4_publisher/LRchecktresh: 5
 * /yolov4_publisher/camera_param_uri: package://depthai...
 * /yolov4_publisher/confidence: 200
 * /yolov4_publisher/monoResolution: 400p
 * /yolov4_publisher/nnName: x
 * /yolov4_publisher/resourceBaseFolder: /home/nvidia/catk...
 * /yolov4_publisher/subpixel: True
 * /yolov4_publisher/sync_nn: True
 * /yolov4_publisher/tf_prefix: oak

NODES
  /
    oak_state_publisher (robot_state_publisher/robot_state_publisher)
    yolov4_publisher (depthai_examples/yolov4_spatial_node)

auto-starting new master
process[master]: started with pid [11027]
ROS_MASTER_URI=http://192.168.1.66:11311/

setting /run_id to d0391bce-e006-11ec-ae8a-48b02d2f7086
process[rosout-1]: started with pid [11041]
started core service [/rosout]
process[oak_state_publisher-2]: started with pid [11054]
process[yolov4_publisher-3]: started with pid [11059]
[2022-05-30 16:24:03.924] [debug] Library information - version: 2.15.5, commit: cedff3f6419e07dca5fd9c5a087432f665613db8 from 2022-05-28 22:40:44 +0200, build: 2022-05-30 09:43:50 +0000
[2022-05-30 16:24:03.926] [debug] Initialize - finished
[2022-05-30 16:24:04.101] [debug] Resources - Archive 'depthai-bootloader-fwp-0.0.18+c555ac2fb184b801291c95f7f73d23bf4dd42cf1.tar.xz' open: 5ms, archive read: 170ms
[2022-05-30 16:24:04.764] [debug] Resources - Archive 'depthai-device-fwp-12ab42f2923f30cedbf129fbb765ae7ca973bdba.tar.xz' open: 4ms, archive read: 834ms
[2022-05-30 16:24:05.236] [debug] Device - OpenVINO version: 2021.4
[2022-05-30 16:24:05.244] [debug] Device - BoardConfig: {"gpio":[],"network":{"mtu":0,"xlinkTcpNoDelay":true},"pcieInternalClock":null,"sysctl":[],"uart":[],"usb":{"flashBootedPid":63037,"flashBootedVid":999,"maxSpeed":4,"pid":63035,"vid":999},"usb3PhyInternalClock":null,"watchdogInitialDelayMs":null,"watchdogTimeoutMs":null} 
libnop:
0000: b9 09 b9 05 81 e7 03 81 3b f6 81 e7 03 81 3d f6 04 b9 02 00 01 ba 00 be be bb 00 bb 00 be be
[2022-05-30 16:24:06.263] [debug] Booting FW with Bootloader. Version 0.0.16, Time taken: 845ms
[2022-05-30 16:24:06.263] [debug] DeviceBootloader about to be closed...
[2022-05-30 16:24:06.266] [debug] XLinkResetRemote of linkId: (0)
[2022-05-30 16:24:06.813] [debug] DeviceBootloader closed, 550
[10.42.0.37] [121.825] [system] [info] Memory Usage - DDR: 0.12 / 341.19 MiB, CMX: 2.05 / 2.50 MiB, LeonOS Heap: 24.83 / 78.13 MiB, LeonRT Heap: 2.88 / 41.43 MiB
[10.42.0.37] [121.825] [system] [info] Temperatures - Average: 46.45 °C, CSS: 48.03 °C, MSS 46.00 °C, UPA: 45.77 °C, DSS: 46.00 °C
[10.42.0.37] [121.825] [system] [info] Cpu Usage - LeonOS 6.25%, LeonRT: 0.59%
[2022-05-30 16:24:10.985] [debug] Schema dump: {"connections":[{"node1Id":1,"node1Output":"passthroughDepth","node1OutputGroup":"","node2Id":6,"node2Input":"in","node2InputGroup":""},{"node1Id":1,"node1Output":"out","node1OutputGroup":"","node2Id":7,"node2Input":"in","node2InputGroup":""},{"node1Id":1,"node1Output":"passthrough","node1OutputGroup":"","node2Id":5,"node2Input":"in","node2InputGroup":""},{"node1Id":3,"node1Output":"out","node1OutputGroup":"","node2Id":4,"node2Input":"right","node2InputGroup":""},{"node1Id":2,"node1Output":"out","node1OutputGroup":"","node2Id":4,"node2Input":"left","node2InputGroup":""},{"node1Id":4,"node1Output":"depth","node1OutputGroup":"","node2Id":1,"node2Input":"inputDepth","node2InputGroup":""},{"node1Id":0,"node1Output":"preview","node1OutputGroup":"","node2Id":1,"node2Input":"in","node2InputGroup":""}],"globalProperties":{"calibData":null,"cameraTuningBlobSize":null,"cameraTuningBlobUri":"","leonCssFrequencyHz":700000000.0,"leonMssFrequencyHz":700000000.0,"pipelineName":null,"pipelineVersion":null,"xlinkChunkSize":-1},"nodes":[[5,{"id":5,"ioInfo":[[["","in"],{"blocking":true,"group":"","name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,7,112,114,101,118,105,101,119,0]}],[4,{"id":4,"ioInfo":[[["","rectifiedRight"],{"blocking":false,"group":"","name":"rectifiedRight","queueSize":8,"type":0,"waitForMessage":false}],[["","rectifiedLeft"],{"blocking":false,"group":"","name":"rectifiedLeft","queueSize":8,"type":0,"waitForMessage":false}],[["","syncedLeft"],{"blocking":false,"group":"","name":"syncedLeft","queueSize":8,"type":0,"waitForMessage":false}],[["","syncedRight"],{"blocking":false,"group":"","name":"syncedRight","queueSize":8,"type":0,"waitForMessage":false}],[["","depth"],{"blocking":false,"group":"","name":"depth","queueSize":8,"type":0,"waitForMessage":false}],[["","outConfig"],{"blocking":false,"group":"","name":"outConfig","queueSize":8,"type":0,"waitForM
essage":false}],[["","debugDispLrCheckIt1"],{"blocking":false,"group":"","name":"debugDispLrCheckIt1","queueSize":8,"type":0,"waitForMessage":false}],[["","debugExtDispLrCheckIt1"],{"blocking":false,"group":"","name":"debugExtDispLrCheckIt1","queueSize":8,"type":0,"waitForMessage":false}],[["","debugDispLrCheckIt2"],{"blocking":false,"group":"","name":"debugDispLrCheckIt2","queueSize":8,"type":0,"waitForMessage":false}],[["","debugExtDispLrCheckIt2"],{"blocking":false,"group":"","name":"debugExtDispLrCheckIt2","queueSize":8,"type":0,"waitForMessage":false}],[["","debugDispCostDump"],{"blocking":false,"group":"","name":"debugDispCostDump","queueSize":8,"type":0,"waitForMessage":false}],[["","confidenceMap"],{"blocking":false,"group":"","name":"confidenceMap","queueSize":8,"type":0,"waitForMessage":false}],[["","left"],{"blocking":false,"group":"","name":"left","queueSize":8,"type":3,"waitForMessage":true}],[["","right"],{"blocking":false,"group":"","name":"right","queueSize":8,"type":3,"waitForMessage":true}],[["","disparity"],{"blocking":false,"group":"","name":"disparity","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":false,"group":"","name":"inputConfig","queueSize":4,"type":3,"waitForMessage":false}]],"name":"StereoDepth","properties":[185,16,185,5,185,8,0,2,136,0,0,122,68,1,0,1,5,3,185,7,5,0,185,5,0,2,136,0,0,0,63,0,1,185,4,0,3,136,205,204,204,62,0,185,2,0,134,255,255,0,0,185,2,0,50,185,2,1,0,185,4,255,0,1,0,185,5,1,0,0,128,200,185,3,0,2,127,185,5,1,128,250,129,244,1,128,250,129,244,1,0,1,0,190,190,190,190,1,185,5,189,0,189,0,190,16,16,0,3,255,255,1,190]}],[6,{"id":6,"ioInfo":[[["","in"],{"blocking":true,"group":"","name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,5,100,101,112,116,104,0]}],[7,{"id":7,"ioInfo":[[["","in"],{"blocking":true,"group":"","name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,1
89,10,100,101,116,101,99,116,105,111,110,115,0]}],[0,{"id":0,"ioInfo":[[["","video"],{"blocking":false,"group":"","name":"video","queueSize":8,"type":0,"waitForMessage":false}],[["","preview"],{"blocking":false,"group":"","name":"preview","queueSize":8,"type":0,"waitForMessage":false}],[["","raw"],{"blocking":false,"group":"","name":"raw","queueSize":8,"type":0,"waitForMessage":false}],[["","still"],{"blocking":false,"group":"","name":"still","queueSize":8,"type":0,"waitForMessage":false}],[["","isp"],{"blocking":false,"group":"","name":"isp","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":false,"group":"","name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputControl"],{"blocking":true,"group":"","name":"inputControl","queueSize":8,"type":3,"waitForMessage":false}]],"name":"ColorCamera","properties":[185,18,185,20,0,3,0,185,3,0,0,0,185,5,0,0,0,0,0,185,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,255,255,0,0,0,129,160,1,129,160,1,255,255,255,255,0,136,0,0,240,65,136,0,0,128,191,136,0,0,128,191,1,185,4,0,0,0,0]}],[1,{"id":1,"ioInfo":[[["","passthrough"],{"blocking":false,"group":"","name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","boundingBoxMapping"],{"blocking":false,"group":"","name":"boundingBoxMapping","queueSize":8,"type":0,"waitForMessage":false}],[["","passthroughDepth"],{"blocking":false,"group":"","name":"passthroughDepth","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":false,"group":"","name":"in","queueSize":5,"type":3,"waitForMessage":true}],[["","inputDepth"],{"blocking":false,"group":"","name":"inputDepth","queueSize":4,"type":3,"waitForMessage":true}]],"name":"SpatialDetectionNetwork","properties":[185,15,0,130,128,188,185,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0,136,0,0,0,63,80,4,186,12,136,0,0,32,65,136,0,0,96,65,136,0,0,184,65,136,0,0
,216,65,136,0,0,20,66,136,0,0,104,66,136,0,0,162,66,136,0,0,164,66,136,0,0,7,67,136,0,0,41,67,136,0,0,172,67,136,0,128,159,67,187,2,189,6,115,105,100,101,49,51,188,12,3,0,0,0,4,0,0,0,5,0,0,0,189,6,115,105,100,101,50,54,188,12,1,0,0,0,2,0,0,0,3,0,0,0,136,0,0,0,63,136,0,0,0,63,185,2,100,129,136,19,0]}],[2,{"id":2,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","raw"],{"blocking":false,"group":"","name":"raw","queueSize":8,"type":0,"waitForMessage":false}],[["","inputControl"],{"blocking":true,"group":"","name":"inputControl","queueSize":8,"type":3,"waitForMessage":false}]],"name":"MonoCamera","properties":[185,5,185,20,0,3,0,185,3,0,0,0,185,5,0,0,0,0,0,185,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,255,2,136,0,0,240,65]}],[3,{"id":3,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","raw"],{"blocking":false,"group":"","name":"raw","queueSize":8,"type":0,"waitForMessage":false}],[["","inputControl"],{"blocking":true,"group":"","name":"inputControl","queueSize":8,"type":3,"waitForMessage":false}]],"name":"MonoCamera","properties":[185,5,185,20,0,3,0,185,3,0,0,0,185,5,0,0,0,0,0,185,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,255,2,136,0,0,240,65]}]]}
[2022-05-30 16:24:10.986] [debug] Asset map dump: {"map":{"/node/1/__blob":{"alignment":64,"offset":0,"size":12172416}}}
[10.42.0.37] [122.459] [MonoCamera(3)] [info] Using board socket: 2, id: 2
[10.42.0.37] [122.459] [MonoCamera(2)] [info] Using board socket: 1, id: 1
[10.42.0.37] [122.462] [system] [info] ImageManip internal buffer size '80640'B, shave buffer size '34816'B
[10.42.0.37] [122.463] [system] [info] SpatialLocationCalculator shave buffer size '29696'B
[10.42.0.37] [122.463] [system] [info] SIPP (Signal Image Processing Pipeline) internal buffer size '16384'B
[10.42.0.37] [122.498] [system] [info] NeuralNetwork allocated resources: shaves: [0-9] cmx slices: [0-9] 
ColorCamera allocated resources: no shaves; cmx slices: [13-15] 
MonoCamera allocated resources: no shaves; cmx slices: [13-15] 
StereoDepth allocated resources: shaves: [12-12] cmx slices: [10-12] 
ImageManip allocated resources: shaves: [15-15] no cmx slices. 
SpatialLocationCalculator allocated resources: shaves: [14-14] no cmx slices. 

[10.42.0.37] [122.512] [SpatialDetectionNetwork(1)] [info] Needed resources: shaves: 6, ddr: 5191680 
[10.42.0.37] [122.512] [SpatialDetectionNetwork(1)] [warning] Network compiled for 6 shaves, maximum available 10, compiling for 5 shaves likely will yield in better performance
[10.42.0.37] [122.516] [StereoDepth(4)] [info] Using focal length from calibration intrinsics '418.65836'
[10.42.0.37] [122.538] [SpatialDetectionNetwork(1)] [info] Inference thread count: 1, number of shaves allocated per thread: 6, number of Neural Compute Engines (NCE) allocated per thread: 2
[10.42.0.37] [122.827] [system] [info] Memory Usage - DDR: 70.75 / 341.19 MiB, CMX: 2.25 / 2.50 MiB, LeonOS Heap: 64.38 / 78.13 MiB, LeonRT Heap: 5.29 / 41.43 MiB
[10.42.0.37] [122.827] [system] [info] Temperatures - Average: 48.47 °C, CSS: 49.14 °C, MSS 47.58 °C, UPA: 48.25 °C, DSS: 48.92 °C
[10.42.0.37] [122.827] [system] [info] Cpu Usage - LeonOS 78.55%, LeonRT: 15.82%
[10.42.0.37] [123.828] [system] [info] Memory Usage - DDR: 70.75 / 341.19 MiB, CMX: 2.25 / 2.50 MiB, LeonOS Heap: 64.38 / 78.13 MiB, LeonRT Heap: 5.29 / 41.43 MiB
[10.42.0.37] [123.828] [system] [info] Temperatures - Average: 49.92 °C, CSS: 51.14 °C, MSS 49.37 °C, UPA: 49.37 °C, DSS: 49.81 °C
[10.42.0.37] [123.828] [system] [info] Cpu Usage - LeonOS 99.77%, LeonRT: 21.95%
[10.42.0.37] [124.829] [system] [info] Memory Usage - DDR: 70.75 / 341.19 MiB, CMX: 2.25 / 2.50 MiB, LeonOS Heap: 64.38 / 78.13 MiB, LeonRT Heap: 5.29 / 41.43 MiB
[10.42.0.37] [124.829] [system] [info] Temperatures - Average: 50.92 °C, CSS: 51.58 °C, MSS 50.03 °C, UPA: 51.36 °C, DSS: 50.70 °C
[10.42.0.37] [124.829] [system] [info] Cpu Usage - LeonOS 99.98%, LeonRT: 22.69%
[10.42.0.37] [125.831] [system] [info] Memory Usage - DDR: 70.75 / 341.19 MiB, CMX: 2.25 / 2.50 MiB, LeonOS Heap: 64.38 / 78.13 MiB, LeonRT Heap: 5.29 / 41.43 MiB
[10.42.0.37] [125.831] [system] [info] Temperatures - Average: 51.47 °C, CSS: 52.24 °C, MSS 50.92 °C, UPA: 51.80 °C, DSS: 50.92 °C
[10.42.0.37] [125.832] [system] [info] Cpu Usage - LeonOS 99.88%, LeonRT: 22.50%
[10.42.0.37] [126.833] [system] [info] Memory Usage - DDR: 70.75 / 341.19 MiB, CMX: 2.25 / 2.50 MiB, LeonOS Heap: 64.38 / 78.13 MiB, LeonRT Heap: 5.29 / 41.43 MiB
[10.42.0.37] [126.833] [system] [info] Temperatures - Average: 52.24 °C, CSS: 53.12 °C, MSS 51.58 °C, UPA: 52.46 °C, DSS: 51.80 °C
[10.42.0.37] [126.833] [system] [info] Cpu Usage - LeonOS 99.72%, LeonRT: 24.92%
^C[yolov4_publisher-3] killing on exit
[oak_state_publisher-2] killing on exit
terminate called after throwing an instance of 'class_loader::LibraryUnloadException'
  what():  Attempt to unload library that class_loader is unaware of.
Stack trace (most recent call last):
#0    Object "/lib/aarch64-linux-gnu/libc.so.6", at 0x7f81cef598, in raise
Aborted (Signal sent by tkill() 11059 1000)
[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done
nvidia@forOak:~/dev/camera$ 

rostopic echo /yolov4_publisher/color/yolov4_Spatial_detections

Output:
header: 
  seq: 572
  stamp: 
    secs: 1653907480
    nsecs: 459586920
  frame_id: "oak_rgb_camera_optical_frame"
detections: 
  - 
    results: 
      - 
        id: 0
        score: 0.858389139175
    bbox: 
      center: 
        x: 0.0
        y: 0.0
        theta: 0.0
      size_x: 321.0
      size_y: 266.0
    position: 
      x: 0.0
      y: 0.11686437577
      z: 0.89058727026
    is_tracking: False
    tracking_id: ''

Build error on stereo_inertial_publisher.cpp

Pulled latest main branch and tried to build, got these errors:

stereo_inertial_publisher.cpp:185:15: error: ‘ImuSyncMethod’ is not a member of ‘dai::ros’
  185 |     dai::ros::ImuSyncMethod imuMode = static_cast<dai::ros::ImuSyncMethod>(imuModeParam);

stereo_inertial_publisher.cpp:216:72: error: ‘imuMode’ was not declared in this scope; did you mean ‘mode’?
  216 |     dai::rosBridge::ImuConverter imuConverter(tfPrefix + "_imu_frame", imuMode, linearAccelCovariance, angularVelCovariance);

Tried installing latest depthai-core build but had the same issue.

I want to subscribe to the topic "color/yolov4_Spatial_detections" and read the xyz coordinates from its "geometry_msgs/Point position" field.

I'm writing a program that subscribes to "color/yolov4_Spatial_detections", gets the xyz coordinate values from "geometry_msgs/Point position", and uses them with MoveIt for pick-and-place work.

For now, I'm creating a simple node that subscribes to xyz from "yolov4_Spatial_detections".

#include <ros/ros.h>
#include "geometry_msgs/Point.h"
#include <depthai_ros_msgs/SpatialDetection.h>

void clbk(const depthai_ros_msgs::SpatialDetection::ConstPtr& msg) {

    ROS_INFO("position: x=%.2f, y=%.2f", msg->position.x, msg->position.y);
}

int main(int argc, char **argv)
{
  // ROS objects
  ros::init(argc, argv, "tf_example");
  ros::NodeHandle n;
  ros::Subscriber sub = n.subscribe("/yolov4_publisher/color/yolov4_Spatial_detections", 1, clbk);

  ros::spin();

}

However, the following error occurs.

[ERROR] [1649045029.615433297]: Client [/tf_example] wants topic /yolov4_publisher/color/yolov4_Spatial_detections to have datatype/md5sum [depthai_ros_msgs/SpatialDetection/503c7980b555f0fd79e92d14cb9ac446], but our version has [depthai_ros_msgs/SpatialDetectionArray/f3caf8c6374b91f3a4b93f9610044116]. Dropping connection.

The situation was the same even after I deleted the devel and build folders and rebuilt the workspace.

Can you give me some advice on subscribing to this publisher?
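The error message is the key: the topic's actual type is `depthai_ros_msgs/SpatialDetectionArray`, while the node above subscribes with `depthai_ros_msgs/SpatialDetection`, so the md5sums differ and ROS drops the connection. A sketch of a corrected subscriber, assuming the array message carries its detections in a `detections` field as the `rostopic echo` output above suggests:

```cpp
#include <ros/ros.h>
#include <depthai_ros_msgs/SpatialDetectionArray.h>

// Subscribe with the array type the publisher actually uses,
// then iterate over each detection's spatial position.
void clbk(const depthai_ros_msgs::SpatialDetectionArray::ConstPtr& msg) {
    for (const auto& det : msg->detections) {
        ROS_INFO("position: x=%.2f, y=%.2f, z=%.2f",
                 det.position.x, det.position.y, det.position.z);
    }
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "tf_example");
    ros::NodeHandle n;
    ros::Subscriber sub =
        n.subscribe("/yolov4_publisher/color/yolov4_Spatial_detections", 1, clbk);
    ros::spin();
    return 0;
}
```

This is a sketch against the message layout visible in the echo output, not a tested drop-in; it builds only in a workspace that depends on `depthai_ros_msgs`.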

Environment

ubuntu 18.04
Ros Melodic

Thank you.

Galactic support

Hello,

We want to use these ROS packages but we noticed that there is no Galactic one. Is there a roadmap to add this support?

Thanks!

devel branch doesn't work

I rebuilt it following the depthai-ros instructions (Ubuntu 20.04, ROS Noetic from apt, arm64); the main branch works, but the devel branch sometimes crashes and sometimes just does nothing. I got it partially working by removing things, but 400p, IMU, detections, and markers don't work.
roslaunch depthai_examples stereo_inertial_node.launch
... logging to /home/ubuntu/.ros/log/22af78e4-d535-11ec-9bbf-db787525b9cb/roslaunch-ubuntu-15634.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://192.168.43.170:36775/

SUMMARY

PARAMETERS

  • /depth_image_to_rgb_pointcloud/queue_size: 10
  • /robot_description: <?xml version="1....
  • /rosdistro: noetic
  • /rosversion: 1.15.14
  • /stereo_inertial_publisher/LRchecktresh: 5
  • /stereo_inertial_publisher/angularVelCovariance: 0
  • /stereo_inertial_publisher/confidence: 200
  • /stereo_inertial_publisher/depth_aligned: True
  • /stereo_inertial_publisher/dotProjectormA: 200.0
  • /stereo_inertial_publisher/enableDotProjector: False
  • /stereo_inertial_publisher/enableFloodLight: False
  • /stereo_inertial_publisher/enableSpatialDetection: True
  • /stereo_inertial_publisher/extended: False
  • /stereo_inertial_publisher/floodLightmA: 200.0
  • /stereo_inertial_publisher/imuMode: 1
  • /stereo_inertial_publisher/linearAccelCovariance: 0
  • /stereo_inertial_publisher/lrcheck: True
  • /stereo_inertial_publisher/mode: depth
  • /stereo_inertial_publisher/monoResolution: 720p
  • /stereo_inertial_publisher/mxId:
  • /stereo_inertial_publisher/poeMode: False
  • /stereo_inertial_publisher/rectify: True
  • /stereo_inertial_publisher/stereo_fps: 15
  • /stereo_inertial_publisher/subpixel: True
  • /stereo_inertial_publisher/syncNN: True
  • /stereo_inertial_publisher/tf_prefix: oak
  • /stereo_inertial_publisher/usb2Mode: False

NODES
/
depth_image_convertion_nodelet (nodelet/nodelet)
depth_image_to_rgb_pointcloud (nodelet/nodelet)
nodelet_manager (nodelet/nodelet)
oak_state_publisher (robot_state_publisher/robot_state_publisher)
rviz (rviz/rviz)
stereo_inertial_publisher (depthai_examples/stereo_inertial_node)

ROS_MASTER_URI=http://192.168.43.170:11311/

process[oak_state_publisher-1]: started with pid [15674]
process[stereo_inertial_publisher-2]: started with pid [15675]
process[nodelet_manager-3]: started with pid [15676]
process[depth_image_convertion_nodelet-4]: started with pid [15681]
process[depth_image_to_rgb_pointcloud-5]: started with pid [15683]
process[rviz-6]: started with pid [15689]
[ INFO] [1652718489.992307036]: Initializing nodelet with 4 worker threads.
nn path.../home/ubuntu/dai_ws/src/luxonis/depthai-ros-examples/depthai_examples/resources/tiny-yolo-v4_openvino_2021.2_6shave.blob
nn path.../home/ubuntu/dai_ws/src/luxonis/depthai-ros-examples/depthai_examples/resources/tiny-yolo-v4_openvino_2021.2_6shave.blob
Listing available devices...
Device Mx ID: 14442C10F1B6C2D200
[14442C10F1B6C2D200] [880.336] [SpatialDetectionNetwork(8)] [warning] Network compiled for 6 shaves, maximum available 10, compiling for 5 shaves likely will yield in better performance
Device USB status: SUPER
[ WARN] [1652718499.589680484]: Accel data not found. Dropping data
^C[rviz-6] killing on exit
[depth_image_to_rgb_pointcloud-5] killing on exit
[nodelet_manager-3] killing on exit
[depth_image_convertion_nodelet-4] killing on exit
[oak_state_publisher-1] killing on exit
[stereo_inertial_publisher-2] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done
ubuntu@ubuntu:~$ roslaunch depthai_examples stereo_inertial_node.launch
... logging to /home/ubuntu/.ros/log/22af78e4-d535-11ec-9bbf-db787525b9cb/roslaunch-ubuntu-16041.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://192.168.43.170:36423/

SUMMARY

PARAMETERS

  • /depth_image_to_rgb_pointcloud/queue_size: 10
  • /robot_description: <?xml version="1....
  • /rosdistro: noetic
  • /rosversion: 1.15.14
  • /stereo_inertial_publisher/LRchecktresh: 5
  • /stereo_inertial_publisher/angularVelCovariance: 0
  • /stereo_inertial_publisher/confidence: 200
  • /stereo_inertial_publisher/depth_aligned: True
  • /stereo_inertial_publisher/dotProjectormA: 200.0
  • /stereo_inertial_publisher/enableDotProjector: False
  • /stereo_inertial_publisher/enableFloodLight: False
  • /stereo_inertial_publisher/enableSpatialDetection: True
  • /stereo_inertial_publisher/extended: False
  • /stereo_inertial_publisher/floodLightmA: 200.0
  • /stereo_inertial_publisher/imuMode: 1
  • /stereo_inertial_publisher/linearAccelCovariance: 0
  • /stereo_inertial_publisher/lrcheck: True
  • /stereo_inertial_publisher/mode: depth
  • /stereo_inertial_publisher/monoResolution: 720p
  • /stereo_inertial_publisher/mxId:
  • /stereo_inertial_publisher/poeMode: False
  • /stereo_inertial_publisher/rectify: True
  • /stereo_inertial_publisher/stereo_fps: 15
  • /stereo_inertial_publisher/subpixel: True
  • /stereo_inertial_publisher/syncNN: True
  • /stereo_inertial_publisher/tf_prefix: oak
  • /stereo_inertial_publisher/usb2Mode: False

NODES
/
depth_image_convertion_nodelet (nodelet/nodelet)
depth_image_to_rgb_pointcloud (nodelet/nodelet)
nodelet_manager (nodelet/nodelet)
oak_state_publisher (robot_state_publisher/robot_state_publisher)
rviz (rviz/rviz)
stereo_inertial_publisher (depthai_examples/stereo_inertial_node)

ROS_MASTER_URI=http://192.168.43.170:11311/

process[oak_state_publisher-1]: started with pid [16063]
process[stereo_inertial_publisher-2]: started with pid [16064]
process[nodelet_manager-3]: started with pid [16065]
process[depth_image_convertion_nodelet-4]: started with pid [16070]
process[depth_image_to_rgb_pointcloud-5]: started with pid [16072]
process[rviz-6]: started with pid [16078]
nn path.../home/ubuntu/dai_ws/src/luxonis/depthai-ros-examples/depthai_examples/resources/tiny-yolo-v4_openvino_2021.2_6shave.blob
nn path.../home/ubuntu/dai_ws/src/luxonis/depthai-ros-examples/depthai_examples/resources/tiny-yolo-v4_openvino_2021.2_6shave.blob
[ INFO] [1652718654.482431717]: Initializing nodelet with 4 worker threads.
Listing available devices...
Device Mx ID: 14442C10F1B6C2D200
terminate called without an active exception
Stack trace (most recent call last):
#21 Object "/home/ubuntu/dai_ws/devel/lib/depthai_examples/stereo_inertial_node", at 0xaaaab5731053, in $x
#20 Source "../csu/libc-start.c", line 308, in __libc_start_main [0xffffa61f1e0f]
#19 Source "/home/ubuntu/dai_ws/src/luxonis/depthai-ros-examples/depthai_examples/ros1_src/stereo_inertial_publisher.cpp", line 273, in main [0xaaaab57335df]
270: }
271: } else if(mxId.empty()) {
272: isDeviceFound = true;
> 273: device = std::make_shared<dai::Device>(pipeline);
274: }
275: }
#18 Source "/usr/include/c++/9/bits/shared_ptr.h", line 718, in make_shared<dai::Device, dai::Pipeline&> [0xaaaab573d44b]
715: {
716: typedef typename std::remove_cv<_Tp>::type _Tp_nc;
717: return std::allocate_shared<_Tp>(std::allocator<_Tp_nc>(),
> 718: std::forward<_Args>(__args)...);
719: }
720:
721: /// std::hash specialization for shared_ptr.
#17 Source "/usr/include/c++/9/bits/shared_ptr.h", line 702, in allocate_shared<dai::Device, std::allocator<dai::Device>, dai::Pipeline&> [0xaaaab57416eb]
699: allocate_shared(const _Alloc& __a, _Args&&... __args)
700: {
701: return shared_ptr<_Tp>(_Sp_alloc_shared_tag<_Alloc>{__a},
> 702: std::forward<_Args>(__args)...);
703: }
704:
705: /**
#16 Source "/usr/include/c++/9/bits/shared_ptr.h", line 359, in shared_ptr<std::allocator<dai::Device>, dai::Pipeline&> [0xaaaab574661b]
356: // This constructor is non-standard, it is used by allocate_shared.
357: template<typename _Alloc, typename... _Args>
358: shared_ptr(_Sp_alloc_shared_tag<_Alloc> __tag, _Args&&... __args)
> 359: : __shared_ptr<_Tp>(__tag, std::forward<_Args>(__args)...)
360: { }
361:
362: template<typename _Yp, typename _Alloc, typename... _Args>
#15 Source "/usr/include/c++/9/bits/shared_ptr_base.h", line 1344, in __shared_ptr<std::allocator<dai::Device>, dai::Pipeline&> [0xaaaab574bc9f]
1341: // This constructor is non-standard, it is used by allocate_shared.
1342: template<typename _Alloc, typename... _Args>
1343: __shared_ptr(_Sp_alloc_shared_tag<_Alloc> __tag, _Args&&... __args)
>1344: : _M_ptr(), _M_refcount(_M_ptr, __tag, std::forward<_Args>(__args)...)
1345: { _M_enable_shared_from_this_with(_M_ptr); }
1346:
1347: template<typename _Tp1, _Lock_policy _Lp1, typename _Alloc,
#14 Source "/usr/include/c++/9/bits/shared_ptr_base.h", line 679, in __shared_count<dai::Device, std::allocator<dai::Device>, dai::Pipeline&> [0xaaaab575313b]
676: typename _Sp_cp_type::__allocator_type __a2(__a._M_a);
677: auto __guard = std::__allocate_guarded(__a2);
678: _Sp_cp_type* __mem = __guard.get();
> 679: auto __pi = ::new (__mem)
680: _Sp_cp_type(__a._M_a, std::forward<_Args>(__args)...);
681: __guard = nullptr;
682: _M_pi = __pi;
#13 Source "/usr/include/c++/9/bits/shared_ptr_base.h", line 548, in _Sp_counted_ptr_inplace<dai::Pipeline&> [0xaaaab57577df]
545: {
546: // _GLIBCXX_RESOLVE_LIB_DEFECTS
547: // 2070. allocate_shared should use allocator_traits::construct
> 548: allocator_traits<_Alloc>::construct(__a, _M_ptr(),
549: std::forward<_Args>(__args)...); // might throw
550: }
#12 Source "/usr/include/c++/9/bits/alloc_traits.h", line 483, in construct<dai::Device, dai::Pipeline&> [0xaaaab575bc87]
480: static void
481: construct(allocator_type& __a, _Up* __p, _Args&&... __args)
482: noexcept(std::is_nothrow_constructible<_Up, _Args...>::value)
> 483: { __a.construct(__p, std::forward<_Args>(__args)...); }
484:
485: /**
486: * @brief Destroy an object of type @A _Up
#11 Source "/usr/include/c++/9/ext/new_allocator.h", line 146, in construct<dai::Device, dai::Pipeline&> [0xaaaab575f433]
143: void
144: construct(_Up* __p, _Args&&... __args)
145: noexcept(std::is_nothrow_constructible<_Up, _Args...>::value)
> 146: { ::new((void *)__p) _Up(std::forward<_Args>(__args)...); }
147:
148: template<typename _Up>
149: void
#10 Source "/home/ubuntu/dai_ws/src/luxonis/depthai-ros/depthai-core/src/device/Device.cpp", line 27, in Device [0xffffa6b7c54b]
#9 Source "/home/ubuntu/dai_ws/src/luxonis/depthai-ros/depthai-core/src/device/DeviceBase.cpp", line 381, in tryStartPipeline [0xffffa6b8eccb]
#8 Source "/home/ubuntu/dai_ws/src/luxonis/depthai-ros/depthai-core/src/device/DeviceBase.cpp", line 864, in startPipeline [0xffffa6b923bb]
#7 Source "/home/ubuntu/dai_ws/src/luxonis/depthai-ros/depthai-core/src/device/Device.cpp", line 317, in startPipelineImpl [0xffffa6b7e7ef]
#6 Source "/home/ubuntu/dai_ws/src/luxonis/depthai-ros/depthai-core/src/device/DeviceBase.cpp", line 907, in startPipelineImpl [0xffffa6b92cf7]
#5 Source "/usr/include/c++/9/thread", line 139, in ~thread [0xaaaab573a24b]
136: ~thread()
137: {
138: if (joinable())
> 139: std::terminate();
140: }
141:
142: thread(const thread&) = delete;
#4 Object "/usr/lib/aarch64-linux-gnu/libstdc++.so.6.0.28", at 0xffffa640727f, in std::terminate()
#3 Object "/usr/lib/aarch64-linux-gnu/libstdc++.so.6.0.28", at 0xffffa640721b, in
#2 Object "/usr/lib/aarch64-linux-gnu/libstdc++.so.6.0.28", at 0xffffa640983b, in __gnu_cxx::__verbose_terminate_handler()
#1 Source "/build/glibc-RIFKjK/glibc-2.31/stdlib/abort.c", line 79, in abort [0xffffa61f1aab]
#0 Source "../sysdeps/unix/sysv/linux/raise.c", line 51, in raise [0xffffa6204d78]
Aborted (Signal sent by tkill() 16064 1000)
================================================================================
REQUIRED process [stereo_inertial_publisher-2] has died!
process has died [pid 16064, exit code -6, cmd /home/ubuntu/dai_ws/devel/lib/depthai_examples/stereo_inertial_node __name:=stereo_inertial_publisher __log:=/home/ubuntu/.ros/log/22af78e4-d535-11ec-9bbf-db787525b9cb/stereo_inertial_publisher-2.log].
log file: /home/ubuntu/.ros/log/22af78e4-d535-11ec-9bbf-db787525b9cb/stereo_inertial_publisher-2
.log
Initiating shutdown!

[rviz-6] killing on exit
[depth_image_to_rgb_pointcloud-5] killing on exit
[depth_image_convertion_nodelet-4] killing on exit
[nodelet_manager-3] killing on exit
[stereo_inertial_publisher-2] killing on exit
[oak_state_publisher-1] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done

