carlosmccosta / dynamic_robot_localization
Point cloud registration pipeline for robot localization and 3D perception
License: BSD 3-Clause "New" or "Revised" License
Hi,
In order to use this package in a real production environment, I have a couple of questions:
P.S.: I know that I can comment out the yaml file for the initial pose, but when it loses the track, the TF is not published.
Hi,
I am using mmWave radars from TI (IWR6843AOP) and wonder which method from the DRL package would best suit localization.
Suppose that I have made a point cloud map of the environment (which is also sparse and noisy, since the map will be built from the same radars) and want to localize in it: would you suggest ICP or NDT?
Hello
I'm trying to use this package, but it's a little bit overwhelming!
Let's suppose I have a rosbag with point cloud and IMU data; what would be the first steps to generate a map and use it for navigation? Thank you
Dear Carlos,
Thank you for your work on this package.
I pulled the newest noetic branch, and I get this error:
In file included from /home/genius/ros_ws/src/dynamic_robot_localization/src/outlier_detectors/euclidean_outlier_detector.cpp:10:
/home/genius/ros_ws/src/dynamic_robot_localization/include/dynamic_robot_localization/outlier_detectors/impl/euclidean_outlier_detector.hpp: In member function ‘virtual size_t dynamic_robot_localization::EuclideanOutlierDetector<PointT>::detectOutliers(typename pcl::search::KdTree<PointT>::Ptr, const pcl::PointCloud<PointT>&, typename pcl::PointCloud<PointT>::Ptr&, typename pcl::PointCloud<PointT>::Ptr&, double&)’:
/home/genius/ros_ws/src/dynamic_robot_localization/include/dynamic_robot_localization/outlier_detectors/impl/euclidean_outlier_detector.hpp:158:11: error: ‘HSVtoRGB’ is not a member of ‘pcl’
  158 |      pcl::HSVtoRGB(hue, 1.0f, 1.0f, point.r, point.g, point.b);
      |           ^~~~~~~~
I installed with your install script.
I also noticed you pushed a new update related to HSV recently; do you know how this happens?
Thank you
Best regards
Hello @carlosmccosta, thank you for the open-source release.
I checked out some of the issues and the readme, but I am not very clear about the 3 DoF localization.
I had a few questions.
I have a bag file attached here; I changed the launch file to this and also the yaml file, then I ran the following commands:
1. roscore
2. roslaunch dynamic_robot_localization dynamic_robot_localization_system.launch
3. rosrun map_server map_server map_default.yaml
4. rosbag play --clock expt6.bag
5. rviz
I have attached the map yaml and pgm here. I am clueless as to what the output should be, in the sense that I have the ground-truth pose of the bot; which topic should I compare it to (when I run the above commands), so that I know how good the 3 DoF localization is?
What should be the expected rviz output?
Let me know if any more information is required.
Any help/suggestion is appreciated. Waiting for your early reply.
Thank you
Hello Carlos Mccosta :)
When I compile from source, some errors occurred:
Errors << dynamic_robot_localization:make /home/curry/catkin_ws_drl/logs_release/dynamic_robot_localization/build.make.000.log
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/cloud_viewer.cpp:10:0:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/cloud_viewer.hpp: In instantiation of ‘void dynamic_robot_localization::CloudViewer::setupConfigurationFromParameterServer(ros::NodeHandlePtr&, ros::NodeHandlePtr&, const string&) [with PointT = pcl::PointXYZRGBNormal; ros::NodeHandlePtr = boost::shared_ptr<ros::NodeHandle>; std::__cxx11::string = std::__cxx11::basic_string<char>]’:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/cloud_viewer.cpp:19:1: required from here
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/cloud_viewer.hpp:33:85: warning: unused parameter ‘node_handle’ [-Wunused-parameter]
void CloudViewer::setupConfigurationFromParameterServer(ros::NodeHandlePtr& node_handle, ros::NodeHandlePtr& private_node_handle, const std::string& configuration_namespace) {
^~~~~~~~~~~
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/pointcloud_conversions.h:32:0,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/pointcloud_conversions.hpp:9,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/pointcloud_conversions.cpp:10:
/usr/include/pcl-1.8/pcl/visualization/cloud_viewer.h:202:14: warning: ‘template<class> class std::auto_ptr’ is deprecated [-Wdeprecated-declarations]
std::auto_ptr<CloudViewer_impl> impl;
^~~~~~~~
In file included from /usr/include/c++/7/memory:80:0,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/pointcloud_conversions.h:12,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/pointcloud_conversions.hpp:9,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/pointcloud_conversions.cpp:10:
/usr/include/c++/7/bits/unique_ptr.h:51:28: note: declared here
template<typename> class auto_ptr;
^~~~~~~~
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:0:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp: In instantiation of ‘bool dynamic_robot_localization::RegistrationVisualizer<PointSource, PointTarget>::setRegistration(pcl::Registration<PointSource, PointTarget>&) [with PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal]’:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:52:1: required from here
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: error: no matching function for call to ‘pcl::Registration<pcl::PointXYZRGBNormal, pcl::PointXYZRGBNormal, float>::registerVisualizationCallback(std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>&)’
registration.registerVisualizationCallback(this->update_visualizer);
^~~~~~~~~~~~
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/registration_visualizer.h:41:0,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:39,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/usr/include/pcl-1.8/pcl/registration/registration.h:375:7: note: candidate: template<class FunctionSignature> bool pcl::Registration<PointSource, PointTarget, Scalar>::registerVisualizationCallback(boost::function<FunctionSignature>&) [with FunctionSignature = FunctionSignature; PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal; Scalar = float]
registerVisualizationCallback (boost::function<FunctionSignature> &visualizerCallback)
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/usr/include/pcl-1.8/pcl/registration/registration.h:375:7: note: template argument deduction/substitution failed:
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:0:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: note: ‘std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>’ is not derived from ‘boost::function<Signature>’
registration.registerVisualizationCallback(this->update_visualizer);
I don't know the reason; could you please help me to solve it? Thanks again.
Hi, Carlos Miguel:
I have installed:
ros-kinetic-laserscan-to-pointcloud_1.0.0-0xenial_amd64.deb
ros-kinetic-pcl_1.8.1-0xenial_amd64.deb
ros-kinetic-pose-to-tf-publisher_1.0.0-0xenial_amd64.deb
then I built drl and got the following error:
[ 98%] Linking CXX executable devel/lib/dynamic_robot_localization/drl_localization_node
devel/lib/libdrl_localization.so: undefined reference to ‘pcl::search::Search<pcl::PointXYZRGBNormal>::getName() const’
collect2: error: ld returned 1 exit status
What can I do to fix this problem?
Whenever you find the TF from the lidar frame using the ICP:
icp.getFinalTransformation()
, getting the matrix and converting it to a ROS TF, how do you then proceed to publish the map-to-odom TF? Is it like this?
final_transformation_as_tf = final_transformation_as_tf (the one you got from the matrix) * map_to_base_tf2 (from listening the TF);
relative_tf2 = sensor_to_base_TF.inverseTimes(odom_to_base_TF)
final_transformation_as_tf = final_transformation_as_tf * relative_tf2;
Is that correct?
I followed your instructions and applied "3 DoF robot localization / mapping" to my real robot successfully. (#18 (comment))
And I want to try 6 DoF robot localization now.
First, I wonder if I must use this file(https://github.com/carlosmccosta/dynamic_robot_localization_tests/blob/hydro-devel/launch/environments/labiomep/labiomep_6dof.launch).
Or should I use octo_map.launch?
Can you tell me which parameters to change and which launch file to use?
Hello, Carlos
I am very impressed by your package's performance.
I want to use this package on ROS2 as well.
Is there any way to use it?
Or do you have plans to support ROS2 for this package?
Hi,
If I have predefined static objects at specific locations of the occupancy map, is it possible to select a region of interest for the ICP? So instead of trying to match the whole visible map, it would only try to match the selected RoI of the map.
Hi,
I'm trying to use your package for a SLAM project, but I'm running into a weird issue when launching the localization node.
The drl_localization_node starts up but remains stuck in a loop, posting the message "Waiting for valid time...", which corresponds to this loop:
Have you ever run into this problem before? If so, could you be so kind as to provide some information in order to solve it?
Thanks!
Hi Carlos,
Sorry for asking so many questions, but I find your library fascinating and I have been learning and reading your code over the last week. Also, thank you for taking the time to answer my questions; I really appreciate it :)
The question I have is about datatypes you are using for the ICP algorithm.
From your code correct me if I am wrong:
Compiling on ARM takes a lot of time, especially PCL! What is the best way to compile PCL once and for all and save its binaries and libraries so that I can simply transfer them to another ARM machine? I tried to build PCL independently, install its binaries to some folder, and use that, but I get an error at:
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
CMake Error at /opt/ros/melodic/share/dynamic_reconfigure/cmake/dynamic_reconfigure-macros.cmake:80 (add_custom_target):
add_custom_target cannot create target "pcl_ros_gencfg" because another
target with the same name already exists. The existing target is a custom
target created in source directory "/home/abyl/right-wheel/src/laser_icp".
See documentation for policy CMP0002 for more details.
Call Stack (most recent call first):
perception_pcl/pcl_ros/CMakeLists.txt:70 (generate_dynamic_reconfigure_options)
Hi Carlos,
As the code for your library is quite substantial, can you please point me to the location where you use a hash table to store the map (reference cloud) against which you perform ICP and get faster convergence than a k-d tree implementation?
Also, I saw your version of ICP in the PCL fork, but how different is it from the standard PCL ICP implementation (assuming we use the k-d tree implementation)?
https://github.com/AndreaCensi/csm
error: no matching function for call to ‘pcl::Registration<pcl::PointXYZRGBNormal, pcl::PointXYZRGBNormal, float>::registerVisualizationCallback(std::function<void(const pcl::PointCloudpcl::PointXYZRGBNormal&, const std::vector&, const pcl::PointCloudpcl::PointXYZRGBNormal&, const std::vector&)>&)’
registration.registerVisualizationCallback(this->update_visualizer_);
Hi Carlos,
Thank you for providing this good package, but I have a problem when building it with ros-noetic.
I have installed your customized PCL library, but I still get this error; do you have any idea?
[ 30%] Building CXX object dynamic_robot_localization/CMakeFiles/drl_common.dir/src/common/registration_visualizer.cpp.o
In file included from /usr/include/boost/detail/endian.hpp:9,
                 from /usr/local/include/pcl-1.8/pcl/PCLPointCloud2.h:11,
                 from /usr/local/include/pcl-1.8/pcl/pcl_base.h:55,
                 from /usr/local/include/pcl-1.8/pcl/registration/registration.h:45,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/registration_visualizer.h:41,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:39,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/usr/include/boost/predef/detail/endian_compat.h:11:161: note: #pragma message: The use of BOOST_*_ENDIAN and BOOST_BYTE_ORDER is deprecated. Please include <boost/predef/other/endian.h> and use BOOST_ENDIAN_*_BYTE instead
In file included from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp: In instantiation of ‘bool dynamic_robot_localization::RegistrationVisualizer<PointSource, PointTarget>::setRegistration(pcl::Registration<PointSource, PointTarget>&) [with PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal]’:
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:52:1: required from here
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: error: no matching function for call to ‘pcl::Registration<pcl::PointXYZRGBNormal, pcl::PointXYZRGBNormal, float>::registerVisualizationCallback(std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>&)’
   70 |  registration.registerVisualizationCallback(this->update_visualizer_);
      |  ^~~~~~~~~~~~
In file included from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/registration_visualizer.h:41,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:39,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/usr/local/include/pcl-1.8/pcl/registration/registration.h:380:7: note: candidate: ‘template<class FunctionSignature> bool pcl::Registration<PointSource, PointTarget, Scalar>::registerVisualizationCallback(boost::function<FunctionSignature>&) [with FunctionSignature = FunctionSignature; PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal; Scalar = float]’
  380 |      registerVisualizationCallback (boost::function<FunctionSignature> &visualizerCallback)
      |      ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/usr/local/include/pcl-1.8/pcl/registration/registration.h:380:7: note: template argument deduction/substitution failed:
In file included from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: note: ‘std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>’ is not derived from ‘boost::function<Signature>’
   70 |  registration.registerVisualizationCallback(this->update_visualizer_);
      |  ^~~~~~~~~~~~
make[2]: *** [dynamic_robot_localization/CMakeFiles/drl_common.dir/build.make:193: dynamic_robot_localization/CMakeFiles/drl_common.dir/src/common/registration_visualizer.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:7231: dynamic_robot_localization/CMakeFiles/drl_common.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
Invoking "make -j8 -l8" failed
Hi, Carlos Miguel:
I have a problem with "catkin_make"; below is the result:
In file included from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:10:0:
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp: In instantiation of ‘void dynamic_robot_localization::PlaneSegmentation::filter(const typename pcl::PointCloud::Ptr&, typename pcl::PointCloud::Ptr&) [with PointT = pcl::PointXYZRGBNormal; typename pcl::PointCloud::Ptr = boost::shared_ptr<pcl::PointCloud<pcl::PointXYZRGBNormal> >]’:
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:19:1: required from here
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:124:157: error: call of overloaded ‘transformPointCloudWithNormals(pcl::PointCloud<pcl::PointXYZRGBNormal>&, pcl::PointCloud<pcl::PointXYZRGBNormal>&, const Eigen::internal::inverse_impl<Eigen::Matrix<float, 4, 4> >)’ is ambiguous
pcl::transformPointCloudWithNormals(*convex_hull_for_projected_plane_inliers, *convex_hull_for_projected_plane_inliers, centroid_matrix.inverse());
^
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:124:157: note: candidates are:
In file included from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/plane_segmentation.h:26:0,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:9,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:10:
/home/zcb/catkin_ws_drl/devel_release/include/pcl-1.8/pcl/common/transforms.h:145:3: note: void pcl::transformPointCloudWithNormals(const pcl::PointCloud&, pcl::PointCloud&, const Affine3f&, bool) [with PointT = pcl::PointXYZRGBNormal; Eigen::Affine3f = Eigen::Transform<float, 3, 2>]
transformPointCloudWithNormals (const pcl::PointCloud &cloud_in,
^
In file included from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/plane_segmentation.h:26:0,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:9,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:10:
/home/zcb/catkin_ws_drl/devel_release/include/pcl-1.8/pcl/common/transforms.h:318:3: note: void pcl::transformPointCloudWithNormals(const pcl::PointCloud&, pcl::PointCloud&, const Matrix4f&, bool) [with PointT = pcl::PointXYZRGBNormal; Eigen::Matrix4f = Eigen::Matrix<float, 4, 4>]
transformPointCloudWithNormals (const pcl::PointCloud &cloud_in,
^
make[2]: *** [dynamic_robot_localization-kinetic-devel/CMakeFiles/drl_cloud_filters.dir/src/cloud_filters/plane_segmentation.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [dynamic_robot_localization-kinetic-devel/CMakeFiles/drl_cloud_filters.dir/all] Error 2
make: *** [all] Error 2
Hi,
I want to combine AMCL and ICP to get a more accurate approach localization. My idea is to use AMCL with tracking_2d from DRL turned off while navigating, and then, once the distance to the goal is less than a threshold X, turn off AMCL and turn on DRL for the final approach.
Is there a simple way to turn DRL (tracking_2d) off completely and turn it back on with an initial position?
Thank you for your great work with this package, it looks really promising!
I recorded a 3D point cloud map of our environment beforehand, and now I'd like to localize our robot in said map.
Therefore I wanted to use the dynamic_robot_localization_system_6dof.launch launch file. But when starting it, after the map is loaded and the global pose estimate is published, all the new point clouds get dropped, because:
[ WARN] [1660297010.079226327]: Dropping pointcloud because tf between base_link and odom isn't available
Failed to find match for field 'rgb'.
Failed to find match for field 'normal_x'.
Failed to find match for field 'normal_y'.
Failed to find match for field 'normal_z'.
Failed to find match for field 'curvature'.
Is that a problem caused by the point cloud type I use? I am using a Robosense lidar and can choose between the XYZIRT and XYZI types, so there is no option for rgb... Am I still able to use this package somehow?
Dear Carlos,
I tried your outstanding work and its positioning performance was excellent. But I have now encountered a problem, and I have tried modifying the code and the configuration file, but neither solved it. I sincerely hope that you can point out the mistakes I have made.
The problem is that /ambient_pointcloud drifts farther and farther away from the wall, and the positioning data then also shows a few centimeters of error compared to the actual data. /ambient_pointcloud becomes less attached to the wall as the robot runs, while /aligned_pointcloud remains attached to the wall. I believe the positioning data should be derived from /aligned_pointcloud instead of /ambient_pointcloud, but in practice /ambient_pointcloud and the radar /scan stay attached, and the positioning data is also published based on /ambient_pointcloud.
In the picture, /aligned_pointcloud is shown in red and /ambient_pointcloud in white. It can be seen that /aligned_pointcloud is in contact with the wall, while /ambient_pointcloud has an error. This error gradually increases as the car runs; I guess it's due to compensation from the odometry. By the way, I confirmed that the odometry data is accurate.
/ambient_pointcloud and /scan are matched:
Only when I give it another initial pose value does everything match and become normal.
This is my TF data: one radar, one odom:
Here is my configuration file:
drl_configs.txt
launch file:
launch.txt
What can I do to make the localization data based on /aligned_pointcloud?
Wishing you a wonderful day!
Greetings Carlos,
after I cloned your repository and executed install.bash, I get the following error at the end:
Errors << dynamic_robot_localization:make /home/warehouse/catkin_ws_drl/logs_release/dynamic_robot_localization/build.make.002.log
make[2]: *** No rule to make target 'CMakeFiles/drl_common.dir/build'. Stop.
make[1]: *** [CMakeFiles/drl_common.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
make: *** [all] Error 2
cd /home/warehouse/catkin_ws_drl/build_release/dynamic_robot_localization; catkin build --get-env dynamic_robot_localization | catkin env -si /usr/bin/make --jobserver-fds=6,7 -j; cd -
...............................................................................
Failed << dynamic_robot_localization:make [ Exited with code 2 ]
Failed <<< dynamic_robot_localization [ 2.0 seconds ]
[build] Successful packages:
[Successful] PCL
[Successful] laserscan_to_pointcloud
[Successful] mesh_to_pointcloud
[   Ignored] octomap_mapping
[Successful] octomap_server
[Successful] pcl_conversions
[Successful] pcl_msgs
[Successful] pcl_ros
[   Ignored] perception_pcl
[Successful] pose_to_tf_publisher
[build] Failed packages:
[ Failed] dynamic_robot_localization
[build] Summary: 8 of 9 packages succeeded.
[build] Ignored: 2 packages were skipped or are blacklisted.
[build] Warnings: None.
[build] Abandoned: None.
[build] Failed: 1 packages failed.
[build] Runtime: 20.5 seconds total.
I'm running it on Kinetic, Ubuntu 16.04.
Any ideas of what might be causing the problem?
Thanks,
Juan Mesa
As far as I understood, this localization uses NDT-MCL. How does it compare with the traditional AMCL package?
Also, since the package focuses on 3D lidars, does it make sense to use it with only a 2D lidar?
And what is the reason for using PCL instead of laserscan data for 2D measurements?
Hi Carlos,
I am configuring the yaml file; can I know what the unit of "max_new_pose_diff_angle" is?
Degrees (out of 360°) or radians (out of 2π)?
Best regards
Sun Rentao
Hi Carlos,
Sometimes during localization the robot's pose jumps very far away. Is that because I need to further restrict the box search to avoid it?
Also, is it possible to update the map during navigation? I have very dynamic obstacles and I think I need to keep refreshing my map.
Hi,
I am trying to reproduce the 6 DoF tracking shown in Video 6 (free fly test with Kinect in the ETHZ RGB-D dataset) using the 6 DoF localization system.
After configuring the workspace and installing all packages, I executed the following commands from a terminal:
$ source ~/catkin_ws_drl/devel_release/setup.bash
$ cd ~/catkin_ws_drl/src/dynamic_robot_localization_tests/launch/environments/asl/bags
$ roslaunch ethzasl_kinect_dataset_high_complexity_slow_fly_movement.launch
Once RViz came up, I switched to the terminal and pressed the space bar.
Now the tracking gets visualized in RViz.
However, it does not look like the reference video (https://www.youtube.com/watch?v=UslDiUkm7wE).
The output I got is shared here: https://youtu.be/8FZZhAnhFvs
What could be the reason for this mismatch?
To make the reference map visible in RViz, I set reference_pointcloud_update_mode to FullIntegration for the unless="$(arg use_slam)" case in the ethzasl_kinect_dataset.launch file.
I am also getting a set of warnings and errors in the terminal, as below:
[ WARN] [1615551179.209201725, 1298112281.916861620]: No transform between frames map_ground_truth_corrected and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551179.213807044, 1298112281.916861620]: No transform between frames map and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551179.222639773, 1298112281.916861620]: No transform between frames map_drl and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551179.231428189, 1298112281.916861620]: No transform between frames odom and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551190.198542742, 1298112287.954772391]: Discarded cloud because localization couldn't be calculated
[ WARN] [1615551190.198679600, 1298112287.954772391]: Failed point cloud processing with error [PointCloudFilteringFailed]
[ WARN] [1615551191.853380178, 1298112289.606053195]: Failed point cloud processing with error [PointCloudFilteringFailed]
[ WARN] [1615551191.886966564, 1298112289.636235797]: Discarded cloud because localization couldn't be calculated
[ WARN] [1615551191.886997556, 1298112289.636235797]: Failed point cloud processing with error [PointCloudFilteringFailed]
[ERROR] [1615551191.919287026, 1298112289.666394334]: Lost tracking!
In initial attempts, I got warnings related to euclidean_transformation_validator parameters, like:
[ WARN] [1615551455.183423693, 1298112283.032384128]: EuclideanTransformationValidator rejected new pose at time 1298112283.032384128 due to max_outliers_percentage ->
correction translation: 3.12924 | (max_transformation_distance: 4)
correction rotation: 2.03959 | (max_transformation_angle: 0.1)
new pose diff translation: 3.12924 | (max_new_pose_diff_distance: 3.9)
new pose diff rotation: 2.03959 | (max_new_pose_diff_angle: 3.57)
root_mean_square_error: 0.0333355 | (max_root_mean_square_error: 0.05)
root_mean_square_error_reference_pointcloud: 0 | (max_root_mean_square_error_reference_pointcloud: -1)
outliers_percentage: 0.925824 | (max_outliers_percentage: 0.6)
outliers_percentage_reference_pointcloud: 0 | (max_outliers_percentage_reference_pointcloud: -1)
inliers_angular_distribution: 2 | (min_inliers_angular_distribution: 0.125)
outliers_angular_distribution: -2 | (max_outliers_angular_distribution: 0.875)
[pcl::IterativeClosestPointWithNormals::computeTransformation] Not enough correspondences found. Relax your threshold parameters.
To fix this, I modified values in ../dynamic_robot_localization/yaml/configs/pose_tracking/cluttered_environments_dynamic_large_map_3d.yaml as below:
transformation_validators:
  euclidean_transformation_validator:
    max_transformation_angle: 3.20
    max_transformation_distance: 4.0
    max_new_pose_diff_angle: 3.57
    max_new_pose_diff_distance: 3.90
    max_root_mean_square_error: 0.05
    max_outliers_percentage: 1.00
Please guide me to resolve this.
My configuration is:
Hi,
I decided to try out the default configuration of the guardian localization example. I am only using the front lidar and therefore edited the launch file a bit (just the lidar area).
At first, the localization was not able to find the correspondence points, and I thought this was due to the global localization problem. After I provided the initial pose, I get:
[ INFO] [1569164427.212248087]: Received initial pose at time [1569164427.180871518]:
Translation -> [ x: -1.34668 | y: -0.0939072 | z: 0 ]
Rotation -> [ qx: 0 | qy: 0 | qz: 0.055193 | qw: 0.998476 ]
[pcl::RandomSampleConsensus::computeModel] No samples could be selected!
[pcl::SampleConsensusModel::getSamples] Can not select 0 unique points out of 2!
[pcl::RandomSampleConsensus::computeModel] No samples could be selected!
[pcl::IterativeClosestPoint::computeTransformation] Not enough correspondences found. Relax your threshold parameters.
[ WARN] [1569164428.699073609]: Discarded cloud because localization couldn't be calculated
[ WARN] [1569164428.699200539]: Failed point cloud processing with error [FailedInitialPoseEstimation]
[ WARN] [1569164429.367911121]: Pose to TF publisher reached timeout for last valid pose
TF translation -> [ x: -1.34668 | y: -0.0939072 | z: 0 ]
TF orientation -> [ qx: 0 | qy: 0 | qz: 0.055193 | qw: 0.998476 ]
Current time: 1569164429.367611740
Last pose time: 1569164427.180871518
Last pose arrival time: 1569164427.268565237
[ WARN] [1569164430.468016846]: Pose to TF publisher reached timeout for last valid pose
TF translation -> [ x: -1.34668 | y: -0.0939072 | z: 0 ]
TF orientation -> [ qx: 0 | qy: 0 | qz: 0.055193 | qw: 0.998476 ]
Current time: 1569164430.467624702
Last pose time: 1569164427.180871518
Last pose arrival time: 1569164427.268565237
[ WARN] [1569164430.611402094]: The /scan observation buffer has not been updated for 1.31 seconds, and it should be updated every 1.20 seconds.
[pcl::IterativeClosestPoint2D::computeTransformation] Not enough correspondences found. Relax your threshold parameters.
[ WARN] [1569164430.830316015]: EuclideanTransformationValidator rejected new pose at time 1569164430.830252524 due to root_mean_square_error ->
correction translation: 0 | (max_transformation_distance: 0.5)
correction rotation: 0 | (max_transformation_angle_: 0.7)
new pose diff translation: 0 | (max_new_pose_diff_distance_: 0.9)
new pose diff rotation: 0 | (max_new_pose_diff_angle_: 1.1)
root_mean_square_error: 1.79769e+308 | (max_root_mean_square_error_: 0.1)
root_mean_square_error_reference_pointcloud: 0 | (max_root_mean_square_error_reference_pointcloud_: -1)
outliers_percentage: 1 | (max_outliers_percentage_: 0.85)
outliers_percentage_reference_pointcloud: 0 | (max_outliers_percentage_reference_pointcloud_: -1)
inliers_angular_distribution: 2 | (min_inliers_angular_distribution_: 0.07)
outliers_angular_distribution: -2 | (max_outliers_angular_distribution_: 0.875)
[ WARN] [1569164430.830735712]: Discarded cloud because localization couldn't be calculated
[ WARN] [1569164430.830845610]: Failed point cloud processing with error [FailedPoseEstimation]
Is that really the intended parameter choice?
Also, I have noticed that the point cloud produced by laserscan_to_pointcloud (/guardian/lasers) works fine, but no node subscribes to it, so I am not sure what its purpose is. The other point cloud, produced by the drl node (ambient_pointcloud), works fine.
Here is another warning:
[ WARN] [1569224041.915014225]: EuclideanTransformationValidator rejected new pose at time 1569224041.914963026 due to max_new_pose_diff_angle ->
correction translation: 0.0208703 | (max_transformation_distance: 0.5)
correction rotation: 0.00947034 | (max_transformation_angle_: 0.7)
new pose diff translation: 0.0208703 | (max_new_pose_diff_distance_: 0.9)
new pose diff rotation: 0.00947034 | (max_new_pose_diff_angle_: 1.1)
root_mean_square_error: 0.0307908 | (max_root_mean_square_error_: 0.1)
root_mean_square_error_reference_pointcloud: 0 | (max_root_mean_square_error_reference_pointcloud_: -1)
outliers_percentage: 0.348315 | (max_outliers_percentage_: 0.85)
outliers_percentage_reference_pointcloud: 0 | (max_outliers_percentage_reference_pointcloud_: -1)
inliers_angular_distribution: 2 | (min_inliers_angular_distribution_: 0.07)
outliers_angular_distribution: -2 | (max_outliers_angular_distribution_: 0.875)
I am not sure why this warning appears, since max_new_pose_diff_angle (1.1) is higher than the new pose diff rotation (0.00947034).
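For anyone tuning these rejections: the thresholds printed in the warnings above live in the localization yaml configuration. A minimal sketch of how they might be adjusted, assuming parameter names matching the log output and a `transformation_validators` namespace (the exact nesting may differ in the configs shipped with your version of the package, so verify against your installed yaml files):

```yaml
# Hypothetical excerpt of a dynamic_robot_localization yaml config.
# Parameter names are taken from the warning output above; the namespace
# layout is an assumption and may differ from the shipped configs.
transformation_validators:
  euclidean_transformation_validator:
    max_transformation_distance: 0.5    # meters of allowed pose correction
    max_transformation_angle: 0.7       # radians of allowed pose correction
    max_new_pose_diff_distance: 0.9     # meters between consecutive poses
    max_new_pose_diff_angle: 1.1        # radians between consecutive poses
    max_root_mean_square_error: 0.1     # RMSE of inlier correspondences
    max_outliers_percentage: 0.85       # allowed fraction of outlier points
```

Raising max_root_mean_square_error, for example, would stop the first rejection shown above (RMSE of 1.79769e+308 indicates registration never converged, so relaxing the matcher's correspondence thresholds is likely needed as well).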
Hi Carlos!
First of all, thanks for sharing your research work, which is quite fascinating.
I've successfully installed from the source and recently I'm looking through your codes. I try to implement your algorithm on my own 2-D laser data to accomplish 3 DoF (position_x, position_y, angle_yaw) localization tasks. However, I have a few questions which I hope you could clarify for me:
By the way, I wrongly posted the issue #17 and you can delete it. Sorry about this.
Thanks a lot and looking forward to your early reply!
Best Regards,
William
Hi, I would like to perform localization of a lidar moving through a space that was also recorded as a PCL point cloud. I checked the big config file and I haven't seen any option for uploading a map. Where can I find a good launch file for this? Maybe it would be better to use the dynamic_robot_localization_tests
package?
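In case it helps others with the same question: the reference map is typically supplied to the localization node as a parameter pointing at a point cloud file, rather than through a dedicated "map upload" option. A hypothetical sketch, assuming parameter names modeled on the package's yaml configs (the exact keys may differ, so check the configs in your installed version):

```yaml
# Hypothetical sketch: pointing the localization node at a reference map.
# Both parameter names below are assumptions and should be verified
# against the yaml files shipped with the package.
reference_pointclouds:
  reference_pointcloud_filename: "/path/to/recorded_map.pcd"  # pre-recorded PCL map
  reference_pointcloud_type: "3D"                             # 2D or 3D reference map
```

The dynamic_robot_localization_tests package is indeed a reasonable place to look, since its launch files show complete working configurations that load a reference map.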
Regards