als_ros's Introduction

als_ros

als_ros (An Advanced Localization System for ROS use) [1] is a 2D-LiDAR-based localization package for the Robot Operating System. It provides the following functions:

  • Robust localization based on sensor measurement class estimation [2],
  • Reliability estimation based on Bayesian filtering with a simple classifier of localization correctness [3],
  • Misalignment recognition using Markov random fields with fully connected latent variables [4],
  • Quick re-localization that fuses pose tracking and global localization via importance sampling [5].

Details are described in Reliable Monte Carlo Localization for Mobile Robots (arXiv preprint).

Demonstration video showing a comparison of als_ros with ROS amcl.

How to install and use

How to install

A ROS environment needs to be installed first. als_ros has been confirmed to work on Ubuntu 18.04 with Melodic and on Ubuntu 20.04 with Noetic.

als_ros can be installed with the following commands.

$ git clone https://github.com/NaokiAkai/als_ros.git
$ cd als_ros
$ catkin_make
$ source devel/setup.bash

The cloned directory is itself a ROS workspace. If you do not want to create a new workspace for als_ros, copy the als_ros package into your existing workspace instead.

How to use

The following messages (topics) need to be published:

  • sensor_msgs::LaserScan (/scan)
  • nav_msgs::Odometry (/odom)
  • nav_msgs::OccupancyGrid (/map)

The names in parentheses are the default topic names.
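If your robot publishes on different topic names, they can be remapped when including mcl.launch from your own launch file. This is a hedged sketch: the topics /front_laser/scan and /wheel_odom are placeholders, not part of als_ros; top-level remaps in roslaunch also apply to nodes started by included files.

```xml
<launch>
  <!-- Remap the default topics to your robot's topics; the "to" names
       below are placeholders, not part of als_ros. -->
  <remap from="/scan" to="/front_laser/scan" />
  <remap from="/odom" to="/wheel_odom" />
  <include file="$(find als_ros)/launch/mcl.launch" />
</launch>
```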

In addition, the static transform between the following two frames needs to be set:

  • origin of a robot (base_link)
  • 2D LiDAR (laser)

The names in parentheses are the default frame names.
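If the base_link to laser transform is not already published, it can be provided with a static transform publisher. A minimal sketch, assuming the LiDAR is mounted 0.2 m ahead of and 0.1 m above base_link; replace the offsets with your robot's actual mounting pose (the arguments are x y z yaw pitch roll parent child period_in_ms):

```xml
<launch>
  <!-- Static base_link -> laser transform; the offsets are placeholders. -->
  <node pkg="tf" type="static_transform_publisher" name="base_link_to_laser"
        args="0.2 0.0 0.1 0.0 0.0 0.0 base_link laser 100" />
</launch>
```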

Launch files are provided in the als_ros package. The topic and frame names can be changed in mcl.launch.

After setting up the topics and the transform, localization can be started with mcl.launch.

$ roslaunch als_ros mcl.launch

By default, pose tracking with the robust localization and reliability estimation techniques presented in [2, 3] is executed.

If you want to fuse pose tracking and global localization, set the use_gl_pose_sampler flag to true.

$ roslaunch als_ros mcl.launch use_gl_pose_sampler:=true

In als_ros, global localization is implemented using the free-space features presented in [6].

If you want to estimate the localization failure probability with misalignment recognition, set the use_mrf_failure_detector flag to true.

$ roslaunch als_ros mcl.launch use_mrf_failure_detector:=true
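The two options can presumably be enabled together. As a sketch (not verified here), a small launch file that includes mcl.launch with both flags set, using the argument names from the commands above:

```xml
<launch>
  <!-- Enable both global-localization fusion and the MRF failure detector. -->
  <include file="$(find als_ros)/launch/mcl.launch">
    <arg name="use_gl_pose_sampler" value="true" />
    <arg name="use_mrf_failure_detector" value="true" />
  </include>
</launch>
```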

Parameter descriptions

Descriptions of all parameters are given in the launch files. A more detailed document is planned.

Citation

An arXiv preprint and a journal paper are available. If you use als_ros in your research, please cite the journal paper.

Naoki Akai. "Reliable Monte Carlo Localization for Mobile Robots," Journal of Field Robotics, vol. 40, no. 3, pp. 595-613, 2023.
@article{Akai2023JFR:ReliableMC,
    title = {Reliable Monte Carlo Localization for Mobile Robots},
    author = {Akai, Naoki},
    journal = {Journal of Field Robotics},
    volume = {40},
    number = {3},
    pages = {595--613},
    year = {2023}
}

References

[1] Naoki Akai. "An advanced localization system using LiDAR: Performance improvement of localization for mobile robots and its implementation," CORONA PUBLISHING CO., LTD (to appear, in Japanese).

[2] Naoki Akai, Luis Yoichi Morales, and Hiroshi Murase. "Mobile robot localization considering class of sensor observations," In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3159-3166, 2018.

[3] Naoki Akai, Luis Yoichi Morales, and Hiroshi Murase. "Simultaneous pose and reliability estimation using convolutional neural network and Rao-Blackwellized particle filter," Advanced Robotics, vol. 32, no. 17, pp. 930-944, 2018.

[4] Naoki Akai, Luis Yoichi Morales, Takatsugu Hirayama, and Hiroshi Murase. "Misalignment recognition using Markov random fields with fully connected latent variables for detecting localization failures," IEEE Robotics and Automation Letters, vol. 4, no. 4, pp. 3955-3962, 2019.

[5] Naoki Akai, Takatsugu Hirayama, and Hiroshi Murase. "Hybrid localization using model- and learning-based methods: Fusion of Monte Carlo and E2E localizations via importance sampling," In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 6469-6475, 2020.

[6] Alexander Millane, Helen Oleynikova, Juan Nieto, Roland Siegwart, and César Cadena. "Free-space features: Global localization in 2D laser SLAM using distance function maps," In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1271-1277, 2019.

SLAMER

A node for Simultaneous Localization And Map-assisted Environment Recognition (SLAMER) is also implemented in this repository; however, it is not officially supported. Please refer to the SLAMER paper, presented at ICRA 2023, for details.

License

Apache License 2.0

als_ros's People

Contributors

h-wata, naokiakai, zzuxzt


als_ros's Issues

High CPU usage for gl_pose_sampler and mrf_failure_detector node

Hi,

I have implemented this package on a simulated robot. With use_mrf_failure_detector:=true and use_gl_pose_sampler:=true, I noticed that the gl_pose_sampler and mrf_failure_detector nodes consume a lot of CPU, about 100% on each of two CPU cores. Is this normal? Is there a way to reduce the CPU usage?

Thanks.

positioning drift

During localization, the laser scan does not match the map well, even though the mapping environment is identical to the localization environment, without any change. The particles are also scattered from the start.

Could not use als_ros with navigation stack

System information

  • Ubuntu 18.04
  • ROS Melodic

Detailed description

I hope to replace amcl with als_ros, so I tried it on the Gazebo simulator.
However, the following error occurs.

[ INFO] [1654619677.594540743, 18.175000000]: Got new plan
[ERROR] [1654619677.594762760, 18.176000000]: Extrapolation Error: Lookup would require extrapolation into the future.  Requested time 18.151000000 but the latest data is at time 17.952000000, when looking up transform from frame [odom] to frame [map]

[ERROR] [1654619677.594791175, 18.176000000]: Global Frame: odom Plan Frame size 89: map

Steps to reproduce

Disable AMCL

I changed /opt/ros/melodic/share/turtlebot3_navigation/launch/turtlebot3_navigation.launch to disable amcl.

Launch Gazebo

roslaunch turtlebot3_gazebo turtlebot3_world.launch

Launch als_ros

roslaunch als_ros mcl.launch laser_frame:=base_scan

Launch turtlebot3_navigation

roslaunch turtlebot3_navigation turtlebot3_navigation.launch

Investigation

amcl has a transform_tolerance parameter for the coordinate transform published via tf.

However, it seems that als_ros does not have this parameter.

mclPoseStamp_ = scan_.header.stamp;

So, I changed als_ros to resolve this problem.

--- a/src/als_ros/include/als_ros/MCL.h
+++ b/src/als_ros/include/als_ros/MCL.h
@@ -511,7 +511,8 @@ public:
         if (rejectUnknownScan_ && (measurementModelType_ == 0 || measurementModelType_ == 1))
             rejectUnknownScan();
 
-        mclPoseStamp_ = scan_.header.stamp;
+        ros::Duration transform_tolerance(1.0); // TODO: This value should be set as a parameter.
+        mclPoseStamp_ = scan_.header.stamp + transform_tolerance;

As a result, I could use als_ros with navigation stack.

Estimating pose drift

People block part of the laser scan, and the estimated pose keeps changing. How should this be dealt with?

ROS2 release

Hi

Is there any ROS2 release for als_ros? I would like to try it out with ROS2.

Best,
Samuel

Build Error

I got the errors below when building als_ros with catkin build.

Errors     << als_ros:make /home/gisen/ros_workspaces/alk_ws/logs/als_ros/build.make.002.log
CMakeFiles/classifier_dataset_generator.dir/src/classifier_dataset_generator.cpp.o: In function `als_ros::ClassifierDatasetGenerator::ClassifierDatasetGenerator()':
classifier_dataset_generator.cpp:(.text._ZN7als_ros26ClassifierDatasetGeneratorC2Ev[_ZN7als_ros26ClassifierDatasetGeneratorC5Ev]+0x1a4): undefined reference to `cv::Mat::Mat()'
CMakeFiles/classifier_dataset_generator.dir/src/classifier_dataset_generator.cpp.o: In function `als_ros::ClassifierDatasetGenerator::mapCB(boost::shared_ptr<nav_msgs::OccupancyGrid_<std::allocator<void> > const> const&)':
classifier_dataset_generator.cpp:(.text._ZN7als_ros26ClassifierDatasetGenerator5mapCBERKN5boost10shared_ptrIKN8nav_msgs14OccupancyGrid_ISaIvEEEEE[_ZN7als_ros26ClassifierDatasetGenerator5mapCBERKN5boost10shared_ptrIKN8nav_msgs14OccupancyGrid_ISaIvEEEEE]+0x22d): undefined reference to `cv::Mat::Mat(int, int, int)'
classifier_dataset_generator.cpp:(.text._ZN7als_ros26ClassifierDatasetGenerator5mapCBERKN5boost10shared_ptrIKN8nav_msgs14OccupancyGrid_ISaIvEEEEE[_ZN7als_ros26ClassifierDatasetGenerator5mapCBERKN5boost10shared_ptrIKN8nav_msgs14OccupancyGrid_ISaIvEEEEE]+0x4ed): undefined reference to `cv::Mat::Mat(int, int, int)'
collect2: error: ld returned 1 exit status
make[2]: *** [/home/gisen/ros_workspaces/alk_ws/devel/.private/als_ros/lib/als_ros/classifier_dataset_generator] Error 1
make[1]: *** [CMakeFiles/classifier_dataset_generator.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
CMakeFiles/mae_classifier_learning.dir/src/mae_classifier_learning.cpp.o: In function `als_ros::ClassifierDatasetGenerator::ClassifierDatasetGenerator()':
mae_classifier_learning.cpp:(.text._ZN7als_ros26ClassifierDatasetGeneratorC2Ev[_ZN7als_ros26ClassifierDatasetGeneratorC5Ev]+0x1a4): undefined reference to `cv::Mat::Mat()'
collect2: error: ld returned 1 exit status
make[2]: *** [/home/gisen/ros_workspaces/alk_ws/devel/.private/als_ros/lib/als_ros/mae_classifier_learning] Error 1
make[1]: *** [CMakeFiles/mae_classifier_learning.dir/all] Error 2
make: *** [all] Error 2

I think ${OpenCV_LIBRARIES} should be added to target_link_libraries in CMakeLists.txt.
I will send a PR; please confirm it.

Issue with OpenCV on ROS Noetic on Nvidia Jetson Xavier NX (Jetpack 5.1.1)

Hello,
I'm trying to use als_ros on Noetic, but I can't run map_server and als_ros at the same time.

When I run:
roslaunch als_ros mcl.launch
I don't get any error until I run the following command in another terminal:
rosrun map_server map_server <myMap>

I get the following error in the terminal where I ran als_ros:

terminate called after throwing an instance of 'cv::Exception'
  what():  OpenCV(4.5.4) /home/ubuntu/build_opencv/opencv/modules/core/src/matrix.cpp:250: error: (-215:Assertion failed) s >= 0 in function 'setSize'

[mcl-2] process has died [pid 21493, exit code -6, cmd /home/nvidia/catkin_ws/devel/lib/als_ros/mcl __name:=mcl __log:=/home/nvidia/.ros/log/e577bfa6-263f-11ee-95f6-48b02d059d63/mcl-2.log].
log file: /home/nvidia/.ros/log/e577bfa6-263f-11ee-95f6-48b02d059d63/mcl-2*.log

And home/nvidia/.ros/log/e577bfa6-263f-11ee-95f6-48b02d059d63/mcl-2.log does not exist.

Any description regarding SLAMER

Hi

I found that there is a newly added SLAMER node, but I couldn't find any description or elaboration regarding it. Can you briefly explain its functions?

Best,
Samuel

How to use als_ros package correctly?

I can run the navigation function with AMCL, but when I use mcl.launch instead of amcl.launch, this error occurs:

Timed out waiting for transform from base_footprint to map to become available before running costmap, tf error: canTransform: target_frame map does not exist.

I didn't make any changes to mcl.launch. I hope someone can help me. o(╥﹏╥)o

Does it support 180 degree lidar?

Does the mcl localizer support a 180 degree lidar? In my setup I tried to replace amcl with it, but the robot started rotating and shifting strongly in the Gazebo simulation. Thank you very much!

using als_ros when no known map and slamer mode

Hello,

I have been practising with als_ros in a simulation environment. It has been working very nicely.

Do we have to have a map when using als_ros? Is it possible to use it when there is no map file?

Also, what is SLAMER mode and how can we use it?

Best Regards.
C.

Yaw drift in narrow areas

Hi Naoki,
Thank you for open sourcing the great work.
We are trying als_ros for map-based localization with a 2D lidar with a 10 m range.
However, when the person carrying the lidar turns in a narrow cubicle, als_ros drifts significantly in orientation, as shown in the attached video.

Untitled.Project.mp4

Could you please instruct us on how to mitigate this issue?
We tried bumping the number of particles up to 1000, to no avail.
Surprisingly, AMCL in ROS 1 did not suffer from this problem.
An early reply would be deeply appreciated, as our project's deadline is approaching quickly.

porting to ROS2, or ROS2 equivalent for the package

Hello NaokiAkai,

First of all, thanks for the hard work. How difficult would it be to port this package to ROS2? I have gone through the learning curve of ROS2 and have been able to port a few simple things, but I cannot estimate how difficult it would be to get als_ros running on ROS2.

Best Regards,
C.A.

TF_NAN_INPUT error on launch

Hello,

I am trying to get als_ros working, but as soon as I run mcl.launch I get the error below, on Noetic. The tf published by als_ros is also filled only with nan values.

I'm not really sure where the error could be coming from; everything is set up correctly from odom down, and I can run AMCL with navigation without problems. Do you have any idea where I could look further? I have not modified mcl.launch at all.

[ERROR] [1684233719.104600625]: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of an invalid quaternion in the transform (nan nan nan nan)
process[mcl-1]: started with pid [79721]
[ INFO] [1684231427.306997742]: MCL is ready to perform

MCL: x = nan [m], y = nan [m], yaw = nan [deg]
Odom: x = 0.844567 [m], y = 0.892555 [m], yaw = 5.77389 [deg]
total likelihood = nan
average likelihood = nan
max likelihood = nan
effective sample size = nan
reliability = 0.0354446 (mae = 0.36875 [m], th = 0.225176 [m])
Error:   TF_NAN_INPUT: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of a nan value in the transform (nan nan nan) (nan nan nan nan)
         at line 240 in /tmp/binarydeb/ros-noetic-tf2-0.7.6/src/buffer_core.cpp
Error:   TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of an invalid quaternion in the transform (nan nan nan nan)
         at line 255 in /tmp/binarydeb/ros-noetic-tf2-0.7.6/src/buffer_core.cpp

MCL: x = nan [m], y = nan [m], yaw = nan [deg]
Odom: x = 0.844523 [m], y = 0.892563 [m], yaw = 5.77502 [deg]
total likelihood = nan
average likelihood = nan
max likelihood = nan
effective sample size = nan
reliability = 0.00134853 (mae = 0.373445 [m], th = 0.225176 [m])
Error:   TF_NAN_INPUT: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of a nan value in the transform (nan nan nan) (nan nan nan nan)
         at line 240 in /tmp/binarydeb/ros-noetic-tf2-0.7.6/src/buffer_core.cpp
Error:   TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of an invalid quaternion in the transform (nan nan nan nan)
         at line 255 in /tmp/binarydeb/ros-noetic-tf2-0.7.6/src/buffer_core.cpp

MCL: x = nan [m], y = nan [m], yaw = nan [deg]
Odom: x = 0.844484 [m], y = 0.892575 [m], yaw = 5.77925 [deg]
total likelihood = nan
average likelihood = nan
max likelihood = nan
effective sample size = nan
reliability = 0.0001 (mae = 0.364485 [m], th = 0.225176 [m])
Error:   TF_NAN_INPUT: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of a nan value in the transform (nan nan nan) (nan nan nan nan)
         at line 240 in /tmp/binarydeb/ros-noetic-tf2-0.7.6/src/buffer_core.cpp
Error:   TF_DENORMALIZED_QUATERNION: Ignoring transform for child_frame_id "odom" from authority "unknown_publisher" because of an invalid quaternion in the transform (nan nan nan nan)
         at line 255 in /tmp/binarydeb/ros-noetic-tf2-0.7.6/src/buffer_core.cpp
