stereo_slam's Introduction

Stereo SLAM

stereo_slam is a ROS node that performs Simultaneous Localization And Mapping (SLAM) using a single stereo camera. The algorithm was designed and tested for underwater robotics. The node is based on the G2O library for graph optimization and uses libhaloc to find loop closures between graph nodes. It uses a keyframe-to-multi-keyframe loop-closing mechanism, based on keypoint clustering, to improve the SLAM corrections in feature-poor environments.

You can see it in action here:

Stereo-SLAM video

Specially designed for underwater scenarios:

Video: ORB-SLAM vs Stereo SLAM (underwater)

More videos...

Video: Stereo SLAM and 3D reconstruction and Video: Stereo SLAM at UIB outdoor pond

Related paper

ICRA'16, nominated for the best student paper award

CITATION:

@INPROCEEDINGS{7487416,
  author={P. L. Negre and F. Bonin-Font and G. Oliver},
  booktitle={2016 IEEE International Conference on Robotics and Automation (ICRA)},
  title={Cluster-based loop closing detection for underwater slam in feature-poor regions},
  year={2016},
  pages={2589-2595},
  keywords={SLAM (robots);autonomous underwater vehicles;feature extraction;image matching;image registration;mobile robots;object detection;path planning;robot vision;AUV navigation;Balearic Islands;autonomous underwater vehicle;cluster-based loop closing detection;feature matching;feature-poor underwater environment;marine environments;sandbanks;seagrass;simultaneous localization and mapping;underwater SLAM;vision-based localization systems;visual features;visual information;visual keypoint clustering;visual registration;Cameras;Feature extraction;Image edge detection;Pipelines;Rocks;Simultaneous localization and mapping;Visualization},
  doi={10.1109/ICRA.2016.7487416},
  month={May}
}

Installation (Ubuntu + ROS)

  1. Install the dependencies
sudo apt-get install ros-<your ros distro>-libg2o

sudo apt-get install libceres-dev

You also need to set up a stereo visual odometry node (e.g. viso2 or fovis).
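
If viso2 is not already available, a minimal sketch of getting it, assuming your ROS distro ships it as a binary package (otherwise it can be built from source, e.g. from https://github.com/srv/viso2):

sudo apt-get install ros-<your ros distro>-viso2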

  2. Download the code into your ROS workspace and compile it:
roscd
git clone https://github.com/srv/stereo_slam.git
cd stereo_slam
rosmake
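
Note that stereo_slam also links against libhaloc (see the Introduction and the build error reported in the Issues below), so libhaloc must be compiled as well. If your workspace is catkin-based rather than rosbuild-based (an assumption about your setup; the workspace path is only an example), the equivalent steps would be roughly:

# clone libhaloc alongside stereo_slam so catkin can resolve the dependency
cd ~/catkin_ws/src
git clone https://github.com/srv/libhaloc.git
git clone https://github.com/srv/stereo_slam.git
cd ~/catkin_ws
catkin_make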

Parameters

  • refine - Refine the odometry (true/false).
  • distance_between_keyframes - Minimum distance (m) between keyframes.
  • working_directory - Directory where all output files will be stored.
  • feature_detector_selection - Feature detector to use (ORB or SIFT).
  • lc_min_inliers - Minimum number of inliers to close a loop.
  • lc_epipolar_thresh - Maximum reprojection error allowed.
  • map_frame_name - Frame id of the SLAM output.
  • lc_neighbors - Number of neighbors to recover in order to increase the chances of closing a loop.
  • lc_discard_window - Window size of discarded vertices.
  • ransac_iterations - Number of RANSAC iterations for solvePnPRansac.
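
The parameters above can also be set from the command line when running the node directly, as an alternative to the launch file shown later; the node type (localization) is taken from that launch file and the values here are only illustrative:

rosrun stereo_slam localization \
    odom:=/stereo_odometer/odometry \
    _feature_detector_selection:=ORB \
    _lc_min_inliers:=30 \
    _distance_between_keyframes:=0.5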

Subscribed topics

  • odom - Input odometry (type nav_msgs::Odometry).
  • left_image_rect_color - Left image, color and rectified (type sensor_msgs::Image).
  • right_image_rect_color - Right image, color and rectified (type sensor_msgs::Image).
  • left_camera_info - Extrinsic and intrinsic parameters of the left camera (type sensor_msgs::CameraInfo).
  • right_camera_info - Extrinsic and intrinsic parameters of the right camera (type sensor_msgs::CameraInfo).

Published Topics

  • /stereo_slam/odometry - The odometry of the robot (type nav_msgs::Odometry).
  • /stereo_slam/graph_poses - The updated graph poses (type stereo_slam::GraphPoses).
  • /stereo_slam/graph_camera_odometry - Absolute pose of the camera (type nav_msgs::Odometry).
  • /stereo_slam/graph_robot_odometry - Absolute pose of the vehicle (type nav_msgs::Odometry).
  • /stereo_slam/keyframes - Number of inserted keyframes (type std_msgs::Int32).
  • /stereo_slam/keypoints_clustering - Image containing the keypoint clusters (type sensor_msgs::Image).
  • /stereo_slam/stereo_matches_num - Number of correspondences between the stereo pair (type std_msgs::Int32).
  • /stereo_slam/stereo_matches_img - Correspondences between the stereo pair (type sensor_msgs::Image).
  • /stereo_slam/num_clusters - Number of clusters (type std_msgs::Int32).
  • /stereo_slam/loop_closing_inliers_img - Image of the loop closing correspondences. Correspondences are keyframe-to-multi-keyframe (type sensor_msgs::Image).
  • /stereo_slam/loop_closing_matches_num - Number of matches of each loop closing (type std_msgs::Int32).
  • /stereo_slam/loop_closing_inliers_num - Number of inliers of each loop closing (type std_msgs::Int32).
  • /stereo_slam/loop_closing_queue - Number of keyframes waiting in the loop closing queue. Monitor this topic to check real-time performance: if this number grows indefinitely, your system cannot process all the keyframes and you should scale down your images (a monitoring example follows this list) (type std_msgs::String).
  • /stereo_slam/loop_closings_num - Number of loop closings found (type std_msgs::Int32).
  • /robot_0/stereo_slam/time_tracking - The elapsed time of each iteration of the tracking thread (type stereo_slam::TimeTracking).
  • /robot_0/stereo_slam/time_graph - The elapsed time of each iteration of the graph thread (type stereo_slam::TimeGraph).
  • /robot_0/stereo_slam/time_loop_closing - The elapsed time of each iteration of the loop closing thread (type stereo_slam::TimeLoopClosing).
  • /robot_0/stereo_slam/sub_time_loop_closing - The elapsed time in certain processes of the loop closing thread (type stereo_slam::SubTimeLoopClosing).
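
A quick way to monitor the queue and counters listed above while the node runs (standard rostopic tools, no extra setup assumed):

# keyframe backlog of the loop closing thread (should stay close to zero)
rostopic echo /stereo_slam/loop_closing_queue
# number of loop closings found so far
rostopic echo /stereo_slam/loop_closings_num
# keyframe insertion rate
rostopic hz /stereo_slam/keyframes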

Run the node

You can run the node using the following launch file (for better performance, scale down your images if they are wider than 960 px).

<launch>
  <arg name = "camera"        default = "/stereo"/>
  <arg name = "robot_name"  default = "robot_0"/>

  <!-- Run the stereo image proc -->
  <node ns = "$(arg camera)" pkg = "stereo_image_proc" type = "stereo_image_proc" name = "stereo_image_proc" />

  <node pkg = "viso2_ros" type = "stereo_odometer" name = "stereo_odometer">
    <remap from = "stereo" to = "$(arg camera)"/>
    <remap from = "image" to = "image_rect"/>
  </node>

  <node pkg="stereo_slam" type="localization" name="stereo_slam" output="screen" if = "$(eval enable_decimate_x1 == true)">
    <remap from = "odom"                        to = "stereo_odometer/odometry"/>
    <remap from = "left_camera_info"            to = "/$(arg camera)/left/camera_info"/>
    <remap from = "right_camera_info"           to = "/$(arg camera)/right/camera_info"/>
    <remap from = "left_image_rect_color"       to = "/$(arg camera)/left/image_rect_color"/>
    <remap from = "right_image_rect_color"      to = "/$(arg camera)/right/image_rect_color"/> 
    <param name = "refine"                      value = "false"/>
    <param name = "distance_between_keyframes"  value = "0.5"/>
    <param name = "feature_detector_selection"  value = "ORB"/>
    <param name = "lc_min_inliers"              value = "30"/>
    <param name = "lc_epipolar_thresh"          value = "1.0"/>
    <param name = "map_frame_name"              value = "/$(arg robot_name)/map"/>
    <param name = "lc_neighbors"                value = "5"/>
    <param name = "lc_discard_window"           value = "20"/>
    <param name = "ransac_iterations"           value = "150"/>
  </node>
</launch>
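
Assuming the launch file above is saved as launch/demo.launch inside the stereo_slam package (the file name is only an example), it can be started with:

roslaunch stereo_slam demo.launch camera:=/stereo robot_name:=robot_0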

Saved data

The node stores some data in the stereo_slam directory during execution:

  • haloc - A folder containing all the files needed for the libhaloc library, which is responsible for loop closing detection. You do not need this folder at all.
  • keyframes - Stores the left stereo image for every keyframe (with the possibility of drawing the keypoint clustering over the image).
  • loop_closures - Stores all the images published in the topic /stereo_slam/loop_closing_inliers_img.

Online graph viewer

The node provides a Python script, scripts/graph_viewer.py, to visualize the results of stereo_slam during execution:

usage: 	graph_viewer.py [-h]
	ground_truth_file visual_odometry_file
	graph_vertices_file graph_edges_file

or (automatically detect the graph files):

roscd stereo_slam
./scripts/graph_viewer.py

The odometry file can be recorded directly from ROS using:

rostopic echo -p /your_odometry_topic/odometry > odometry.txt

The ground truth file must have the same format as the odometry file.
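
A possible end-to-end sketch, where the topic names, the ground truth source and the graph file names are only placeholders (the actual graph files are written by the node to its working directory):

rostopic echo -p /stereo_odometer/odometry > odometry.txt
rostopic echo -p /my_ground_truth/odometry > ground_truth.txt
roscd stereo_slam
./scripts/graph_viewer.py ground_truth.txt odometry.txt graph_vertices.txt graph_edges.txt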

Post processing

The node provides a Python script, scripts/slam_evaluation.py, to evaluate the results of stereo_slam once the execution finishes:

usage: slam_evaluation.py [-h]
	ground_truth_file visual_odometry_file
	graph_vertices_file graph_edges_file

This script performs a set of operations to evaluate the performance of the stereo_slam algorithm (an example invocation follows the list):

  1. Align all the trajectories (ground truth, visual odometry and graph vertices) to the same origin.

  2. Compute the average translation error.

  3. Plot the error vs trajectory distance.
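
An example invocation, using the same placeholder file names as in the graph viewer example above:

roscd stereo_slam
./scripts/slam_evaluation.py ground_truth.txt odometry.txt graph_vertices.txt graph_edges.txt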

stereo_slam's People

Contributors

bomiquel, francescbonin, miquelmassot, plnegre, rosusersrv

stereo_slam's Issues

libhaloc error while compiling

Project 'stereo_slam' tried to find library 'haloc'. The library is
neither a target nor built/installed properly. Did you compile project
'libhaloc'? Did you find_package() it before the subdirectory containing
its code is included?
Call Stack (most recent call first):
CMakeLists.txt:33 (find_package)

Reconstruction.yaml missing from stereo slam package

I am trying to use the stereo_slam package, and by following the instructions, I completed the installation. I am trying to run this SLAM with the drcsim Atlas robot, spawned in a Gazebo VRC task environment.

Now, by editing the demo.launch file of the package, I can make the stereo_slam package subscribe to the topics which provide the stereo pair of images. The edited launch file is:

    <launch>

<arg name="bagfile" default="/tmp/ROS.bag"/>

<!-- Setup the camera -->
<arg name="camera" default="/multisense_sl/camera/"/>
<param name="/use_sim_time" value="true"/>

<!-- Play the bagfile -->
<node pkg="rosbag" type="play" name="rosbag" args="--clock $(arg bagfile)"/>

<!-- Run the stereo image proc -->
<group ns="$(arg camera)">
    <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc" />
</group>

<!-- Run the visual odometry: You can choose between viso2 and fovis -->
<node pkg="viso2_ros" type="stereo_odometer" name="stereo_odometer">
    <remap from="stereo" to="$(arg camera)"/>
    <remap from="image" to="image_color"/>
    <param name="base_link_frame_id" value="/pelvis"/>
</node>
<!--node pkg="fovis_ros" type="stereo_odometer" name="fovis">
    <remap from="stereo" to="$(arg camera)"/>
    <remap from="image" to="image_color"/>
    <param name="base_link_frame_id" value="/pelvis"/>
</node-->

<!-- Stereo SLAM -->
<node pkg="stereo_slam" type="localization" name="slam" output="screen">

    <!-- Working directory -->
    <param name="work_dir" value="/home/tonystark/freelancing/Atlas_slam/slam/" />

    <!-- Topic parameters -->
    <param name="odom_topic" value="stereo_odometer/odometry"/>
    <param name="left_topic" value="$(arg camera)/left/image_color"/>
    <param name="right_topic" value="$(arg camera)/right/image_color"/>
    <param name="left_info_topic" value="$(arg camera)/left/camera_info"/>
    <param name="right_info_topic" value="$(arg camera)/right/camera_info"/>

    <!-- Min displacement between graph vertices (in meters) -->
    <param name="min_displacement" value="0.25"/>

    <!-- Descriptor type: SIFT or SURF -->
    <param name="desc_type" value="SIFT"/>

    <!-- Can be "CROSSCHECK" or "RATIO" -->
    <param name="desc_matching_type" value="CROSSCHECK"/>

    <!-- Descriptor threshold for crosscheck matching (typically between 0.7-0.9) or ratio for ratio matching (typically between 0.6-0.8).  -->
    <param name="desc_thresh_ratio" value="0.88" />

    <!-- Minimum number of neighbors that will be skipped for the loop closure (typically between 5-20, but depends on the frame rate) -->
    <param name="min_neighbor" value="10" />

    <!-- Get the n first candidates of the hash matching (typically between 1-5) -->
    <param name="n_candidates" value="5" />

    <!-- Minimum number of descriptor matches to consider a matching as possible loop closure (>8) -->
    <param name="min_matches" value="30" />

    <!-- Minimum number of inliers to consider a matching as possible loop closure (>8) -->
    <param name="min_inliers" value="15" />

    <!-- Set to 0 for LinearSlam Solver with gauss-newton. Set to 1 for LinearSlam Solver with Levenberg -->
    <param name="g2o_algorithm" value="0" />

    <!-- Maximum number of iteration for the g2o algorithm -->
    <param name="g2o_opt_max_iter" value="10" />
</node>

<!-- Pointcloud collector -->
<node pkg="stereo_slam" type="collector" name="collector" output="screen">

    <param name="work_dir" value="/home/tonystark/freelancing/Atlas_slam/slam/" />

    <param name="graph_topic" value="slam/graph"/>
    <param name="cloud_topic" value="$(arg camera)/scaled/points2"/>

    <rosparam file="$(find stereo_slam)/etc/reconstruction.yaml" />

</node>
</launch>

When executing

    roslaunch stereo_slam demo.launch

I get the following error

    error loading <rosparam> tag: 
file does not exist [/home/tonystark/lib_haloc/src/stereo_slam/etc/reconstruction.yaml]
    XML is <rosparam file="$(find stereo_slam)/etc/reconstruction.yaml"/>
    The traceback for the exception was written to the log file

Upon searching, I found that neither the etc/ directory nor the file reconstruction.yaml existed. Has anybody made use of this package? If yes, can you please tell me how to solve this? Is it a file and folder I should create myself? Thanks in advance.

Which camera?

It would be great to indicate in the documentation which devices are compatible.
Do I build my own rig with two webcams, or do I need to purchase a Kinect or something?

Collector node

Hello! Thanks for the code.

Has the collector node that is launched by the demo been deprecated?

If so, are there any point cloud collectors/publishers that you might recommend?

Impossible to compile in Indigo

Would it be possible to upload a complete and working Indigo package? I have tried making a lot of changes and correcting the errors I found while compiling one by one, but it is not possible to fix them all.

My brain is going to explode.

Thank you in advance.

loop closing build error

I tried to build stereo_slam from the kinetic branch and got the following error:

error: ‘class haloc::Hash’ has no member named ‘isInitialized’
if (!hash_.isInitialized()
It seems that this function belongs to the indigo version of libhaloc, not the kinetic version.

Camera Calibration

This is a great package, thanks for publishing your code! I'm having a little bit of trouble with camera calibration though. Right now I am calibrating the camera in the water (on board the ROV) using the ROS camera calibration system. It is still providing very poor results; do you have a recommended method for properly calibrating a stereo array to use with this package? Thanks again.

Handling pose jumps from loop closures

Hey guys,

If one uses the corrected odometry output for vehicle control, how would the pose output behave on loop closures? I guess there would be a jump if the graph correction is done starting from the previous nodes?

Or is the correction applied from the front nodes, so that only the older graph points are affected and there is no odometry jump?

Kabir
