
moveit_calibration's Introduction

MoveIt Calibration

Tools for robot arm hand-eye calibration.

Warning to Melodic users
OpenCV 3.2, which is the version in Ubuntu 18.04, has a buggy ArUco board pose detector. Do not expect adequate results if you are using an ArUco board with OpenCV 3.2.

MoveIt Calibration supports ArUco boards and ChArUco boards as calibration targets. Experiments have demonstrated that a ChArUco board gives more accurate results, so it is recommended.

This repository has been developed and tested on ROS Melodic and Noetic. It has not been tested on earlier ROS versions. When building moveit_calibration on ROS Melodic, rviz_visual_tools must also be built from source.

This package was originally developed by Dr. Yu Yan at Intel, and was originally submitted as a PR to the core MoveIt repository. For background, see this GitHub discussion.

GitHub Actions - Continuous Integration

(Format, BuildAndTest, and codecov status badges)

moveit_calibration's People

Contributors

brinij, christian-rauch, davetcoleman, jakubach, jstech, mikeferguson, rhaschke, roboticsyy, samarth-robo, stephanie-eng, tylerjw, valeriomagnago


moveit_calibration's Issues

Changing the image topic sometimes crashes Rviz

Changing the image topic sometimes crashes Rviz. We could reproduce it a few times by saving the Rviz config with the plugin open and the image topics set, then reopening Rviz, closing the plugin, opening the plugin and setting the image topic. The error message refers to a mutex lock. I don't know if it is related to this code or an issue with Rviz.

Automatic calibration function does not work reliably

While using the Auto Calibration function, I've noticed that the Plan and Execute buttons sometimes do nothing at all. The buttons also always appear to be active, even though they only start to work after manually capturing at least five samples. However, even then the functionality occasionally fails. I haven't been able to use the calibration tool the way it is used in this video: https://www.youtube.com/watch?v=VUZYQ23Kfas

Getting the current implementation to work reliably would be a good first step, but I think the calibration workflow would be improved if the user could just show the target to the camera once and have a list of poses generated based on that. Ideally, the user would be able to scroll through that list, see a preview of each goal pose, adjust them, and possibly remove some of them from the list. Executing would then drive the arm to each goal pose, capture an image, and perform calibration based on that data once all the poses have been reached.
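As a rough illustration of that idea, here is a minimal sketch of generating candidate camera poses from a single target detection. It assumes the detected target pose is available as an Eigen::Isometry3d in the robot base frame; the function name, radius, tilt, and viewpoint count are illustrative and not part of the plugin.

// Hypothetical sketch: generate goal poses on a cone of viewpoints around a
// detected target, each oriented so the camera z-axis looks back at the target.
#include <Eigen/Geometry>
#include <cmath>
#include <vector>

std::vector<Eigen::Isometry3d> generateGoalPoses(const Eigen::Isometry3d& target_pose,
                                                 double radius = 0.4, int count = 10)
{
  std::vector<Eigen::Isometry3d> goals;
  for (int i = 0; i < count; ++i)
  {
    double yaw = 2.0 * M_PI * i / count;
    double tilt = 0.35;  // ~20 degrees off the target normal
    Eigen::Vector3d offset(radius * std::sin(tilt) * std::cos(yaw),
                           radius * std::sin(tilt) * std::sin(yaw),
                           radius * std::cos(tilt));
    Eigen::Vector3d position = target_pose.translation() + offset;

    // Build a rotation whose z-axis points from the viewpoint to the target center.
    Eigen::Vector3d z_axis = (target_pose.translation() - position).normalized();
    Eigen::Vector3d x_axis = z_axis.unitOrthogonal();
    Eigen::Vector3d y_axis = z_axis.cross(x_axis);
    Eigen::Matrix3d rotation;
    rotation.col(0) = x_axis;
    rotation.col(1) = y_axis;
    rotation.col(2) = z_axis;

    Eigen::Isometry3d goal = Eigen::Isometry3d::Identity();
    goal.linear() = rotation;
    goal.translation() = position;
    goals.push_back(goal);
  }
  return goals;
}

Each generated pose could then be previewed, edited, or removed in the GUI before execution, as proposed above.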

Feature request: automated calibration

A common use-case is to repeat a calibration, maybe because an end-effector was replaced, or the camera was bumped, or just to maintain accuracy over time. To support this, it would be helpful to be able to save the full MoveIt Calibration configuration to a single yaml file, containing target parameters, frame selections, joint states, etc., which could then be loaded and re-run by a single command (ideally without even launching RViz).

This was also suggested in #25.

Crashing when inserting values with "," instead of "." in measured marker size and measured separation

The default values are shown with a "," though, so this can be pretty misleading. The error message I get when I use ",":

@HandEyeArucoTarget::setTargetDimension @128] [1615452051.080090049]: Invalid target measured dimensions: marker_size 0,000000, marker_seperation 0,000000
OpenCV Error: Assertion failed (markersX > 0 && markersY > 0 && markerLength > 0 && markerSeparation > 0) in create, file /home/yy/git/opencv-install/opencv_contrib/modules/aruco/src/aruco.cpp, line 1646
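For illustration only (not the plugin's actual parsing code), one plausible mechanism for ending up with zero: standard C++ numeric parsing in the default "C" locale stops at a comma, which would match the 0,000000 values in the error above.

// Minimal, self-contained illustration of comma vs. period parsing in the "C" locale.
#include <iostream>
#include <string>

int main()
{
  std::cout << std::stod("0.0256") << "\n";  // prints 0.0256
  std::cout << std::stod("0,0256") << "\n";  // prints 0 -- parsing stops at the comma
  return 0;
}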

Build Issue

Thanks for your repo. I'm a beginner with ROS, and when I use catkin_make I get errors that confuse me.
Could anyone help me with this? Thanks in advance!

nvidia@miivii-tegra:~/brickbot_ws$ catkin build
bash: catkin: cannot find command
nvidia@miivii-tegra:~/brickbot_ws$ catkin_make
Base path: /home/nvidia/brickbot_ws
Source space: /home/nvidia/brickbot_ws/src
Build space: /home/nvidia/brickbot_ws/build
Devel space: /home/nvidia/brickbot_ws/devel
Install space: /home/nvidia/brickbot_ws/install

Running command: "cmake /home/nvidia/brickbot_ws/src -DCATKIN_DEVEL_PREFIX=/home/nvidia/brickbot_ws/devel -DCMAKE_INSTALL_PREFIX=/home/nvidia/brickbot_ws/install -G Unix Makefiles" in "/home/nvidia/brickbot_ws/build"

-- Using CATKIN_DEVEL_PREFIX: /home/nvidia/brickbot_ws/devel
-- Using CMAKE_PREFIX_PATH: /home/nvidia/brickbot_ws/devel;/opt/ros/melodic
-- This workspace overlays: /home/nvidia/brickbot_ws/devel;/opt/ros/melodic
-- Found PythonInterp: /usr/bin/python2 (found suitable version "2.7.17", minimum required is "2")
-- Using PYTHON_EXECUTABLE: /usr/bin/python2
-- Using Debian Python package layout
-- Using empy: /usr/bin/empy
-- Using CATKIN_ENABLE_TESTING: ON
-- Call enable_testing()
-- Using CATKIN_TEST_RESULTS_DIR: /home/nvidia/brickbot_ws/build/test_results
-- Found gtest sources under '/usr/src/googletest': gtests will be built
-- Found gmock sources under '/usr/src/googletest': gmock will be built
-- Found PythonInterp: /usr/bin/python2 (found version "2.7.17")
-- Using Python nosetests: /usr/bin/nosetests-2.7
-- catkin 0.7.29
-- BUILD_SHARED_LIBS is on
-- BUILD_SHARED_LIBS is on
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- ~~ traversing 23 packages in topological order:
-- ~~ - brickbot_moveit_config
-- ~~ - robot_control
-- ~~ - ros_numpy
-- ~~ - depth_for_brickbot
-- ~~ - moveit_calibration_plugins
-- ~~ - brickbot_description
-- ~~ - moveit_calibration_gui
-- ~~ - zed_ar_track_alvar_example
-- ~~ - zed_depth_sub_tutorial
-- ~~ - zed_display_rviz
-- ~~ - zed_examples (metapackage)
-- ~~ - zed_interfaces
-- ~~ - zed_multicamera_example
-- ~~ - zed_nodelet_example
-- ~~ - zed_nodelets
-- ~~ - zed_obj_det_sub_tutorial
-- ~~ - zed_ros (metapackage)
-- ~~ - zed_rtabmap_example
-- ~~ - zed_sensors_sub_tutorial
-- ~~ - zed_sync_test
-- ~~ - zed_tracking_sub_tutorial
-- ~~ - zed_video_sub_tutorial
-- ~~ - zed_wrapper
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- +++ processing catkin package: 'brickbot_moveit_config'
-- ==> add_subdirectory(brickbot_moveit_config)
-- +++ processing catkin package: 'robot_control'
-- ==> add_subdirectory(robot_control)
-- +++ processing catkin package: 'ros_numpy'
-- ==> add_subdirectory(ros_numpy)
-- +++ processing catkin package: 'depth_for_brickbot'
-- ==> add_subdirectory(zed-ros-wrapper/depth_for_brickbot)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- depth_for_brickbot: 1 messages, 0 services
-- +++ processing catkin package: 'moveit_calibration_plugins'
-- ==> add_subdirectory(moveit_calibration/moveit_calibration_plugins)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- Found OpenCV: /usr (found version "3.2.0") found components: aruco
-- Found PythonLibs: /usr/lib/aarch64-linux-gnu/libpython2.7.so (found suitable version "2.7.17", minimum required is "2.7")
-- Found NumPy: /usr/lib/python2.7/dist-packages/numpy/core/include (found suitable version "1.13.3", minimum required is "1.7")
-- NumPy ver. 1.13.3 found (include: /usr/lib/python2.7/dist-packages/numpy/core/include)
-- +++ processing catkin package: 'brickbot_description'
-- ==> add_subdirectory(brickbot_description)
-- +++ processing catkin package: 'moveit_calibration_gui'
-- ==> add_subdirectory(moveit_calibration/moveit_calibration_gui)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- Found OpenCV: /usr (found version "3.2.0")
-- +++ processing catkin package: 'zed_ar_track_alvar_example'
-- ==> add_subdirectory(zed-ros-examples/examples/zed_ar_track_alvar_example)
-- +++ processing catkin package: 'zed_depth_sub_tutorial'
-- ==> add_subdirectory(zed-ros-examples/tutorials/zed_depth_sub_tutorial)
-- +++ processing catkin package: 'zed_display_rviz'
-- ==> add_subdirectory(zed-ros-examples/zed_display_rviz)
-- +++ processing catkin metapackage: 'zed_examples'
-- ==> add_subdirectory(zed-ros-examples/zed_examples)
-- +++ processing catkin package: 'zed_interfaces'
-- ==> add_subdirectory(zed-ros-wrapper/zed_interfaces)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- zed_interfaces: 3 messages, 13 services
-- +++ processing catkin package: 'zed_multicamera_example'
-- ==> add_subdirectory(zed-ros-examples/examples/zed_multicamera_example)
-- +++ processing catkin package: 'zed_nodelet_example'
-- ==> add_subdirectory(zed-ros-examples/examples/zed_nodelet_example)
-- +++ processing catkin package: 'zed_nodelets'
-- ==> add_subdirectory(zed-ros-wrapper/zed_nodelets)
-- A library with BLAS API found.
-- Found CUDA: /usr/local/cuda-10.0 (found suitable version "10.0", minimum required is "10")
-- Found TensorRT (found version "5.1")

-- Found CUDA: /usr/local/cuda-10.0 (found version "10.0")
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- +++ processing catkin package: 'zed_obj_det_sub_tutorial'
-- ==> add_subdirectory(zed-ros-examples/tutorials/zed_obj_det_sub_tutorial)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- +++ processing catkin metapackage: 'zed_ros'
-- ==> add_subdirectory(zed-ros-wrapper/zed_ros)
-- +++ processing catkin package: 'zed_rtabmap_example'
-- ==> add_subdirectory(zed-ros-examples/examples/zed_rtabmap_example)
-- +++ processing catkin package: 'zed_sensors_sub_tutorial'
-- ==> add_subdirectory(zed-ros-examples/tutorials/zed_sensors_sub_tutorial)
-- +++ processing catkin package: 'zed_sync_test'
-- ==> add_subdirectory(zed-ros-examples/tests/zed_sync_test)
-- +++ processing catkin package: 'zed_tracking_sub_tutorial'
-- ==> add_subdirectory(zed-ros-examples/tutorials/zed_tracking_sub_tutorial)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- +++ processing catkin package: 'zed_video_sub_tutorial'
-- ==> add_subdirectory(zed-ros-examples/tutorials/zed_video_sub_tutorial)
-- +++ processing catkin package: 'zed_wrapper'
-- ==> add_subdirectory(zed-ros-wrapper/zed_wrapper)
-- Configuring done
-- Generating done
-- Build files have been written to: /home/nvidia/brickbot_ws/build

Running command: "make -j8 -l8" in "/home/nvidia/brickbot_ws/build"

Scanning dependencies of target moveit_handeye_calibration_solver_core
Scanning dependencies of target moveit_handeye_calibration_target_core
[ 0%] Built target std_msgs_generate_messages_eus
[ 0%] Built target std_msgs_generate_messages_py
[ 0%] Built target std_msgs_generate_messages_nodejs
[ 0%] Built target std_msgs_generate_messages_lisp
[ 0%] Built target std_msgs_generate_messages_cpp
[ 0%] Built target sensor_msgs_generate_messages_eus
[ 0%] Built target brickbot_description_xacro_generated_to_devel_space_
[ 0%] Built target _depth_for_brickbot_generate_messages_check_deps_brickbot_msg
[ 2%] Built target zed_depth_sub
[ 2%] Built target geometry_msgs_generate_messages_eus
[ 2%] Built target _zed_interfaces_generate_messages_check_deps_stop_3d_mapping
[ 2%] Built target _zed_interfaces_generate_messages_check_deps_stop_remote_stream
[ 2%] Built target _zed_interfaces_generate_messages_check_deps_set_pose
[ 2%] Built target _zed_interfaces_generate_messages_check_deps_RGBDSensors
[ 2%] Built target _zed_interfaces_generate_messages_check_deps_start_object_detection
[ 2%] Built target _zed_interfaces_generate_messages_check_deps_reset_tracking
[ 3%] Building CXX object moveit_calibration/moveit_calibration_plugins/handeye_calibration_solver/CMakeFiles/moveit_handeye_calibration_solver_core.dir/src/handeye_solver_default.cpp.o
[ 4%] Building CXX object moveit_calibration/moveit_calibration_plugins/handeye_calibration_target/CMakeFiles/moveit_handeye_calibration_target_core.dir/src/handeye_target_aruco.cpp.o
[ 4%] Built target _zed_interfaces_generate_messages_check_deps_stop_object_detection
[ 4%] Building CXX object moveit_calibration/moveit_calibration_plugins/handeye_calibration_target/CMakeFiles/moveit_handeye_calibration_target_core.dir/src/handeye_target_charuco.cpp.o
[ 4%] Built target _zed_interfaces_generate_messages_check_deps_start_svo_recording
[ 4%] Built target _zed_interfaces_generate_messages_check_deps_stop_svo_recording
[ 4%] Built target _zed_interfaces_generate_messages_check_deps_start_remote_stream
[ 4%] Built target _zed_interfaces_generate_messages_check_deps_reset_odometry
[ 4%] Built target _zed_interfaces_generate_messages_check_deps_ObjectStamped
[ 4%] Built target sensor_msgs_generate_messages_lisp
[ 4%] Built target zed_interfaces_generate_messages_check_deps_toggle_led
[ 4%] Built target geometry_msgs_generate_messages_lisp
[ 4%] Built target zed_interfaces_generate_messages_check_deps_set_led_status
[ 4%] Built target zed_interfaces_generate_messages_check_deps_Objects
[ 4%] Built target zed_interfaces_generate_messages_check_deps_start_3d_mapping
[ 4%] Built target sensor_msgs_generate_messages_cpp
[ 4%] Built target geometry_msgs_generate_messages_cpp
[ 4%] Built target geometry_msgs_generate_messages_nodejs
[ 4%] Built target geometry_msgs_generate_messages_py
[ 4%] Built target sensor_msgs_generate_messages_py
[ 4%] Built target sensor_msgs_generate_messages_nodejs
[ 5%] Built target zed_nodelets_gencfg
[ 5%] Built target bond_generate_messages_nodejs
[ 5%] Built target rosgraph_msgs_generate_messages_py
[ 5%] Built target rosgraph_msgs_generate_messages_nodejs
[ 5%] Built target rosgraph_msgs_generate_messages_lisp
[ 5%] Built target rosgraph_msgs_generate_messages_cpp
[ 5%] Built target rosgraph_msgs_generate_messages_eus
[ 5%] Built target roscpp_generate_messages_lisp
[ 5%] Built target roscpp_generate_messages_py
Scanning dependencies of target tf2_msgs_generate_messages_py
Scanning dependencies of target actionlib_msgs_generate_messages_cpp
[ 5%] Built target roscpp_generate_messages_nodejs
[ 5%] Built target roscpp_generate_messages_eus
[ 5%] Built target tf2_msgs_generate_messages_py
[ 5%] Built target roscpp_generate_messages_cpp
Scanning dependencies of target tf2_msgs_generate_messages_nodejs
[ 5%] Built target actionlib_msgs_generate_messages_cpp
Scanning dependencies of target tf2_msgs_generate_messages_eus
Scanning dependencies of target actionlib_generate_messages_nodejs
Scanning dependencies of target tf2_msgs_generate_messages_lisp
[ 5%] Built target tf2_msgs_generate_messages_nodejs
[ 5%] Built target tf2_msgs_generate_messages_eus
Scanning dependencies of target actionlib_generate_messages_py
[ 5%] Built target actionlib_generate_messages_nodejs
[ 5%] Built target tf2_msgs_generate_messages_lisp
Scanning dependencies of target tf2_msgs_generate_messages_cpp
Scanning dependencies of target actionlib_msgs_generate_messages_eus
[ 5%] Built target actionlib_generate_messages_py
Scanning dependencies of target actionlib_msgs_generate_messages_lisp
[ 5%] Built target tf2_msgs_generate_messages_cpp
[ 5%] Built target actionlib_msgs_generate_messages_eus
Scanning dependencies of target actionlib_msgs_generate_messages_nodejs
Scanning dependencies of target actionlib_generate_messages_lisp
[ 5%] Built target actionlib_msgs_generate_messages_lisp
Scanning dependencies of target actionlib_msgs_generate_messages_py
Scanning dependencies of target actionlib_generate_messages_cpp
[ 5%] Built target actionlib_generate_messages_lisp
[ 5%] Built target actionlib_msgs_generate_messages_nodejs
Scanning dependencies of target actionlib_generate_messages_eus
[ 5%] Built target actionlib_msgs_generate_messages_py
[ 5%] Built target actionlib_generate_messages_cpp
Scanning dependencies of target dynamic_reconfigure_generate_messages_lisp
Scanning dependencies of target dynamic_reconfigure_gencfg
[ 5%] Built target actionlib_generate_messages_eus
Scanning dependencies of target dynamic_reconfigure_generate_messages_py
[ 5%] Built target dynamic_reconfigure_gencfg
[ 5%] Built target dynamic_reconfigure_generate_messages_lisp
Scanning dependencies of target dynamic_reconfigure_generate_messages_eus
[ 5%] Built target dynamic_reconfigure_generate_messages_py
Scanning dependencies of target dynamic_reconfigure_generate_messages_cpp
Scanning dependencies of target dynamic_reconfigure_generate_messages_nodejs
[ 5%] Built target dynamic_reconfigure_generate_messages_eus
[ 5%] Built target dynamic_reconfigure_generate_messages_cpp
[ 5%] Built target diagnostic_msgs_generate_messages_py
[ 5%] Built target dynamic_reconfigure_generate_messages_nodejs
[ 5%] Built target stereo_msgs_generate_messages_nodejs
[ 5%] Built target diagnostic_msgs_generate_messages_lisp
[ 5%] Built target nodelet_generate_messages_nodejs
[ 5%] Built target nodelet_generate_messages_cpp
[ 5%] Built target stereo_msgs_generate_messages_cpp
[ 5%] Built target stereo_msgs_generate_messages_py
[ 5%] Built target nodelet_generate_messages_eus
[ 5%] Built target bond_generate_messages_eus
[ 5%] Built target diagnostic_msgs_generate_messages_eus
[ 5%] Built target bond_generate_messages_py
[ 5%] Built target stereo_msgs_generate_messages_eus
[ 5%] Built target diagnostic_msgs_generate_messages_cpp
[ 5%] Built target nodelet_generate_messages_lisp
[ 5%] Built target bond_generate_messages_cpp
[ 5%] Built target bond_generate_messages_lisp
[ 5%] Built target diagnostic_msgs_generate_messages_nodejs
[ 5%] Built target stereo_msgs_generate_messages_lisp
[ 5%] Built target nodelet_generate_messages_py
[ 6%] Built target depth_for_brickbot_generate_messages_nodejs
[ 7%] Built target zed_sensors_sub
[ 10%] Built target zed_tracking_sub
[ 11%] Built target zed_video_sub
[ 12%] Built target depth_for_brickbot_generate_messages_cpp
[ 13%] Built target depth_for_brickbot_generate_messages_py
[ 14%] Built target depth_for_brickbot_generate_messages_lisp
[ 15%] Built target depth_for_brickbot_generate_messages_eus
[ 29%] Built target zed_interfaces_generate_messages_eus
[ 42%] Built target zed_interfaces_generate_messages_lisp
[ 53%] Built target zed_interfaces_generate_messages_cpp
[ 67%] Built target zed_interfaces_generate_messages_py
[ 79%] Built target zed_interfaces_generate_messages_nodejs
[ 79%] Built target depth_for_brickbot_generate_messages
[ 81%] Built target zed_wrapper_node
[ 82%] Built target ZEDSyncTest
[ 82%] Built target zed_interfaces_generate_messages
[ 84%] Built target zed_obj_det_sub
[ 87%] Built target ZEDNodelets
[ 88%] Linking CXX shared library /home/nvidia/brickbot_ws/devel/lib/libmoveit_handeye_calibration_solver_core.so
[ 88%] Built target moveit_handeye_calibration_solver_core
Scanning dependencies of target moveit_handeye_calibration_solver
[ 89%] Building CXX object moveit_calibration/moveit_calibration_plugins/handeye_calibration_solver/CMakeFiles/moveit_handeye_calibration_solver.dir/src/plugin_init.cpp.o
[ 90%] Linking CXX shared library /home/nvidia/brickbot_ws/devel/lib/libmoveit_handeye_calibration_target_core.so
[ 90%] Built target moveit_handeye_calibration_target_core
Scanning dependencies of target moveit_handeye_calibration_target
[ 90%] Building CXX object moveit_calibration/moveit_calibration_plugins/handeye_calibration_target/CMakeFiles/moveit_handeye_calibration_target.dir/src/plugin_init.cpp.o
[ 90%] Linking CXX shared library /home/nvidia/brickbot_ws/devel/lib/libmoveit_handeye_calibration_solver.so
[ 90%] Built target moveit_handeye_calibration_solver
[ 92%] Linking CXX shared library /home/nvidia/brickbot_ws/devel/lib/libmoveit_handeye_calibration_target.so
[ 92%] Built target moveit_handeye_calibration_target
Scanning dependencies of target moveit_handeye_calibration_rviz_plugin_core_autogen
[ 93%] Automatic MOC for target moveit_handeye_calibration_rviz_plugin_core
[ 93%] Built target moveit_handeye_calibration_rviz_plugin_core_autogen
Scanning dependencies of target moveit_handeye_calibration_rviz_plugin_core
[ 94%] Building CXX object moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_calibration_gui.cpp.o
[ 95%] Building CXX object moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_target_widget.cpp.o
[ 95%] Building CXX object moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_context_widget.cpp.o
[ 96%] Building CXX object moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_control_widget.cpp.o
[ 97%] Building CXX object moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/moveit_handeye_calibration_rviz_plugin_core_autogen/mocs_compilation.cpp.o
/home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp: In member function ‘void moveit_rviz_plugin::ContextTabWidget::updateAllMarkers()’:
/home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp:327:16: error: ‘class rviz_visual_tools::TFVisualTools’ has no member named ‘clearAllTransforms’; did you mean ‘publishAllTransforms’?
     tf_tools_->clearAllTransforms();
                ^~~~~~~~~~~~~~~~~~
                publishAllTransforms
/home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp:376:81: error: no matching function for call to ‘moveit_visual_tools::MoveItVisualTools::publishMesh(Eigen::Isometry3d&, shape_msgs::Mesh&, rviz_visual_tools::colors, double, const char [4], int)’
           visual_tools_->publishMesh(fov_pose_, mesh, rvt::YELLOW, 1.0, "fov", 1);
                                                                                 ^
In file included from /home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/include/moveit/handeye_calibration_rviz_plugin/handeye_context_widget.h:58:0,
                 from /home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp:37:
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:829:8: note: candidate: bool rviz_visual_tools::RvizVisualTools::publishMesh(const Isometry3d&, const string&, rviz_visual_tools::colors, double, const string&, std::size_t)
   bool publishMesh(const Eigen::Isometry3d& pose, const std::string& file_name, colors color = CLEAR, double scale = 1,
        ^~~~~~~~~~~
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:829:8: note:   no known conversion for argument 2 from ‘shape_msgs::Mesh {aka shape_msgs::Mesh_<std::allocator<void> >}’ to ‘const string& {aka const std::__cxx11::basic_string<char>&}’
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:831:8: note: candidate: bool rviz_visual_tools::RvizVisualTools::publishMesh(const Pose&, const string&, rviz_visual_tools::colors, double, const string&, std::size_t)
   bool publishMesh(const geometry_msgs::Pose& pose, const std::string& file_name, colors color = CLEAR,
        ^~~~~~~~~~~
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:831:8: note:   no known conversion for argument 1 from ‘Eigen::Isometry3d {aka Eigen::Transform<double, 3, 1>}’ to ‘const Pose& {aka const geometry_msgs::Pose_<std::allocator<void> >&}’
/home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_control_widget.cpp: In member function ‘bool moveit_rviz_plugin::ControlTabWidget::solveCameraRobotPose()’:
/home/nvidia/brickbot_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_control_widget.cpp:399:20: error: ‘class rviz_visual_tools::TFVisualTools’ has no member named ‘clearAllTransforms’; did you mean ‘publishAllTransforms’?
         tf_tools_->clearAllTransforms();
                    ^~~~~~~~~~~~~~~~~~
                    publishAllTransforms
moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/build.make:110: recipe for target 'moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_context_widget.cpp.o' failed
make[2]: *** [moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_context_widget.cpp.o] Error 1
make[2]: *** Waiting for unfinished tasks....
moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/build.make:134: recipe for target 'moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_control_widget.cpp.o' failed
make[2]: *** [moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_control_widget.cpp.o] Error 1
CMakeFiles/Makefile2:5551: recipe for target 'moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/all' failed
make[1]: *** [moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j8 -l8" failed

solution is way off

Setup

I am trying to calibrate in an eye-in-hand setting, where the "hand" is actually the head of a fixed-base robot with a top-mounted camera. I am looking for the extrinsic calibration between the head and the camera mount point.

The kinematic chain is as follows:
base → torso → head → camera_mount → optical-frame

The base → torso → head portion is given by the joint states, and camera_mount → optical-frame is a static transform given by the camera model (an Intel RealSense in this case). I am looking for the transformation head → camera_mount, which I can get via head → optical-frame (minus the known static transform camera_mount → optical-frame).

Edit: From #58 (comment), it is actually possible to find head → camera_mount directly by setting the camera_mount as the "Sensor frame".
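As a side note, the composition described above can be written directly with Eigen. This is a minimal sketch under the assumptions in this issue; the frame naming in the variables is a placeholder, not part of the plugin.

// Sketch: head -> camera_mount = (head -> optical) * (camera_mount -> optical)^-1
#include <Eigen/Geometry>

Eigen::Isometry3d headToCameraMount(const Eigen::Isometry3d& head_T_optical,
                                    const Eigen::Isometry3d& mount_T_optical)
{
  return head_T_optical * mount_T_optical.inverse();
}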

Calibration

This figure shows the setup with an initial guess:
rviz_screenshot_2021_01_18-13_56_06

$ rosrun tf tf_echo HEAD_JOINT1_Link nextagea_realsense_color_optical_frame
At time 1610978311.120
- Translation: [0.021, 0.033, 0.165]
- Rotation: in Quaternion [-0.558, 0.558, -0.435, 0.435]
            in RPY (radian) [-1.817, 0.000, -1.571]
            in RPY (degree) [-104.106, 0.005, -90.001]

After collecting samples and running the solver, the plugin provides the "solution":
rviz_screenshot_2021_01_18-14_01_03

$ rosrun tf tf_echo HEAD_JOINT1_Link nextagea_realsense_color_optical_frame
At time 1610978514.896
- Translation: [-0.256, 0.124, 0.528]
- Rotation: in Quaternion [0.695, -0.424, 0.432, -0.387]
            in RPY (radian) [-1.917, -0.275, -1.290]
            in RPY (degree) [-109.850, -15.774, -73.896]

Is this an issue with the solver or is the result interpreted incorrectly?

I cannot finish the calibration process

On the last tab of the 'HandEyeCalibration' panel ('Calibrate'), I can take four different samples with manual calibration and record those joint states. But when I try to take the fifth sample, the following error message appears: "Solver failed to return a calibration". I don't know the reason for this issue.

Add documentation about how to set up the FOV display in RViz

Our FOV is not displayed. Do we need to add anything to the scene? We tried adding the target_calibration/camera/raw topic as a display to Rviz, but moving its window around caused Rviz to crash.

Copyright Holder?

The LICENSE file assigns the copyright to "ROS Planning" - but that's not a real legal entity to hold the copyright. Most of the files inside are assigned to Intel Corporation - I'd suggest the LICENSE file be changed to match: "Copyright 2019 Intel Corporation". You can also assign future work to the Open Source Robotics Foundation, which is a real legal entity (or a MoveIt foundation someday).

Repeated target initialization

The TargetTabWidget::imageCallback function (in moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_target_widget.cpp) creates a new target object every time it is called. Since the target initialization generates some info messages, this ends up spamming the terminal. I'll have to look into it further, but it seems unnecessary to recreate the target with each image--it's probably sufficient to verify that it has been created, and to only re-initialize if the parameters change.
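A minimal sketch of that caching idea follows. The struct, class, and function names here are hypothetical stand-ins, not the actual widget code; the point is only to re-create the target when the parameters change.

// Hypothetical sketch of caching the target between image callbacks.
#include <memory>

struct TargetParams
{
  int markers_x = 0;
  int markers_y = 0;
  double marker_size = 0.0;
  double marker_separation = 0.0;

  bool operator==(const TargetParams& o) const
  {
    return markers_x == o.markers_x && markers_y == o.markers_y &&
           marker_size == o.marker_size && marker_separation == o.marker_separation;
  }
};

class Target { /* detection state ... */ };

std::unique_ptr<Target> createTarget(const TargetParams&)
{
  return std::make_unique<Target>();  // expensive initialization (and log output) happens here
}

// Called from the image callback: re-create the target only on parameter change.
void ensureTarget(std::unique_ptr<Target>& target, TargetParams& cached,
                  const TargetParams& current)
{
  if (target && current == cached)
    return;  // target already initialized with the current parameters
  target = createTarget(current);
  cached = current;
}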

Issues with multiple copies of panel in RViz

A colleague and I have both made the mistake of "closing" the MoveIt Calibration panel by pressing the X, and then adding a new one from the Panels menu. Since the X doesn't stop the panel, but just hides it, we then had problematic behavior when, for instance, both panels published target detections to /handeye_calibration/target_detection.

To check if this has happened to you, and correct it, go to Panels->Delete Panel and look for multiple HandEyeCalibration entries (and delete them, or all but one):
Menu_030

Also, if you see these errors in the terminal, you've probably added a second HandEyeCalibration panel:

[ERROR] [1594318848.806460152]: Tried to advertise a service that is already advertised in this node 
[/handeye_calibration/target_detection/compressedDepth/set_parameters]
[ERROR] [1594318848.813994890]: Tried to advertise a service that is already advertised in this node 
[/handeye_calibration/target_detection/compressed/set_parameters]
[ERROR] [1594318848.821224191]: Tried to advertise a service that is already advertised in this node 
[/handeye_calibration/target_detection/theora/set_parameters]

Longer-term, we should probably change the behavior when a second panel is added. I'm not familiar enough with RViz to say now what the best approach is, but I can look into it.

Feature: Show recorded samples

There should be a checkbox in the Calibrate panel that displays recorded samples. Using the markers from rviz_visual_tools, a box + cone could represent the camera, and a plane + LabeledAxis the detected marker.

This would be the first step towards showing the user which samples gave a bad result, and which regions (or new camera poses) they should take more samples from.
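A minimal sketch of publishing one labeled axis per recorded sample with rviz_visual_tools is shown below. The variable names are placeholders, the exact signatures may vary with the rviz_visual_tools version, and the box/cone camera representation mentioned above could be layered on later.

// Sketch: show each recorded sample as a labeled axis pair (camera pose, target pose).
#include <rviz_visual_tools/rviz_visual_tools.h>
#include <Eigen/Geometry>
#include <string>
#include <vector>

void showSamples(rviz_visual_tools::RvizVisualTools& visual_tools,
                 const std::vector<Eigen::Isometry3d>& camera_poses,
                 const std::vector<Eigen::Isometry3d>& target_poses)
{
  visual_tools.deleteAllMarkers();
  for (std::size_t i = 0; i < camera_poses.size(); ++i)
  {
    visual_tools.publishAxisLabeled(camera_poses[i], "camera_" + std::to_string(i));
    visual_tools.publishAxisLabeled(target_poses[i], "target_" + std::to_string(i));
  }
  visual_tools.trigger();  // send the batched markers to RViz
}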

I think this comment shows how awkward it is to try to debug calibration in text, and how helpful a screenshot of the setup and the poses would be instead.

Rename "master" to "main"

Since this is being discussed on the main MoveIt repo, I wanted to bring it up here, too. I think whatever is done there should also be mirrored here--no reason for inconsistency amongst MoveIt-branded software.

I'll add my personal opinion: I think it's a small change, both in terms of the effort required and the benefit gained. It won't magically fix the myriad of thorny issues related to inclusion and representation in robotics (at least in the US), but it is a step in the right direction.

transforms are published even if disabled

The plugin seems to publish to /tf even when it is disabled, i.e. when it is unchecked in the Display overview. This is very dangerous, as it will override transforms published by a robot state publisher once RViz starts. I would expect the calibrated transform to only be published once a calibration procedure has finished.

MoveIt Calibration Noetic Release (0.1.0)

moveit.ros.org release documentation

Checklist

  • Update changelogs
  • Run ROS buildfarm prerelease test
  • Bloom
  • Write discourse post

Points for discussion

  1. Since it doesn't look like this has been released yet, you can likely release it for both Melodic and Noetic off the same branch (master). Once you do this, though, on your next release you will likely only be able to release for Noetic, because there is an expectation that released packages for Melodic do not change.

Target display error

This happens when using an odd padding size with a ChArUco target. When the target is saved it's correct, though, so not a huge problem.
target_display

Determine if source build of MoveIt is necessary, adjust CI as appropriate

Currently, the .rosinstall file causes MoveIt to be built from source on CI. If it is necessary to build from source, then CI should be updated to use the MoveIt source build docker image. If not, the .rosinstall entry should be removed.

I'm marking this as a "good first issue" because at least the initial step of trying to build MoveIt Calibration without MoveIt is straightforward:

  1. set up a workspace following the MoveIt "Getting Started" tutorial,
  2. add the MoveIt Calibration source,
  3. rm -rf src/moveit
  4. Try to build, see if it works.

Crash in calibration after image topic is selected

Hello.

I'm trying to calibrate my realsense and UR3 setup using this toolkit.
I have ROS Melodic and Ubuntu 18.04
OpenCV version is 3.4

I have installed UR_ROBOT_DRIVER and compiled the calibration toolbox.
When I run RViz I see the plugin GUI, but it crashes once I pick the camera topic from the drop-down list.
It gives me this error.

[ERROR] [1606993910.391483979]: Invalid target measured dimensions: marker_size 0,000000, marker_seperation 0,000000

I definitely set the numbers in the GUI, but for some reason the backend is not aware of them.

Does anybody else experience this?

Wrong Pose-Estimate

Hello,

I am performing MoveIt calibration using a UR5e robot arm and a RealSense camera. As far as I understand, the calibration publishes a tf frame named handeye_target after estimating the pose of the marker with respect to camera_color_optical_frame. In my case, the estimated pose is wrong (see image). The correct frame is marked in the image, but the tf transformation is wrong. Is anyone facing the same problem, and have you solved it? My ROS_DISTRO is Melodic.

calibration_problem

Camera intrinsics not read correctly from the ros topic (unsupported distortion models)

Ubuntu 18.04, OpenCV 3.2, ROS Melodic.

I was trying to use this plugin to perform hand eye calibration for a Realsense D435 mounted on a UR5e arm, all in simulation in Gazebo 9.0. My calibration pattern was a ChArUco board.

While the markers were correctly detected, I noticed that the interpolated corners and the pose of the board with respect to the camera were always off. This obviously resulted in very bad calibration results. I noticed that when I implemented a ChArUco board detection independent of this plugin (using this reference), the pose of the board was computed correctly.

See the figure below: left is from this plugin, right is from the independent implementation of the exact same ChArUco code.
image

Digging further, I noticed that when I printed the camera_matrix_ from inside handeye_target_charuco.cpp, it was always the identity (the value it is initialized to), but echoing the rostopic showed the correct values.
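For illustration only (not the plugin's code), the intrinsics from a sensor_msgs/CameraInfo message can be copied into a cv::Mat like this, so the matrix is no longer left at its identity initialization; the function name is a placeholder.

// Sketch: build a 3x3 camera matrix from the K field of a CameraInfo message.
#include <opencv2/core.hpp>
#include <sensor_msgs/CameraInfo.h>

cv::Mat cameraMatrixFromInfo(const sensor_msgs::CameraInfo& info)
{
  cv::Mat camera_matrix(3, 3, CV_64F);
  for (int i = 0; i < 9; ++i)
    camera_matrix.at<double>(i / 3, i % 3) = info.K[i];
  return camera_matrix;
}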

load samples and solve manually

The GUI allows exporting samples ("Save samples") to a YAML file, but it does not provide an option to load them again. The use case would be that someone might want to apply the solver again to this set of samples. This would also require that the solver can be called manually; at the moment, the only way to call the solver is by adding more than 4 samples.

Camera calibration without robotic arm

I am currently trying to calibrate a camera without using a robotic arm. I have successfully installed moveit_calibration, and I currently get the camera's position in the world from a Vicon motion capture system.

In RViz, I open HandEyeCalibration. In the Geometric Context tab, for the End-effector frame and Robot base frame, I can only select parts of the robot arm (which I don't have physically).

Is there another way to calibrate a camera with moveit and without using a robot arm?

Separate out functionality in calibration tab

The GUI implementation (currently in #4) combines collecting samples, recording joint states, and executing the solver. All three happen when the "Take Sample" button is pressed, but it would be helpful and more transparent to be able to execute these steps independently. This will require updating the GUI on the calibration tab, and possibly adding another tab. We can discuss here how we want to accomplish this.

Moveit_Calibration Compile Issue

Hi Moveit_Calibration Team,

I am trying to use moveit_calibration for hand-eye calibration, but I am running into some issues during compilation.

I used catkin build to compile this ROS package on ROS Melodic. It shows me the following errors:
error: ‘class rviz_visual_tools::TFVisualTools’ has no member named ‘clearAllTransforms’; did you mean ‘publishAllTransforms’?
error: no matching function for call to ‘moveit_visual_tools::MoveItVisualTools::publishMesh(Eigen::Isometry3d&, shape_msgs::Mesh&, rviz_visual_tools::colors, double, const char [4], int)’

Are there any solutions for these two errors? Also, can this package work on ROS Kinetic?

Thank you so much!

I cannot fill "Object frame" field in "Context" tab

I launch the RViz environment and add the new "HandEyeCalibration" panel. I can successfully create the target in the first tab ("Target"), but when I access the second tab ("Context"), I cannot fill in the "Object frame" field, which appears empty (without any option to select).

Simulation environment

A Gazebo simulation, with rendered images, would help for development and testing of this package. It'd also be helpful so we can provide a tutorial that doesn't depend on having hardware. I've started working on this, but I've not had a lot of time for it and I'm new to Gazebo, so it's slow going so far.

Small bug encountered when using automatic calibration

Hi all,

I find a bug when I use automatic calibration to execute the recorded joint states.
After the execution of the plan is done, it will not automatically take a sample.

I traced the source code of the GUI and found that it results from the if statement shown below.
If execution succeeds, the variable planning_res_ will be equal to ControlTabWidget::SUCCESS, which is equal to 0.
The if statement if (planning_res_), which contains the call to take a sample, will therefore not be entered.
I solved it by changing the statement to if (planning_res_ == ControlTabWidget::SUCCESS).
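A self-contained illustration of the pitfall follows; the enum here is a stand-in (only SUCCESS == 0 matters), not the plugin's actual declaration.

// Demonstrates why `if (planning_res_)` never fires for a SUCCESS value of 0.
#include <iostream>

enum PlanningResult { SUCCESS = 0, FAILURE_NO_JOINT_STATE, FAILURE_INVALID_GOAL };

int main()
{
  PlanningResult planning_res = SUCCESS;

  if (planning_res)             // never true for SUCCESS, since SUCCESS == 0
    std::cout << "sample taken (old check)\n";

  if (planning_res == SUCCESS)  // the fix described in the issue
    std::cout << "sample taken (fixed check)\n";

  return 0;
}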


Sincerely,

Frank

Add capability to refresh the Move Group lists

One could save the parameters by saving the Rviz config file, but when reopening Rviz, the list of Planning Groups is empty. This can be fixed by restarting the plugin, but this resets it fully, discarding the marker parameters and other settings. Can the list of planning groups be reloaded upon clicking on it, maybe?

Create document for what the "Marker Size" means

Is the "Marker Size" in the Target tab referring to the size of one marker or the marker board? If it is possible to add a hover tooltip, this would be great to disambiguate (new users have much simpler questions than this).

ImportError: No module named handeye.calibrator

Hi,

I get the following error when trying to solve for the position of the camera.

RViz gives me the error: "Solver failed to return a calibration."

When I check the terminal, I see this specific error:

[ERROR] [1615289697.307779166]: Failed to load python module: handeye.calibrator
ImportError: No module named handeye.calibrator

I have been looking for this Python package and cannot find it anywhere. Is this a dependency of moveit_calibration, and where do I find it?

Error importing numpy with Python 3

Description

While running the calibration, the plugin reports an error importing numpy:

[ERROR] [1616890018.137078130]: Error importing numpy:

Which refers to the code:

bool HandEyeSolverDefault::solve(const std::vector<Eigen::Isometry3d>& effector_wrt_world,
                                 const std::vector<Eigen::Isometry3d>& object_wrt_sensor, SensorMountType setup,
                                 const std::string& solver_name)
{
  // Check the size of the two sets of pose sample equal
  if (effector_wrt_world.size() != object_wrt_sensor.size())
  {
    ROS_ERROR_STREAM_NAMED(LOGNAME, "The sizes of the two input pose sample "
                                    "vectors are not equal: "
                                    "effector_wrt_world.size() = "
                                        << effector_wrt_world.size()
                                        << " and object_wrt_sensor.size() == " << object_wrt_sensor.size());
    return false;
  }

  auto it = std::find(solver_names_.begin(), solver_names_.end(), solver_name);
  if (it == solver_names_.end())
  {
    ROS_ERROR_STREAM_NAMED(LOGNAME, "Unknown handeye solver name: " << solver_name);
    return false;
  }

  char program_name[7] = "python";
#if PY_MAJOR_VERSION >= 3
  Py_SetProgramName(Py_DecodeLocale(program_name, NULL));
#else
  Py_SetProgramName(program_name);
#endif
  static bool numpy_loaded{ false };
  if (!numpy_loaded)  // Py_Initialize() can only be called once when numpy is
                      // loaded, otherwise will segfault
  {
    Py_Initialize();
    atexit(Py_Finalize);
    numpy_loaded = true;
  }
  ROS_DEBUG_STREAM_NAMED(LOGNAME, "Python C API start");

  // Load numpy c api
  if (_import_array() < 0)
  {
    ROS_ERROR_STREAM_NAMED(LOGNAME, "Error importing numpy: ");
    return false;
  }

I noticed that ROS Noetic has switched to Python 3, and I believe this calibration tool is still using Python 2. Is there any plan to update it? (At least in the CMakeLists.txt, it is still searching for Python 2.)

Your environment

  • ROS Distro: Noetic
  • OS Version: Ubuntu 20.04
  • Commit: 3ffd736

More informative feedback when solver fails

When the solver fails, the error dialog just says, "Solver failed to return a calibration". More descriptive errors are printed in the terminal output, but there's often a lot of irrelevant information there that can make the actual error hard to find. I propose modifying the solver plugin interface to include an error string output, which could then be appended to the error dialog. I think this issue would be appropriate for a first-time contributor. I plan to have the good-first-issue stuff prepared in time for World MoveIt Day.
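One possible shape of that interface change is sketched below. This is a simplified, hypothetical signature and class name, not the current plugin API, which also takes the sensor mount type and solver name as shown elsewhere on this page.

// Hypothetical solver interface: on failure, return false and fill error_message
// so the GUI can append the reason to its error dialog.
#include <Eigen/Geometry>
#include <string>
#include <vector>

class SolverInterface
{
public:
  virtual ~SolverInterface() = default;

  virtual bool solve(const std::vector<Eigen::Isometry3d>& effector_wrt_world,
                     const std::vector<Eigen::Isometry3d>& object_wrt_sensor,
                     std::string& error_message) = 0;
};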

Can't collect samples

I am able to set all the parameters correctly, including:

  • measured marker size
  • measured separation
  • image topic
  • CameraInfo topic
  • sensor frame
  • object frame
  • end-effector frame
  • robot base frame
  • sensor configuration (eye-in-hand)

I can visualize the detections in the image and they make sense.

I have Planning Group set as well as a solver selected.

When I go to the calibrate tab, I see a loading animation for "Recorded joint state progress".

When I press "take sample" nothing happens. Same goes for all the other buttons.

In RViz, I can visualize all the tf frames and they all look correct and are updated as I move my manipulator.

The console output for rviz is this:

[ INFO] [1617728246.009939691]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728248.111243317]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728250.211069565]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728252.310789944]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728254.409946600]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728256.509782447]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728258.610325458]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

[ INFO] [1617728260.709478529]: Set target real dimensions:  
marker_measured_size 0.059000
marker_measured_separation 0.006000

I'm wondering what could be wrong. Does anyone know what I might be missing? Any ideas on how to debug this issue?

Plugin library is not found after building

I just added and built this package, but get this error when I try to add the panel in Rviz:

The class required for this panel, 'moveit_rviz_plugin/HandEyeCalibration', could not be loaded.
Error:
Failed to load library /root/underlay_ws/devel/lib/libmoveit_handeye_calibration_rviz_plugin.so. Make sure that you are calling the PLUGINLIB_EXPORT_CLASS macro in the library code, and that names are consistent between this macro and your XML. Error string: Could not load library (Poco exception = libmoveit_planning_scene_monitor.so.1.0.4: cannot open shared object file: No such file or directory)

In the terminal, almost the same text is displayed, except the first sentence is PluginlibFactory: The plugin for class 'moveit_rviz_plugin/HandEyeCalibration' failed to load.

Has anyone else encountered this? I did not find anything upon cursory googling. Maybe it is a simple setup issue one of you has run into before.

Confused about which sensor frame to select

Hi

I'm trying to do eye-to-hand calibration between a Baxter robot and a Kinect2 camera. However, I can't figure out which frame I need to select as the Sensor frame in the Context tab.

In this tutorial the frame camera_color_optical_frame is selected. Is this a frame created by the calibration plugin itself? I've found a similar frame in my tf_tree called kinect2_rgb_optical_frame, but I can't select that as sensor frame. I think this might be because it doesn't have "camera" in its name, which seems to be required according to this comment.
kinect2_rgb_optical_frame

Hard to always achieve satisfactory precision

Hi,

Context :
Currently working at Niryo, I'm using your handeye calibration to get the transform of our embedded cameras on two URs !
After solving some chessboard miss detection, and set correctly all transforms, I feed 7 samples of our UR10 into Danilidis algorithm. I instantly tried to do a vision pick to check how well the calibration has been ... It was very nice. The overall error was at the most 5 mm which is great knowing that we are 40 cm above the object when taking the picture.
Then, I went to our UR5, launch the ROS stack and took 10 samples from various positions (I tried to "turn around" the chessboard). I was confident, and when I tried, results wasn't so good as it was on the UR10 even if I felt like my sample had more sense. The first time, error was +2cm and after taking new samples, error was ~1cm

Issue Identified:
The Daniilidis algorithm uses algebraic math to find the optimal solution. Nevertheless, the samples may be missing information about some DoF, which leads to a solution that only fits the samples given and can differ from the ground truth. Also, since the AX = XB system uses pairs of samples, the order in which the samples are fed in affects the result. I tested this by shuffling the sample order, and the result can change by up to 2 cm.
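For reference, the pairing sensitivity described above follows from the standard hand-eye formulation (a generic derivation, not anything specific to this plugin). Each pair of samples $(i, j)$ contributes one equation

A_{ij} X = X B_{ij}, \qquad A_{ij} = T_{ee,j}^{-1}\, T_{ee,i}, \qquad B_{ij} = T_{cam,j}\, T_{cam,i}^{-1},

where $T_{ee,k}$ is the end-effector pose in the base frame, $T_{cam,k}$ is the target pose in the camera frame for sample $k$, and $X$ is the unknown hand-eye transform. Because the constraints come only from the relative motions of the chosen pairs, different pairings (and hence different sample orderings) can yield different least-squares solutions when the data is noisy or the rotation axes of the motions are nearly parallel.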

Solution proposal:
As far as I can see, this algorithm can give excellent results with just a few images. Maybe you can give me some hints on how samples should be taken to lead to an optimal solution. For instance, @JStech talks about these DoF constraints in issue #40.

At the end of this issue, we can add all of this information to the wiki or a tutorial, to help users achieve a good calibration!

Cheers 😄

Solution is off in translation along one Axis

Description
Visually, the solutions appear to be correct with regard to orientation. However, they are always off by some translation along the z axis.
In the screenshot below, the camera should be closer to link_5_b along the z axis.

example_

Configuration

  • Ros Melodic
  • OpenCV 3.2 w/Charuco board
  • Realsense d435 (color stream)
  • Yaskawa GP12 Robot Arm
  • Camera is rigidly attached to link_5_b (so it has 5 DOF instead of 6)
  • Samples taken: 15-20

What could potentially be the cause of this?

specify the convention for Euler angles output

I have spent quite a bit of time trying to understand which angle convention is used here. ROS REP-103 defines it as
"Euler angles yaw, pitch, and roll about Z, Y, X axes respectively"

The static transform publisher uses the same convention. But it seems that the code follows Eigen's convention, where the rotation around the X axis is specified first, giving RPY as the rotations around the X, Y, and Z axes, respectively. Plugged directly into further code, this causes confusion and the impression that the calibration is broken.
https://eigen.tuxfamily.org/dox/group__Geometry__Module.html#title20

Since the authors have switched from Euler angles to quaternions, this may not matter as much anymore, but RPY is still published as a comment in the launch file. It would be less confusing if the axis order were specified. Thanks.
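A short illustration of the two conventions (generic Eigen usage, not the plugin's code): eulerAngles(2, 1, 0) yields yaw-pitch-roll about Z, Y, X (the REP-103 / static_transform_publisher order), while eulerAngles(0, 1, 2) applies the X rotation first and gives different numbers for the same rotation.

// Same rotation, two Euler-angle decompositions.
#include <Eigen/Geometry>
#include <iostream>

int main()
{
  Eigen::Matrix3d R = (Eigen::AngleAxisd(0.3, Eigen::Vector3d::UnitZ()) *
                       Eigen::AngleAxisd(0.2, Eigen::Vector3d::UnitY()) *
                       Eigen::AngleAxisd(0.1, Eigen::Vector3d::UnitX())).toRotationMatrix();

  Eigen::Vector3d zyx = R.eulerAngles(2, 1, 0);  // [yaw, pitch, roll] ~ [0.3, 0.2, 0.1]
  Eigen::Vector3d xyz = R.eulerAngles(0, 1, 2);  // X-first decomposition, different values

  std::cout << "ZYX: " << zyx.transpose() << "\n"
            << "XYZ: " << xyz.transpose() << "\n";
  return 0;
}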

Is there a minimum robot DOF for the calibration solver to work?

Hi,
have you ever tried to calibrate a robot that has fewer than 6 DOF with one of the three solvers this package uses?
If yes, which one is best for such a problem?
I have a robot with 4 DOF where pitch and yaw are missing.
Thank you in advance!

catkin_make failed while building.

So I was trying to build and test it on my robot, but I am getting these missing-function errors. I tried downloading it from source, but that didn't work. Any help would be appreciated. Thanks in advance.



[ 95%] Building CXX object moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/moveit_handeye_calibration_rviz_plugin_core_autogen/mocs_compilation.cpp.o
/home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp: In member function ‘void moveit_rviz_plugin::ContextTabWidget::updateAllMarkers()’:
/home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp:327:16: error: ‘class rviz_visual_tools::TFVisualTools’ has no member named ‘clearAllTransforms’; did you mean ‘publishAllTransforms’?
     tf_tools_->clearAllTransforms();
                ^~~~~~~~~~~~~~~~~~
                publishAllTransforms
/home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp:376:81: error: no matching function for call to ‘moveit_visual_tools::MoveItVisualTools::publishMesh(Eigen::Isometry3d&, shape_msgs::Mesh&, rviz_visual_tools::colors, double, const char [4], int)’
           visual_tools_->publishMesh(fov_pose_, mesh, rvt::YELLOW, 1.0, "fov", 1);
                                                                                 ^
In file included from /home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/include/moveit/handeye_calibration_rviz_plugin/handeye_context_widget.h:57:0,
                 from /home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_context_widget.cpp:37:
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:829:8: note: candidate: bool rviz_visual_tools::RvizVisualTools::publishMesh(const Isometry3d&, const string&, rviz_visual_tools::colors, double, const string&, std::size_t)
   bool publishMesh(const Eigen::Isometry3d& pose, const std::string& file_name, colors color = CLEAR, double scale = 1,
        ^~~~~~~~~~~
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:829:8: note:   no known conversion for argument 2 from ‘shape_msgs::Mesh {aka shape_msgs::Mesh_<std::allocator<void> >}’ to ‘const string& {aka const std::__cxx11::basic_string<char>&}’
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:831:8: note: candidate: bool rviz_visual_tools::RvizVisualTools::publishMesh(const Pose&, const string&, rviz_visual_tools::colors, double, const string&, std::size_t)
   bool publishMesh(const geometry_msgs::Pose& pose, const std::string& file_name, colors color = CLEAR,
        ^~~~~~~~~~~
/opt/ros/melodic/include/rviz_visual_tools/rviz_visual_tools.h:831:8: note:   no known conversion for argument 1 from ‘Eigen::Isometry3d {aka Eigen::Transform<double, 3, 1>}’ to ‘const Pose& {aka const geometry_msgs::Pose_<std::allocator<void> >&}’
/home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_control_widget.cpp: In member function ‘bool moveit_rviz_plugin::ControlTabWidget::solveCameraRobotPose()’:
/home/sachin/Desktop/luxonis/ros_ws/src/moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/src/handeye_control_widget.cpp:395:20: error: ‘class rviz_visual_tools::TFVisualTools’ has no member named ‘clearAllTransforms’; did you mean ‘publishAllTransforms’?
         tf_tools_->clearAllTransforms();
                    ^~~~~~~~~~~~~~~~~~
                    publishAllTransforms
moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/build.make:110: recipe for target 'moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_context_widget.cpp.o' failed
make[2]: *** [moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_context_widget.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/build.make:134: recipe for target 'moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_control_widget.cpp.o' failed
make[2]: *** [moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/src/handeye_control_widget.cpp.o] Error 1
CMakeFiles/Makefile2:4946: recipe for target 'moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/all' failed
make[1]: *** [moveit_calibration/moveit_calibration_gui/handeye_calibration_rviz_plugin/CMakeFiles/moveit_handeye_calibration_rviz_plugin_core.dir/all] Error 2

Error: Invalid target measured dimensions: marker_size 0,000000, marker_seperation 0,000000

Hi

I'm trying to do eye-to-hand calibration with a Baxter robot and a Kinect2 camera.

This is what my setup looks like:
setup_baxter_kinect2

I printed the default Aruco board and filled in the measured marker size and separation in the UI. Then, as soon as I select my Image Topic: /kinect2/hd/image_color, RViz freezes and this error message appears in the terminal:
Error: Invalid target measured dimensions: marker_size 0,000000, marker_seperation 0,000000

Here's a screenshot of the situation:
invalid_target_measured_dimension

Does anyone know what's going wrong?

I'm using Ubuntu 18.04 and ROS melodic.
