
Comments (7)

mintar commented on July 20, 2024

In order to execute @mintar's suggestion in issue #21 re: rviz, I assume I will have to map out the environment (such as a room) first. Is that right and if so, what is the best way to do that?

You have to map the environment regardless of whether you want to use RViz, the command line or (Python/C++/...) code. The way to do it is to simply use the mapping functionality from the MiR web interface (see the MiR docs). Once you've created and selected a map in the MiR web interface, it will also be visible from RViz.

How can I get the robot to get to a target pose (x, y, theta) from the command line?

rostopic pub /move_base_simple/goal geometry_msgs/PoseStamped 'header: 
  frame_id: "map"
pose: 
  position: 
    x: 11.0602626801
    y: 17.1888198853
  orientation: 
    x: 0.0
    y: 0.0
    z: 0.17364818
    w: 0.98480775'

You can calculate the orientation quaternion using Python. For example, for yaw = 20°:

python -c 'from tf.transformations import quaternion_from_euler; from math import pi; print(quaternion_from_euler(0, 0, 20.0 / 180.0 * pi))'
[ 0.          0.          0.17364818  0.98480775]

This does the same as the RViz solution: it publishes to the /move_base_simple/goal topic. RViz has the advantage that you can actually see the map, so I'd recommend that.

If you later want to do this from a program, don't publish to the topic directly; create a move_base action client instead.
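For example, something like this minimal Python sketch (untested; it uses the standard move_base action and the example goal coordinates from above):

#!/usr/bin/env python
# Minimal move_base action client: send one goal pose and wait for the result.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('send_goal')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 11.0602626801
goal.target_pose.pose.position.y = 17.1888198853
goal.target_pose.pose.orientation.z = 0.17364818  # yaw = 20 degrees
goal.target_pose.pose.orientation.w = 0.98480775

client.send_goal(goal)
client.wait_for_result()

Unlike publishing to /move_base_simple/goal, the action client tells you whether the goal was reached, aborted or preempted.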

Does specifying the target destination invoke the DWB planner?

Only in Gazebo. On the real robot, the mir_bridge simply forwards the message to the ROS software that runs on the internal MiR PC. The MiR PC has a different (proprietary) local planner that works even better than the DWB configuration we have for Gazebo.

The robot's position in real life is consistent with the position returned by /robot_pose but its orientation is not. Any thoughts on how to calibrate it accurately?

Hmm. If you create a map on the MiR, and the robot is properly localized within the map, it should work. Also, I've never used the robot_pose topic. Instead, I use TF:

rosrun tf tf_echo /map /base_link
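If you need that pose in code rather than on the command line, a TF listener does the same thing. Minimal sketch (untested):

#!/usr/bin/env python
# Programmatic equivalent of "rosrun tf tf_echo /map /base_link".
import rospy
import tf

rospy.init_node('pose_lookup')
listener = tf.TransformListener()
listener.waitForTransform('map', 'base_link', rospy.Time(0), rospy.Duration(5.0))
(trans, rot) = listener.lookupTransform('map', 'base_link', rospy.Time(0))
print('position: %s' % (trans,))
print('orientation (quaternion): %s' % (rot,))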


rsr152 commented on July 20, 2024

Thanks @mintar . This is immensely helpful!

It's interesting to note that MiR's proprietary planner works better. It is a bit "unsure" in the missions I run and oscillates a lot to get to the right orientation after reaching a destination (x,y).

Earlier today, I launched mir_driver and simultaneously ran "missions" through MiR's interface. However, I was unable to echo the cmd_vel topic. rostopic echo /cmd_vel did not return a result, a message or even an error. However, I was able to echo robot_pose. I assumed there would be a subscription to the cmd_vel topic. Is that not the case? I appreciate your help.


rsr152 commented on July 20, 2024

Is there a way to use mir_driver to capture the MiR robot's linear and angular velocities (and acceleration) when running "missions" from MiR's web interface, and write them to a .csv or flat file? Attaching a sample of the data fields I would like to capture in real time to analyze offline. Thanks!

[Screenshot: sample of the data fields to capture]


mintar commented on July 20, 2024

It's interesting to note that MiR's proprietary planner works better. It is a bit "unsure" in the missions I run and oscillates a lot to get to the right orientation after reaching a destination (x,y).

That usually happens if the robot somehow ended up a bit sideways (i.e., in "y" direction) of the goal pose. Since the MiR platform cannot drive sideways, this often results in a lot of back-and-forth maneuvering until the goal constraints are satisfied. Sometimes this is caused by localization uncertainties: Just before reaching the goal pose, the localization jumps sideways by 10 cm or so, and then the robot needs to do the back-and-forth. It often helps to make sure that the map is accurate and up-to-date.

Earlier today, I launched mir_driver and simultaneously ran "missions" through MiR's interface. However, I was unable to echo the cmd_vel topic. rostopic echo /cmd_vel did not return a result, a message or even an error. However, I was able to echo robot_pose. I assumed there would be a subscription to the cmd_vel topic. Is that not the case? I appreciate your help.

Ah, right. The cmd_vel topic is only forwarded in one direction by the mir_bridge: from the external PC to the internal MiR PC. That's why you cannot echo messages that originate from the internal PC. The reason why it's only forwarded one way is that I wanted to avoid message loops that would happen if a topic is forwarded in both directions. I've already implemented breaking the loop in one direction, but the other direction is not so easy.

What you can do is subscribe to the MiR roscore directly instead of using the mir_bridge:

export ROS_MASTER_URI=http://192.168.12.20:11311
export ROS_HOSTNAME=<your external PC IP here: 192.168.12.something>
rostopic echo /cmd_vel

Is there a way to use mir_driver to capture the MiR robot's linear and angular velocities (and acceleration) when running "missions" from MiR's web interface, and write them to a .csv or flat file? Attaching a sample of the data fields I would like to capture in real time to analyze offline. Thanks!

If you're on MiR software 1.9.*, do this:

rostopic echo -p --noarr /odom_comb

If you're on MiR software 2.*, do this:

rostopic echo -p --noarr /odom

The topics in mir_bridge are still for MiR software 1.9, so for this second case to work you either have to change this line from odom_comb to odom:

https://github.com/dfki-ric/mir_robot/blob/7682e991e0efb66b1fcfbf0a4c2b42f9a837235a/mir_driver/nodes/mir_bridge.py#L182

... or you have to connect directly to the roscore (see above).

This will output something like this:

1513327471401208443,27451,1513327471401208443,/odom_comb,base_footprint,0.0,0.0,0.0,0.0,0.0,-7.69074383592e-05,0.999999997043,0.0,0.0,0.0,0.0,0.0,0.0
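Since the -p option already produces comma-separated output, writing it to a .csv file is just a matter of redirecting stdout (velocities.csv is only an example filename):

rostopic echo -p --noarr /odom_comb > velocities.csv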

The first field (1513327471401208443) is the message received time in nanoseconds (i.e.,
1513327471.401208443 seconds, or 8:44:31 am UTC on Friday, December 15, 2017:

https://www.wolframalpha.com/input/?i=1513327471.401208443+unix+time

The remaining fields are from nav_msgs/Odometry, except the covariance arrays (because of the --noarr option):

$ rosmsg show nav_msgs/Odometry
std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
string child_frame_id
geometry_msgs/PoseWithCovariance pose
  geometry_msgs/Pose pose
    geometry_msgs/Point position
      float64 x
      float64 y
      float64 z
    geometry_msgs/Quaternion orientation
      float64 x
      float64 y
      float64 z
      float64 w
  float64[36] covariance
geometry_msgs/TwistWithCovariance twist
  geometry_msgs/Twist twist
    geometry_msgs/Vector3 linear
      float64 x
      float64 y
      float64 z
    geometry_msgs/Vector3 angular
      float64 x
      float64 y
      float64 z
  float64[36] covariance

So as you can see, the message has position, orientation, and linear and angular speed. You'll have to calculate acceleration and total distance traveled yourself from these values; that's easy to do in a small ROS node (only about 10 lines of Python, slightly more in C++). Alternatively, you can read the moved field from the mir_status topic (v 1.9) or robot_status (v 2.*) to get the total distance traveled since power-on.
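For illustration, a minimal sketch of such a node (untested; subscribe to odom instead of odom_comb on MiR software 2.*):

#!/usr/bin/env python
# Differentiate the fused odometry to get acceleration; integrate speed for distance.
import rospy
from nav_msgs.msg import Odometry

state = {'t': None, 'v': 0.0, 'dist': 0.0}

def callback(msg):
    t = msg.header.stamp.to_sec()
    v = msg.twist.twist.linear.x
    if state['t'] is not None and t > state['t']:
        dt = t - state['t']
        accel = (v - state['v']) / dt   # linear acceleration [m/s^2]
        state['dist'] += abs(v) * dt    # total distance traveled [m]
        rospy.loginfo('accel: %.3f m/s^2, distance: %.3f m', accel, state['dist'])
    state['t'], state['v'] = t, v

rospy.init_node('odom_analyzer')
rospy.Subscriber('odom_comb', Odometry, callback)
rospy.spin()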

Despite its name, the odom_comb (resp. odom) topic doesn't contain the raw odometry alone, but the fused output of odometry and IMU.


rsr152 commented on July 20, 2024

This is very helpful! Thank you.
I was trying to send the robot to a destination using RViz similar to #21. I followed the steps there and confirmed the settings are correct but the robot does not move. Since #21 is closed, I have posted an updated query at #29. Would appreciate any suggestions. Thank you.


rsr152 commented on July 20, 2024

I am trying to address a couple of issues as I work with our MiR:

  1. I am trying to control the MiR both through REST APIs (for path planning and navigation) and by publishing to the /cmd_vel topic directly.

I have written a function in Python (attached) to run the robot at a specified velocity for a certain distance, but I am confused about where I should place this code in the ecosystem of files within the mir_bridge node. Should it be part of:

  • rosbridge.py in the ~/catkin_ws/src/mir_robot/mir_driver/src/mir_driver directory, or
  • mir_bridge.py in the ~/catkin_ws/src/mir_robot/mir_driver/nodes directory, or
  • something totally different?

  2. I would like to connect both my external PC and the MiR to a different WiFi network. What parts of the code in mir_bridge need to be modified to reflect the new IP address (192.168.xxx.xxx)?

Thank you!

publish_to_cmdvel_code.txt


mintar commented on July 20, 2024

Regarding 1: You shouldn't modify mir_driver at all. You should instead run mir_bridge.py separately and create a new ROS package, with a new ROS node inside where you place your code. See:

http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28python%29
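A minimal version of what you describe could look like this (untested sketch following that tutorial; the speed and distance values are just examples):

#!/usr/bin/env python
# Drive straight at a fixed velocity for a given distance by publishing to cmd_vel.
# Note: open-loop control; it assumes the robot tracks the commanded speed.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('drive_distance')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)

speed = 0.2      # [m/s]
distance = 1.0   # [m]

msg = Twist()
msg.linear.x = speed

rate = rospy.Rate(10)  # publish at a steady rate
end_time = rospy.Time.now() + rospy.Duration(distance / speed)
while not rospy.is_shutdown() and rospy.Time.now() < end_time:
    pub.publish(msg)
    rate.sleep()
pub.publish(Twist())   # zero velocity: stop the robot

Put this node into its own package and run it alongside mir_bridge.py; there's no need to touch the mir_driver sources at all.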

Regarding 2: If the IP address of the MiR PC is different, you can simply set the hostname parameter of mir_bridge, for example by running:

roslaunch mir_driver mir.launch mir_hostname:=192.168.55.55

