
move_basic's Issues

feature-nav parameter mismatch

I'm writing this as a separate issue because it is clearer than just adding comments to #76.

In the feature-nav branch there are some parameter mismatches that I've noticed:

  • the parameter that used to be robot_front_length is now forward_obstacle_threshold, but the old one is still exposed in collision_checker.cpp; similarly, robot_width is now min_side_dist
  • where has robot_back_length gone? It's exposed in collision_checker.cpp but not in move_basic.cpp

These are just the parameters I've noticed so far. I think what happened is that move_basic was heavily modified without considering how collision_checker would be affected, and now we have a slight parameter mess. I think we also have to look at the collision checker and clean this up before we merge into master.

This heavily modified move_basic works very nicely, but since so many user-exposed params have changed, maybe we should think about this becoming a new version of move_basic (e.g. move_basic2).

move_basic does not respect max_angular_velocity parameter

ROS: kinetic

After move_basic is launched, the param max_angular_velocity is confirmed to be set to 0.01 rad/s:

ubuntu@housie ~ $ rosparam get /move_basic/max_angular_velocity
0.009999999776482582

but when the robot is moving, I'm seeing big jumps in cmd_vel/angular/z, from 0.5 to -0.5.
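Whatever the underlying cause, the fix presumably needs to guarantee that every command published on cmd_vel is clamped to the configured maximum. A minimal sketch of such a clamp (the function name is illustrative, not from move_basic's source):

```cpp
#include <algorithm>
#include <cmath>

// Clamp a commanded angular velocity to the configured maximum while
// preserving its sign, so max_angular_velocity = 0.01 can never let a
// 0.5 rad/s command through. Illustrative sketch, not move_basic code.
double clampAngularVelocity(double commanded, double maxAngularVelocity) {
    return std::copysign(std::min(std::fabs(commanded), maxAngularVelocity),
                         commanded);
}
```

If move_basic computes its rotational command from the remaining angle in several places, a clamp like this would have to be applied after every such computation, not only in the turn-in-place phase.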

Spurious rotation

When a goal has a small linear displacement, move_basic rotates to face it and travels that small distance before doing the final rotation. This is confusing to the user when a pure rotation was expected.

Robot does not brake in time to stop at the point

We have a robot using the move_basic package. The problem is that the robot tries to stop but overshoots the point (the traveled distance always ends up past it). It's because of the PID controller we use: it smooths our velocity, and the package assumes the robot has stopped when it is actually still moving. move_basic doesn't use real data (from odometry), so that's the problem. Please help us with this issue.
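One way to account for a smoothed velocity profile, sketched here from the reporter's description (the names and the constant-deceleration model are assumptions, not move_basic's actual controller), is to begin braking once the remaining distance falls below the kinematic stopping distance v^2 / (2a):

```cpp
#include <cmath>

// Returns true when braking must begin: the distance needed to stop at
// deceleration maxDecel has reached the distance remaining to the goal.
// Constant-deceleration model; a sketch, not move_basic's controller.
bool shouldStartBraking(double distRemaining, double velocity,
                        double maxDecel) {
    double stoppingDistance = (velocity * velocity) / (2.0 * maxDecel);
    return distRemaining <= stoppingDistance;
}
```

Feeding measured odometry velocity into `velocity`, rather than the commanded one, also addresses the second half of the report: the robot is judged stopped only when odometry says so.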

Status info

So there seems to be a lot of info that move_basic is aware of during goal movement but that cannot be communicated back to the action client. There is technically a string in GoalStatus, but there doesn't seem to be a way to access it, at least in the Python implementation.

I would propose creating an extra topic where some of this data could be published, such as:

  • current action being taken (rotating, linear move)
  • whether we are waiting for an obstacle
  • distance remaining to goal
  • etc.

That could then be displayed as relevant info in Breadcrumb and EZmap, since it's sometimes not clear why the robot is standing still near a goal that's close to a wall or something that is unintentionally triggering the obstacle wait.

Handle Lidar data with ObstacleDetector class

Currently, obstacles are only detected via Lidar when moving forwards. The ObstacleDetector estimates distances to obstacles when rotating and moving backwards too. Handling Lidar and Range data in the same class will give a better OO structure.

Laser frame hardcoded in move_basic

On my bot vac, which is getting LDS scan data, my obstacle distance is fixed at 9.9.

I'm getting this warning:

[ WARN] [1529711837.494609404]: "laser" passed to lookupTransform argument source_frame does not exist.

Additional Data:

$ rostopic info /scan
Type: sensor_msgs/LaserScan

Publishers:

Subscribers:

alan@anfrosbase:~$ rosmsg show sensor_msgs/LaserScan
std_msgs/Header header
uint32 seq
time stamp
string frame_id
float32 angle_min
float32 angle_max
float32 angle_increment
float32 time_increment
float32 scan_time
float32 range_min
float32 range_max
float32[] ranges
float32[] intensities

alan@anfrosbase:~$ rostopic info /obstacle_distance
Type: std_msgs/Float32

Publishers:

Subscribers: None


data: 9.89999961853

My guess is I am missing a transform from base_link or base_laser_link, but I don't know.

Suggest we decouple direct collision feature from move_basic

In general it may be time to have move_basic use sonar data only for modifications to path planning, and not try to outsmart the 'danger of collision' function required on most systems. The text below was a post to our Slack channel, but I am keeping it here in case this becomes prioritized as an actionable issue.

It is time to re-think and remove all the nasty sonar baggage from move_basic as we form a 'move basic' that is a more sharable component.
The thought: have a node that listens to sonars and limit switches, makes high-level decisions, and perhaps even broadcasts very basic costmap data on some topic.
move_basic, or whatever it becomes, such as 'move_smart', gets decoupled from emergency-stop type activities. It may still need sonars for route-plan changes, but just don't use sonars in move_xxx for full-stop stuff.
Here is an interesting approach to consider. I have felt that we use move_basic as a semi-smart driving system that always accepts commands from 'above', where above is aisle mode, EZmap, or Breadcrumb (so far). We should NOT have the sonar logic that stops move_basic be within move_basic; it makes it not at all portable (for code and for the higher-level apps that use it). What to do about collision is higher-level app logic.
THEREFORE: I suggest we as a group consider placing any logic that combines sonar data into decisions that would need to abort move_basic into a separate node.
The collision node deals with looking at sonars, bumper switches, and perhaps other proximity sensors, and decides when it is time to 'alert' higher-level code through a topic to be determined.
At this time I made sonar_detector, as you are aware. I am not suggesting that exactly; I am suggesting decoupling the nasty collision stuff, sort of like the purpose of sonar_detector. See its readme.
Perhaps we need some node like that, one that deals with sonars and makes decisions suitable for our main apps like EZmap, Breadcrumb, and Polaris. We give that node the job of notifying whatever higher-level app that 'we gonna hit sumpin boss'.
Then we make the higher-level boss tell move_basic to shut up.
It is also possible to have this node be listened to by move_basic itself for extra safety. We could choose a well-known topic, such as the system_control topic invented for sonar_detector and now used by ubiquity_motor. Anybody can listen for basic 'Stop NOW or you will be sorry' messages.
sonar_detector is raw basic stuff, so I am suggesting we think out a spec for something that fills that need but fits our 3 top apps.
This is the sort of thing that will also de-clutter move_basic and allow us to make move_basic more general for our different apps.
WHAT SAY YE???
If we get a quorum, then what I have just said and your other requirements go into a new issue under move_basic, perhaps. This will be lost in a month or so, so I am just posting here to see if others feel this is OK or not.
PS: I realize it is now Saturday, but what @vid said caused other thoughts I have been having on sonars in move_basic to sort of 'gel into a requirements definition' of sorts.

Going backwards doesn't work

@jrlandau reported that goals behind the robot don't work.

Question: should it handle this as a special case and actually move backwards, or do the arctan planner thing and rotate to face the goal and drive towards it? If it moves backwards, then a front-facing LIDAR does not provide obstacle detection.

Definition of done:
Backwards goals result in behavior that is considered reasonable by consensus.

Oddly consistent offset

Another thing seen by a client: in some cases the robot seems to follow goals with a constant offset of sorts. It's likely related to lidar position calibration, but it could also be related to not fetching the robot's position often enough, the transform request failing, or something similar. Again, perhaps worth a closer look.

distRemaining is constant

The problem is that distRemaining is always constant because it is assigned from the "linear" variable. The solution is to change line 738 from
double distRemaining = sqrt(linear.x() * linear.x() + linear.y() * linear.y());
to
double distRemaining = sqrt(remaining.x() * remaining.x() + remaining.y() * remaining.y());

The solution is tested and works fully.

Visualize robot's path in Rviz

It would be nice to have a visualization of the robot's path.
Currently we publish our planned path (in move_basic) to the /plan topic, which draws a straight line between the current position and the goal.
Now that we are implementing smooth driving, we should visualize the trajectory in the same manner.
This will be more feasible when #53 closes.

Do a release

One benefit is it would enable us to document the node in the conventional style on the ROS wiki.

Unit tests for Queueing behavior

We need unit tests for the behavior introduced in #56

These should test the QueuedActionServer to make sure that goals are processed the way that we expect them to be.

Release a ROS melodic version on package manager

Can this package be released for ROS melodic?

I have cloned and built this against ROS melodic on Ubuntu 18.04.5 LTS and it appears to work. I don't know the steps to get this released into the package manager though.

Thanks!

Abort if progress not being made towards goal

One way to implement this is to store the timestamp of when we last decreased the distance to goal. If that timestamp is older than some threshold, then abort the goal.

Another way: we could also store the shortest distance we have been to the goal, and if we go over some threshold from that we abort that goal.

I am not sure which way makes more sense.

This may also solve #37 at the same time.
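The two approaches could also be combined in one small state machine; a sketch along those lines (all names and thresholds are illustrative, not from move_basic):

```cpp
// Track both the closest distance achieved and the time of the last
// improvement; abort when either no progress has been made for longer
// than `timeout` seconds, or we have regressed more than
// `regressThreshold` meters past our best distance.
struct ProgressMonitor {
    double bestDistance;
    double lastProgressTime;

    void reset(double distance, double now) {
        bestDistance = distance;
        lastProgressTime = now;
    }

    // Returns true if the goal should be aborted.
    bool shouldAbort(double distance, double now,
                     double timeout, double regressThreshold) {
        if (distance < bestDistance) {
            bestDistance = distance;
            lastProgressTime = now;
        }
        return (now - lastProgressTime > timeout) ||
               (distance - bestDistance > regressThreshold);
    }
};
```

Calling reset() on each new goal and shouldAbort() on each control cycle covers both proposals with one piece of state.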

Wrong goal preempting

According to design spec of QueuedActionServer:
If another goal is received, add it to the queue. If the queue is already full, then set the current goal to preempted, start executing the next one in the queue, and add the new one to the queue.

What actually happens is that the new goal preempts the second goal in the queue and is executed after the first one finishes.

Rotation-only goals ignored

So it would seem that sending simple or action-client goals without a translational change results in the goal being completely disregarded.

For example, sending a simple goal:

[ INFO] [1623088413.784146154, 48.450000000]: MoveBasic: Received simple goal
[ INFO] [1623088413.869282423, 48.500000000]: MoveBasic: Received goal 0.000000 0.000000 180.000000 base_link
[ INFO] [1623088413.869342759, 48.500000000]: Planning in goal frame: base_link

[ INFO] [1623088413.869401961, 48.500000000]: MoveBasic: Goal in base_link  0.000000 0.000000 180.000000
[ INFO] [1623088413.869471346, 48.500000000]: MoveBasic: Goal in base_link  0.000000 0.000000 180.000000

The base proceeds to do nothing. The threshold for movement seems to be around 4 cm on both x and y, so if we send a goal at 0.04 0.04 180 it does actually work. Anyhow, pretty annoying, and I'm almost sure this used to work at some point.

The tf frame doesn't seem to matter; it also ignores goals in map and odom if the translation sent matches the robot's current position. Steps to reproduce: load up rviz and send a manual goal within an 8x8 cm area around the robot center.
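The expected behavior is presumably that a goal whose translation is within the linear tolerance skips the drive phases but still performs the requested rotation. A sketch of that decision (enum and function names are assumptions, not move_basic's code):

```cpp
#include <cmath>

enum class GoalPlan { RotateOnly, DriveThenRotate };

// Decide the plan for a goal displacement (dx, dy) in the robot frame.
// A goal inside linearTolerance should not be discarded; it should
// still execute the final rotation. Illustrative sketch only.
GoalPlan planForGoal(double dx, double dy, double linearTolerance) {
    return (std::hypot(dx, dy) < linearTolerance)
               ? GoalPlan::RotateOnly
               : GoalPlan::DriveThenRotate;
}
```

With a 4 cm tolerance, this would classify the reported 0 0 180 goal as RotateOnly instead of a no-op.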

Suggest we use laser_filters instead of scan filtering in obstacle_points.cpp

There is a lot going on in obstacle_points.cpp to filter the LiDAR scan down to the actual obstacle points.

I would suggest using the laser_filters ROS package, which supports Kinetic, Melodic, and Noetic. More here: http://wiki.ros.org/laser_filters

The only thing that bothers me is that this would mean move_basic is no longer a standalone package but depends on another ROS package.

What do you think @mjstn, @JanezCim, @MoffKalast, @rohbotics ?

External force during goal following

Currently, if the robot is moving towards the goal and is moved or hit by some other dynamic object, it does not update its pose relative to the goal and therefore misses it.

PROPOSED SOLUTION:
Keep track of goal_frame~base_frame transform during movement and update it accordingly if needed.

max_lateral_rotation and max_angular_velocity parameters not clear and undocumented

Found out that the parameter max_angular_velocity is the maximum angular velocity while the robot is turning on the spot, while the parameter max_lateral_rotation controls the maximum angular velocity while driving towards the goal. This answers the question, but maybe the names of the parameters are a bit unintuitive? The ROS wiki page also doesn't even mention the param max_lateral_rotation.

suggestion 1: new names: max_turning_velocity, max_lateral_velocity
suggestion 2: set max_lateral_velocity = max_turning_velocity as the default with
nh.param<double>("max_lateral_velocity", maxLateralVelocity, maxTurningVelocity);
as this would still leave the user control over both, but the default parameters would be a bit easier to understand.

If we don't rename the parameters, we should at least update the documentation on the ROS wiki page (we have to do that anyway).

Aborting a goal due to moving away from goal needs to be ROS option enabled

We recently went through a great deal of debug time tracking down why Aisle mode was aborting after turning around at the end of an aisle. We discovered it was due to 'Moving away from goal'.
Aisle mode has its own logic above move_basic to deal with aborting a goal; too many cooks spoil the broth.

Aisle mode will need that sort of abort to be disabled, and I suggest a ROS option.

Robot driving past goals into infinity

In a nutshell: if the robot misses a goal, it doesn't stop but attempts to drive ahead and loop around to find the goal. That's obviously problematic in almost all cases.

A suggested fix would be to constantly calculate a safety dot product between the robot's forward vector and the vector from the robot center to the goal, then stop driving if the result is <= 0 and distance to goal is more than 0.3 m or something. That would prevent the robot from ever driving away from the goal. As to what to do when that condition is detected, we could just return to the initial rotation stage or cancel the goal completely and resend it.

Removing the low speed requirement for having achieved a goal is probably also a good idea, after all we're there and we need to stop right now regardless of the speed we're going at.

This is probably the highest priority issue along with #80 which is the main but not only cause of it.
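The suggested dot-product check could look like the following sketch (the threshold and all names are illustrative, not from move_basic):

```cpp
#include <cmath>

// True when the goal has fallen behind the robot's heading while the
// robot is still far from it: heading . toGoal <= 0 and the distance
// to goal exceeds minDistance (e.g. 0.3 m). Sketch of the suggestion.
bool drivingPastGoal(double headingX, double headingY,
                     double toGoalX, double toGoalY,
                     double minDistance) {
    double dot = headingX * toGoalX + headingY * toGoalY;
    return dot <= 0.0 && std::hypot(toGoalX, toGoalY) > minDistance;
}
```

When this returns true, the controller could return to the initial rotation stage or cancel and resend the goal, as proposed above.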

Detect odometry problems

move_basic should detect if a robot is not reporting odometry consistent with the way it is being asked to move, and abort with an error message.

DOD: test on a robot with disconnected/cross-connected encoders and verify it detects the odometry problem.
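One possible single-sample consistency check, as a sketch (the thresholds and names are assumptions; a real implementation would require the mismatch to persist for some timeout before aborting):

```cpp
#include <cmath>

// Flag odometry that disagrees with the command: wrong sign (e.g.
// cross-connected encoders) or a far smaller magnitude than commanded
// (e.g. disconnected encoders). Illustrative sketch only.
bool odometryInconsistent(double commandedLinear, double odomLinear,
                          double minCommand, double agreementRatio) {
    if (std::fabs(commandedLinear) < minCommand)
        return false;  // not commanding significant motion
    return commandedLinear * odomLinear <= 0.0 ||
           std::fabs(odomLinear) <
               agreementRatio * std::fabs(commandedLinear);
}
```

Run against the DOD scenario, disconnected encoders produce odomLinear near zero and cross-connected ones produce the wrong sign, so both cases trip the check.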

Add option to not rotate when the received quaternion is not unit

We've discussed this a fair bit; ultimately it would be of great help in many projects if move_basic could just be told to go to a goal with no rotation specified. The most straightforward way of doing this seems to be sending an invalid rotation, which does intuitively make sense and falls into the realm of expected behaviour.

Optionally we may need to add a parameter to enable this, but given the other changes to move_basic lately we're likely breaking most people's setups anyhow, so it's not a high priority.
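Detecting the "invalid rotation" convention is cheap, since a valid orientation quaternion has unit norm. A sketch of the check (the tolerance and function name are assumptions):

```cpp
#include <cmath>

// A valid orientation quaternion has norm ~1; an all-zero (or otherwise
// non-unit) quaternion in the goal could then mean "keep whatever
// heading you end up with". Sketch of the proposed convention.
bool orientationRequested(double qx, double qy, double qz, double qw,
                          double tolerance) {
    double norm = std::sqrt(qx * qx + qy * qy + qz * qz + qw * qw);
    return std::fabs(norm - 1.0) < tolerance;
}
```

If this returns false for a goal, move_basic would simply skip the final rotation phase.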

RootRework: Rotational Collisions

I suppose this is something that we all know about, but nobody ever brought it up as an actual issue.

Here's a worst case scenario example:

example.mp4

Like, how many times has a Magni scraped a wall or hit something with its rear end when turning? It happens so often that I set up foam bumpers on this one months ago without even thinking about it too much.

When we were only using sonars this wasn't really possible to detect or fix, but now that we have lidar support this should probably be added to prevent "magni in a china shop" situations.

The interesting thing is that it does actually slow down in these cases, which would imply it has detected something, but it doesn't really stop. So you get slow scraping against walls instead of fast scraping, at least 😄.

remaining distance stays the same

Hello, I am trying to use the move_basic package for a simple inertial navigation system. I provide odometry and IMU data to the robot_pose_ekf package and publish my transform message from odom to base_link. I use driving in the odom frame and have tried planning in both the odom and base_link frames, but when I use the odom planning frame my rover does a circle and stops, and when I try base_link the remaining distance stays constant and the rover never stops running. Thank you in advance.

Measured moves in robot commander

Prior to the January release, measured moves (turn right 90 degrees) worked fine for both Loki and Magni, even though no waypoints could be set. Now measured moves only work if aruco_detect is running on Magni, and not at all on Loki.

Something is not working, and I suspect a change to this node.

move_basic Features

This issue names the features included in the nav-features branch that will be merged into kinetic-devel once all the relevant people agree.

Last stable commit was 7148b24c7c5a436e56fcefe9fc344dfd12be1edb on branch nav-features. If any of the features break your code, please refer to this commit and report your problem in this issue.
Changes applied from last stable commit are:

  • Removed velocity gains (latGain, linGain, rotGain)
    REASON: Unintuitive for the end user
    COUNTERWEIGHT: Increased values for both minLinearVelocity & minTurningVelocity so that the robot is not moving
    too slowly, plus retuned PID parameters for lateral correction when moving linearly
  • Added stopCallback
    REASON: Enables us to stop and resume the last navigation task through a topic (no need to cancel all goals and
    resend them). This should be further changed to a ROS service
  • Oscillations
    REASON: When turning on the spot, the robot can oscillate around the final orientation no more than 3 times; this lets
    us miss the final orientation but get a second try (already present in kinetic-devel)
  • Removed velThreshold parameter
    REASON: It doesn't give any significant change; the linearTolerance parameter for checking whether we are close enough
    to a goal is good enough for now
  • False final orientation quaternion
    REASON: Proposed for ease of handling in projects
  • Abort on moving away from goal
    REASON: Abort before any unpredicted behaviour
    WORKING PRINCIPLE: If the robot moves away from the goal (cosine of angleRemaining) for more than the runawayTimeout
    duration, we mark that as moving away from the goal and abort
  • Added minLinearVelocity, minAngularVelocity
    REASON: This way the robot IRL is always able to perform a maneuver and is not stuck by its physical constraints
  • Renamed parameters
    REASON: To be more intuitive for the end user

Linker problem at Intel Euclid / Ubuntu 16.04

Hi,
I'm trying to compile move_basic on Intel Euclid with Ubuntu 16.04 (ros kinetic).
During catkin_make process, I get a linker error:
[ 0%] Linking CXX executable /intel/euclid/euclid_ws/devel/lib/move_basic/move_basic
CMakeFiles/move_basic.dir/src/obstacle_detector.cpp.o: In function `ObstacleDetector::range_callback(boost::shared_ptr<sensor_msgs::Range_<std::allocator<void> > const> const&)':
obstacle_detector.cpp:(.text+0xd1c): undefined reference to `void tf2::fromMsg<geometry_msgs::Point_<std::allocator<void> >, tf2::Vector3>(geometry_msgs::Point_<std::allocator<void> > const&, tf2::Vector3&)'
obstacle_detector.cpp:(.text+0xfa4): undefined reference to `void tf2::fromMsg<geometry_msgs::Vector3_<std::allocator<void> >, tf2::Vector3>(geometry_msgs::Vector3_<std::allocator<void> > const&, tf2::Vector3&)'
obstacle_detector.cpp:(.text+0x102b): undefined reference to `void tf2::fromMsg<geometry_msgs::Vector3_<std::allocator<void> >, tf2::Vector3>(geometry_msgs::Vector3_<std::allocator<void> > const&, tf2::Vector3&)'
CMakeFiles/move_basic.dir/src/obstacle_detector.cpp.o: In function `ObstacleDetector::scan_callback(boost::shared_ptr<sensor_msgs::LaserScan_<std::allocator<void> > const> const&)':
obstacle_detector.cpp:(.text+0x188d): undefined reference to `void tf2::fromMsg<geometry_msgs::Point_<std::allocator<void> >, tf2::Vector3>(geometry_msgs::Point_<std::allocator<void> > const&, tf2::Vector3&)'
obstacle_detector.cpp:(.text+0x1917): undefined reference to `void tf2::fromMsg<geometry_msgs::Vector3_<std::allocator<void> >, tf2::Vector3>(geometry_msgs::Vector3_<std::allocator<void> > const&, tf2::Vector3&)'
collect2: error: ld returned 1 exit status
move_basic/CMakeFiles/move_basic.dir/build.make:348: recipe for target '/intel/euclid/euclid_ws/devel/lib/move_basic/move_basic' failed
make[3]: *** [/intel/euclid/euclid_ws/devel/lib/move_basic/move_basic] Error 1
CMakeFiles/Makefile2:6716: recipe for target 'move_basic/CMakeFiles/move_basic.dir/all' failed
make[2]: *** [move_basic/CMakeFiles/move_basic.dir/all] Error 2
CMakeFiles/Makefile2:6728: recipe for target 'move_basic/CMakeFiles/move_basic.dir/rule' failed
make[1]: *** [move_basic/CMakeFiles/move_basic.dir/rule] Error 2
Makefile:2526: recipe for target 'move_basic' failed
make: *** [move_basic] Error 2
Invoking "make move_basic -j4 -l4" failed

I've installed all tf2-related libs.
Any idea which library is missing on my system?

Thanks in advance
Chrimo

Maintain timestamps for obstacle distances

The forward, left, and right obstacle distances could each have timestamps associated with them, based on the timestamps of the sonar sensors that contributed to them. The relevant code changes would be in ObstacleDetector::obstacle_dist() and MoveBasic::moveLinear(). This would lead to smoother control of steering based on side distances.

More aggressive external force correction

In the stock configuration, it seems the robot doesn't correct its drift quickly enough if the goal is a short distance (<2 m or so) away. There is probably a curve somewhere, or a set of PIDs, that determines how much the robot steers towards the right location when it finds itself off course.

That part seems to need a boost when the calculated offset is small, as it doesn't get corrected enough in the mentioned case.

Terminal status for goals

It seems that occasionally, when using move_basic through the action client, it's possible to get the following result when preempting:

[ WARN] [1613663891.808006341, 985.273000000]: Your executeCallback did not set the goal to a terminal status.
This is a bug in your ActionServer implementation. Fix your code!
For now, the ActionServer will set this goal to aborted

Upon which move_basic sometimes seems to lock up. The solution would be to check in which cases it's possible to end the callback without setting a terminal status on the goal before returning. Found in the nav-features-aislemode branch; possibly a problem in others as well.

Sending goal succeeded when it doesn't seem to have

Something odd reported by a client: they seem to be getting a "MoveBasic: Done linear" in cases where the robot hits the goal, but more often than not the sequence is as follows:

  • nothing gets printed in the console
  • the linear move is done, but not very accurately
  • the final rotation is done
  • the action client receives goal succeeded despite the robot being more than linear_tolerance away from the goal; again, nothing is printed in the console

It may just be the extras added in the runaway-rotation-fix branch they're currently on, but it's worth a closer look as to which code path ends this way.
