5yler / metatron

Autonomous Power Racing Series Racecar - ROS Metapackage
From the ZED ROS wrapper wiki page:

> The ZED Stereo Camera is a lightweight depth camera based on passive stereo vision. It outputs up to 2208x1242 high-resolution stereo video on USB 3.0, and can work in high-speed mode in VGA at 100 FPS. The ZED SDK computes a depth map of the environment on the GPU of the host machine at the frame rate of the camera. It also gives you access to strong odometry that learns its environment, making it able to relocate itself at any moment.
- `gigatron_hardware` `ekf_map.launch`
- `rtabmap_ros` RGB-D SLAM

Create a ROS node to deploy the learned neural net model in a laserscan callback method.
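A minimal sketch of such a node. The `predict()` inference call, the `load`-style preprocessing parameters, and the output topic name are assumptions for illustration, not code from this repo; only the `LaserScan` callback structure and the `Drive` fields (`angle`, `vel_left`, `vel_right`) come from the issues here.

```python
# Sketch: run a learned model inside a LaserScan callback.
# predict() and the topic names are hypothetical placeholders.
import math

def preprocess_scan(ranges, max_range=6.0, n_bins=64):
    """Clean raw scan ranges (inf/nan -> max_range) and downsample them
    into a fixed-size feature vector by taking the min of each bin."""
    clean = [min(r, max_range) if math.isfinite(r) else max_range for r in ranges]
    step = max(1, len(clean) // n_bins)
    return [min(clean[i:i + step]) for i in range(0, step * n_bins, step)]

def main():
    import rospy                    # imported here so the module loads without ROS
    from sensor_msgs.msg import LaserScan
    from gigatron.msg import Drive  # output message type assumed from State.msg

    pub = rospy.Publisher("command/drive", Drive, queue_size=1)  # topic name assumed

    def scan_cb(msg):
        features = preprocess_scan(msg.ranges)
        angle, vel = predict(features)  # hypothetical model inference call
        pub.publish(Drive(angle=angle, vel_left=vel, vel_right=vel))

    rospy.init_node("nn_drive")
    rospy.Subscriber("scan", LaserScan, scan_cb, queue_size=1)
    rospy.spin()

if __name__ == "__main__":
    main()
```

Keeping the preprocessing as a pure function makes it testable off-robot, without a ROS master running.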
Copyright (c) 2017, Cult Classic Racing. All rights reserved.
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
The current `gigatron.urdf.xacro` has sensor mounting and car dimensions specific to the test hardware mounting, without the race body.

- `imu_link` with final IMU mount location for race
- `laser_link` with final RPLIDAR mount location for race
- `back_laser_link` if using a second LIDAR mounted on the rear of the car

The end result will be that the car `.urdf.xacro` files are less cluttered, and the sensor placement is easier to modify.
- `zed.urdf.xacro`
- `bno055_imu.urdf.xacro`
- `rplidar.urdf.xacro`
- `xvlidar.urdf.xacro`
- `wheel.urdf.xacro`
- `base.urdf.xacro`
The New York Maker Faire track might not have solid barriers all around. Possible solutions:

The ZED could be integrated with `depthimage_to_laserscan` or `pointcloud_to_laserscan`. The latter is less efficient, but seems to allow conversion between frames, which would allow greater flexibility in sensor mounting.

- `gigatron.urdf.xacro` to include:
- `gigatron.launch` to apply appropriate laser angle filters
- `pointcloud_to_scan.launch` to take max and min height parameters, output scan topic name, and output `frame_id`
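A possible skeleton for `pointcloud_to_scan.launch`. The `min_height`, `max_height`, and `target_frame` parameters are real `pointcloud_to_laserscan` parameters, but the topic names and default values below are illustrative assumptions, not taken from this repo:

```xml
<launch>
  <arg name="min_height"   default="0.0"/>
  <arg name="max_height"   default="1.0"/>
  <arg name="scan_topic"   default="zed_scan"/>
  <arg name="target_frame" default="laser_link"/>

  <node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node"
        name="pointcloud_to_laserscan">
    <!-- input cloud topic is a placeholder for the ZED wrapper output -->
    <remap from="cloud_in" to="/zed/point_cloud/cloud_registered"/>
    <remap from="scan"     to="$(arg scan_topic)"/>
    <param name="min_height"   value="$(arg min_height)"/>
    <param name="max_height"   value="$(arg max_height)"/>
    <param name="target_frame" value="$(arg target_frame)"/>
  </node>
</launch>
```

Exposing the bounds as `<arg>`s lets per-car launch files include this with different height cutoffs.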
Keep `gigatron/State.msg` for rosbag compatibility:

```
std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
string mode
gigatron/Drive drive
  float64 angle
  float64 vel_left
  float64 vel_right
```
- Add `gigatron/ExtendedState.msg` with an additional `bool estop` field
- Replace `State.msg` with `ExtendedState.msg` in `arduino_drive_controller` (but keep the same topic name)

This isn't really a repo issue, but after recompiling the kernel with USB-serial drivers enabled and flashing the TX1, all the launch file logic disabling the LIDARs for the TX1 needs to be removed.
The output of `drive` is a `gigatron/Drive` message. Switch to `DriveStamped` to allow viewing the commands generated from bagged data.

Currently `arduino_drive_controller` handles semiautomatic mode and autonomous mode as equivalent.
- `drive` script
- `Drive` callback in `arduino_drive_controller`
- `arduino_drive_controller`, so that all three modes are visually distinct

`arduino_drive_controller` currently throws a lot of `nan` outputs for steering angle and wheel velocities. It also does not account for LIDAR orientation.

- Fix `nan` outputs
- Update `drive` to account for LIDAR yaw angle
- Add a `tf` listener

Include in sensor launch files, for example in `imu.launch`:

```xml
<node pkg="tf" type="static_transform_publisher" name="imu_link_tf_broadcaster" args="0 0 0 0 0 0 map nav 100"/> <!-- x y z yaw pitch roll frame_id child_frame_id period_in_ms -->
```
We need:

- `base_link` to `imu_link`
- `base_link` to `laser_link`

Also make sure that the `frame_id` of the sensor matches the one we use in the launch file: run the sensor, echo the sensor stream topic, and check the `frame_id` of the messages.
Autonomously commanding a positive steering angle makes the car turn right, and a negative steering angle makes the car turn left.
By the right-hand rule, the yaw angle increases counterclockwise, so both the command input and sensor reading output need to have their sign swapped.
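The sign swap amounts to a single negation applied symmetrically on the way in and the way out; a trivial sketch (the function name is made up):

```python
def to_rep103_yaw(hardware_angle):
    """Negate a clockwise-positive hardware steering angle to match the
    right-hand-rule (counterclockwise-positive) yaw convention, and back.
    The function is its own inverse, so the same call handles both the
    command path and the sensor-reading path."""
    return -hardware_angle
```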
Branched off from Issue #9. I think that in order to use `ekf_localization_node` in the `odom` frame, we would have to disable the `odom->base_link` transform broadcast by `odom_tf_publisher`, which currently both:

- publishes the `odom` topic
- broadcasts the `odom->base_link` transform

See Using robot_localization with amcl (ROS Answers).

- Modify `odom_tf_publisher` to disable broadcasting the `odom->base_link` transform
- Test `ekf_odom.launch` with the modified `odom_tf_publisher` and collect rosbag data
- Compare the `odom->base_link` transform with previous results

We have `amcl` localization up and running, as well as the `gps_common` package's `utm_odometry_node`, which generates global odometry in the UTM frame based on GPS readings (see `gps.launch`).
EKF could fuse the GPS-based odometry with our encoder odometry. See Adding A GPS Sensor EKF tutorial.
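As a sketch, a `robot_localization` EKF configuration fusing the two sources might look like the fragment below. The topic names and the boolean fusion masks are illustrative assumptions, not values from this repo:

```yaml
# Sketch: ekf_localization_node parameters fusing encoder odometry
# with GPS-based odometry. Topics and masks are placeholders.
frequency: 30
two_d_mode: true            # planar vehicle: ignore z, roll, pitch

odom0: encoder/odom         # wheel-encoder odometry (assumed topic)
odom0_config: [false, false, false,    # x, y, z
               false, false, false,    # roll, pitch, yaw
               true,  false, false,    # vx, vy, vz
               false, false, true,     # vroll, vpitch, vyaw
               false, false, false]    # ax, ay, az

odom1: gps/odom             # utm_odometry_node output (assumed topic)
odom1_config: [true,  true,  false,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]
```

The idea is to fuse absolute x/y only from the GPS-derived odometry and velocities from the encoders, so the encoder drift is bounded by GPS.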
The `move_base` default `base_local_planner` apparently does not work well with non-holonomic robots (i.e. robots that can't move freely in any direction).

Possible alternatives are `sbpl_lattice_planner` or `teb_local_planner`. The former seems deprecated, with no ROS Indigo support.
> `navsat_transform_node` takes as input a `nav_msgs/Odometry` message (usually the output of `ekf_localization_node` or `ukf_localization_node`), a `sensor_msgs/Imu` containing an accurate estimate of your robot's heading, and a `sensor_msgs/NavSatFix` message containing GPS data. It produces an odometry message in coordinates that are consistent with your robot's world frame. This value can be directly fused into your state estimate.
Tasks:

- Add a `static_transform_publisher` from the map to the utm frame
- Launch `navsat_transform_node` with the navigation launch files

I changed the URDF. We should now include these changes in the launch file.
DO THE THING!
Currently, the reactive steering algorithm in `drive` only outputs a nonzero steering angle when there are more obstacles on one side of the car. If the car is driving straight into a wall, it won't stop.

Add some logic to check whether it's approaching a flat/even obstacle ahead, and slow down and turn if that's the case.
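One way such a check could look, as a pure function over the scan's forward sector. The sector width, stop distance, and flatness threshold are made-up placeholder values, and the heuristic itself is only a sketch, not the repo's algorithm:

```python
import math

def wall_ahead(ranges, angles, sector=math.radians(20),
               stop_dist=1.5, flatness=0.15):
    """Return True if a roughly flat obstacle spans the forward sector.

    ranges/angles: per-beam range (m) and bearing (rad), e.g. from a LaserScan.
    'Flat wall' here means every forward beam hits something close, and the
    spread of perpendicular distances (r * cos(theta)) is small."""
    front = [(r, a) for r, a in zip(ranges, angles)
             if abs(a) <= sector and math.isfinite(r)]
    if len(front) < 5:          # not enough valid beams to decide
        return False
    # Distance of each hit along the car's forward axis.
    perp = [r * math.cos(a) for r, a in front]
    close = max(perp) < stop_dist
    flat = (max(perp) - min(perp)) < flatness
    return close and flat
```

Returning a boolean keeps the decision separate from the steering/velocity response, which can then slow down and pick a turn direction.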
Right now, the `arduino_drive_controller` node directly converts desired wheel velocities into motor RPM without accounting for motor RPM limits:

```cpp
cmd_msg_.rpm_left = msg->vel_left / (_rpm_to_vel * _gear_ratio);
cmd_msg_.rpm_right = msg->vel_right / (_rpm_to_vel * _gear_ratio);
```
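Once a limit parameter exists, the conversion can saturate at it. A minimal Python sketch of the logic (names mirror the C++ above; the numbers in the usage note are placeholders, not real platform values):

```python
def vel_to_rpm(vel, rpm_to_vel, gear_ratio, max_motor_rpm):
    """Convert a desired wheel velocity (m/s) to a motor RPM command,
    saturating at the motor's RPM limit in either direction."""
    rpm = vel / (rpm_to_vel * gear_ratio)
    return max(-max_motor_rpm, min(max_motor_rpm, rpm))
```

For example, with `rpm_to_vel * gear_ratio = 0.02` and a 3000 RPM limit, a 100 m/s request clamps to 3000 instead of commanding 5000.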
- Add a new `max_motor_rpm` parameter to `gigatron_hardware/config/<CAR>_platform.yaml`
- Update `arduino_drive_controller` to read in the new parameter

The current reactive controller behaves like a drunk driver and is a hazard to self and society at large. It needs some adjustments.
The `laser_filters` package `LaserScanAngularBoundsFilter` implementation allows filtering between a `lower_angle` and an `upper_angle` value, but assumes `lower_angle < upper_angle`.

This doesn't work with the RPLIDAR if rotated 180 degrees, since `angle_min = -pi` and `angle_max = pi`, and the points we want to keep for the forward-facing 180 degrees lie on the two intervals `[-pi, -pi/2]` and `[pi/2, pi]`.

- Replace `LaserScanAngularBoundsFilter` with the new `LaserScanAngleFilter` implementation as needed
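A sketch of the wrap-around behavior the new filter needs, in Python for clarity (a real `laser_filters` plugin would be C++; beam bearings follow the `LaserScan` convention of `angle_min + i * angle_increment`, and dropped beams are marked NaN, a common filter convention):

```python
import math

def angle_filter(ranges, angle_min, angle_increment, lower_angle, upper_angle):
    """Keep beams whose bearing lies in [lower_angle, upper_angle].

    If lower_angle > upper_angle, the kept interval wraps through +/-pi,
    i.e. beams in [lower_angle, pi] or [-pi, upper_angle] survive -- the
    case LaserScanAngularBoundsFilter cannot express. Dropped beams are
    replaced with NaN so downstream consumers ignore them."""
    out = []
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_increment
        if lower_angle <= upper_angle:
            keep = lower_angle <= a <= upper_angle
        else:  # wrap-around interval through +/-pi
            keep = a >= lower_angle or a <= upper_angle
        out.append(r if keep else float('nan'))
    return out
```

For the rotated RPLIDAR, calling this with `lower_angle = pi/2` and `upper_angle = -pi/2` keeps exactly the forward-facing `[-pi, -pi/2]` and `[pi/2, pi]` intervals.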
Right now they're just in the `msg` directory of `gigatron` and `gigatron_hardware`, but standardizing them into a standalone package would be cleaner.
See the Calibration section of the `README.md` in `rtimulib_ros`:

> **Calibration**
>
> The calibration needs to be performed by the `RTIMULibCal` utility provided by the library. In case of a several-devices configuration, `RTIMULibCal` can be launched with an argument to change the name of the calibration file.
>
> Example: `$ RTIMULibCal toto` will produce the `toto.ini` file in the current directory. Then, the calibration `.ini` file needs to be placed in the `config` directory of the package. If the calibration file has a custom name, it must be specified with the `calibration_file_name` parameter.

We probably want to save the `.ini` files in `gigatron_hardware/config/<CAR_NAME>/imu_calibration.ini` or something like that.
- Point to the calibration files (in `gigatron_hardware`) using the `calibration_file_path` parameter
- Update `imu.launch` to account for new calibration files on both cars

Adapt the `hare` keyboard teleoperation script to publish `gigatron/Drive` messages instead of `geometry_msgs/Vector3`. The purpose of this is to allow testing motor control in autonomous mode without the confounding variable of vehicle autonomy.
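A sketch of what the adapted teleop mapping could look like, assuming the `Drive` fields shown in `State.msg` above (`angle`, `vel_left`, `vel_right`); the key bindings and step sizes are made up for illustration:

```python
def key_to_drive(key, angle_step=0.1, vel_step=0.5, state=None):
    """Map a pressed key to an updated (angle, vel) drive state.
    Bindings are illustrative: w/s change speed, a/d steer, space stops."""
    angle, vel = state or (0.0, 0.0)
    if key == 'w':
        vel += vel_step
    elif key == 's':
        vel -= vel_step
    elif key == 'a':
        angle -= angle_step   # negative angle turns left per the sign convention above
    elif key == 'd':
        angle += angle_step
    elif key == ' ':
        angle, vel = 0.0, 0.0
    return angle, vel

def to_drive_msg(angle, vel):
    """Fill a gigatron/Drive-shaped payload; with rospy this would be
    Drive(angle=angle, vel_left=vel, vel_right=vel)."""
    return {"angle": angle, "vel_left": vel, "vel_right": vel}
```

Publishing both wheel velocities from one keyboard speed keeps the script simple while exercising the same message path the autonomous stack uses.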