metatron's People

Contributors

5yler, aberkowitz, cjdesno, spohorec

metatron's Issues

Integrate ZED with hardware and navigation launch files

From the ZED ROS wrapper wiki page:

The ZED Stereo Camera is a lightweight depth camera based on passive stereo vision. It outputs up to 2208x1242 high-resolution stereo video on USB 3.0, and can work in high-speed mode in VGA at 100 FPS. The ZED SDK computes a depth map of the environment on the GPU of the host machine at the frame rate of the camera. It also gives you access to robust odometry that learns its environment, making it able to relocalize itself at any moment.

Tasks

  • add ZED launch file to gigatron_hardware
  • integrate ZED odometry with ekf_map.launch
  • create launch file for rtabmap_ros RGB-D SLAM
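
A minimal sketch of what the gigatron_hardware ZED launch file might look like, assuming the zed_wrapper package provides a zed.launch to include (the included file name and topic defaults are assumptions):

    <launch>
      <!-- bring up the ZED camera driver -->
      <include file="$(find zed_wrapper)/launch/zed.launch" />
    </launch>

If ekf_map.launch uses ekf_localization_node, the ZED visual odometry topic would then be added there as an additional odomN input.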

Add BSD license to ROS packages and source code

Software License Agreement (BSD License)

Copyright (c) 2017, Cult Classic Racing. All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

  1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

  2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

  3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Create urdf.xacro file for Gigatron with race body

The current gigatron.urdf.xacro has sensor mounting positions and car dimensions specific to the test hardware configuration, without the race body.

Tasks

  • modify imu_link with final IMU mount location for race
  • modify laser_link with final RPLIDAR mount location for race
  • add back_laser_link if using a second LIDAR mounted on the rear of the car

Create individual car component URDF macros

The end result will be that the car .urdf.xacro files are less cluttered, and the sensor placement is easier to modify.

Tasks

  • zed.urdf.xacro
  • bno055_imu.urdf.xacro
  • rplidar.urdf.xacro
  • xvlidar.urdf.xacro
  • wheel.urdf.xacro
  • base.urdf.xacro
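
A per-sensor macro might look like the following sketch for rplidar.urdf.xacro (macro name, parameters, and link names are assumptions; visual, collision, and inertial elements omitted):

    <robot xmlns:xacro="http://www.ros.org/wiki/xacro">
      <!-- hypothetical macro: attaches laser_link to a parent link at a configurable pose -->
      <xacro:macro name="rplidar" params="parent x y z yaw">
        <link name="laser_link"/>
        <joint name="laser_joint" type="fixed">
          <parent link="${parent}"/>
          <child link="laser_link"/>
          <origin xyz="${x} ${y} ${z}" rpy="0 0 ${yaw}"/>
        </joint>
      </xacro:macro>
    </robot>

The car-level .urdf.xacro files would then just include these files and instantiate the macros with the mounting offsets, keeping sensor placement changes localized.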

Implement curb detection

The New York Maker Faire track might not have solid barriers all the way around. Possible solutions:

  • downward-tilted LIDAR
  • ZED stereo camera

The ZED could be integrated with depthimage_to_laserscan or pointcloud_to_laserscan. The latter is less efficient but seems to allow conversion between frames, which would allow greater flexibility in sensor mounting.

Tasks

  • update gigatron.urdf.xacro to include:
    • ZED
    • upside-down XV LIDAR on front of car
    • IMU mount location
  • update gigatron.launch to apply appropriate laser angle filters
  • rewrite pointcloud_to_scan.launch to take max, min height parameters, output scan topic name, and output frame_id
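
A sketch of the parameterized pointcloud_to_scan.launch, assuming the pointcloud_to_laserscan package (the input cloud topic and default values are assumptions):

    <launch>
      <arg name="min_height"   default="0.05"/>
      <arg name="max_height"   default="0.5"/>
      <arg name="scan_topic"   default="zed_scan"/>
      <arg name="target_frame" default="base_link"/>

      <node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node" name="pointcloud_to_laserscan">
        <!-- remap to whatever point cloud topic the ZED wrapper publishes -->
        <remap from="cloud_in" to="/zed/point_cloud/cloud_registered"/>
        <remap from="scan" to="$(arg scan_topic)"/>
        <param name="target_frame" value="$(arg target_frame)"/>
        <param name="min_height" value="$(arg min_height)"/>
        <param name="max_height" value="$(arg max_height)"/>
      </node>
    </launch>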

Create new state message type with estop field

Keep gigatron/State.msg for rosbag compatibility:

std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
string mode
gigatron/Drive drive
  float64 angle
  float64 vel_left
  float64 vel_right

Tasks

  • define gigatron/ExtendedState.msg with an additional bool estop field
  • replace publishing State.msg with ExtendedState.msg in arduino_drive_controller (but keep the same topic name)
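
A sketch of the new message definition, mirroring State.msg with the extra field:

    # gigatron/ExtendedState.msg
    std_msgs/Header header
    string mode
    gigatron/Drive drive
    bool estop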

Enable LIDARs on Jetson TX1

This isn't really a repo issue, but after recompiling the kernel with USB-serial drivers enabled and flashing the TX1, all the launch file logic disabling the LIDARs for the TX1 needs to be removed.

Modify drive message format and visualization marker display

The output of drive is a gigatron/Drive message. Switch to a stamped DriveStamped message so that commands generated from bagged data can be visualized.

Currently, arduino_drive_controller treats semiautomatic mode and autonomous mode as equivalent.

Tasks

  • define new message type
    • change output message format in drive script
    • change Drive callback in arduino_drive_controller
  • modify arduino_drive_controller so that all three modes are visually distinct
  • propagate timestamps from messages received in arduino_drive_controller
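
A sketch of the stamped message, wrapping the existing gigatron/Drive:

    # gigatron/DriveStamped.msg
    std_msgs/Header header
    gigatron/Drive drive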

Fix drive script

Currently throws a lot of NaN outputs for steering angle and wheel velocities. It also does not account for LIDAR orientation.

Tasks

  • fix NaN outputs
  • remove laser scan parameter hardcoding
  • adapt drive to account for LIDAR yaw angle
    • optionally with tf listener
    • alternatively with parameter for constant offset angle
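
A minimal sketch of the tf listener option, shown in C++ (the equivalent lookup exists in the Python tf API if drive remains a Python script); frame names follow the URDF links used elsewhere in this repo:

    #include <ros/ros.h>
    #include <tf/transform_listener.h>

    // Look up the fixed yaw of laser_link relative to base_link once at startup,
    // so scan angles can be rotated into the vehicle frame before computing steering.
    double lookupLaserYaw()
    {
      tf::TransformListener listener;
      tf::StampedTransform transform;
      try {
        listener.waitForTransform("base_link", "laser_link", ros::Time(0), ros::Duration(2.0));
        listener.lookupTransform("base_link", "laser_link", ros::Time(0), transform);
      } catch (const tf::TransformException& ex) {
        ROS_WARN("Falling back to zero laser yaw offset: %s", ex.what());
        return 0.0;
      }
      return tf::getYaw(transform.getRotation());
    }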

Create static_transform_publishers from base_link to sensor links

Include in sensor launch files, for example in imu.launch:

<node pkg="tf" type="static_transform_publisher" name="imu_link_tf_broadcaster" args="0 0 0 0 0 0 map nav 100"/> <!-- x y z yaw pitch roll frame_id child_frame_id period_in_ms -->

We need:

  • base_link to imu_link
  • base_link to laser_link
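
For example (node names and the all-zero offsets are placeholders for the measured mounting positions):

    <node pkg="tf" type="static_transform_publisher" name="base_to_imu_tf"
          args="0 0 0 0 0 0 base_link imu_link 100"/>
    <node pkg="tf" type="static_transform_publisher" name="base_to_laser_tf"
          args="0 0 0 0 0 0 base_link laser_link 100"/>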

Also make sure that the frame_id of the sensor matches the one we use in the launch file by running the sensor, echoing the sensor stream topic and checking the frame_id of the messages.

Fix arduino_drive_controller steering sign errors

Autonomously commanding a positive steering angle makes the car turn right, and a negative steering angle makes the car turn left.

By the right-hand rule, the yaw angle increases counterclockwise, so both the command input and sensor reading output need to have their sign swapped.

Use EKF for filtered odometry

Branched off from Issue #9. I think that in order to use ekf_localization_node in the odom frame, we would have to disable the odom->base_link transform broadcast by odom_tf_publisher, which currently both:

  • publishes to the odom topic
  • broadcasts the transform odom->base_link

See Using robot_localization with amcl (ROS Answers)

Tasks

  • add parameter to odom_tf_publisher to disable broadcasting odom->base_link transform
  • run ekf_odom.launch with modified odom_tf_publisher and collect rosbag data
  • compare accuracy of odom->base_link transform with previous results
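
A sketch of how the launch entry could look once the parameter exists (the package, node type, and parameter name here are hypothetical):

    <node pkg="gigatron" type="odom_tf_publisher" name="odom_tf_publisher">
      <!-- hypothetical flag: keep publishing the odom topic, but let
           ekf_localization_node own the odom->base_link transform -->
      <param name="broadcast_tf" value="false"/>
    </node>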

Integrate GPS with navigation stack

navsat_transform_node takes as input a nav_msgs/Odometry message (usually the output of ekf_localization_node or ukf_localization_node), a sensor_msgs/Imu containing an accurate estimate of your robot's heading, and a sensor_msgs/NavSatFix message containing GPS data. It produces an odometry message in coordinates that are consistent with your robot's world frame. This value can be directly fused into your state estimate.

Tasks

  • set up static_transform_publisher from map to utm frame
  • integrate navsat_transform_node with navigation launch files
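
A sketch of the navsat_transform_node entry (topic remappings are assumptions and must match the actual IMU, GPS, and EKF output topics on the car):

    <node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform_node">
      <param name="zero_altitude" value="true"/>
      <remap from="imu/data" to="/imu/data"/>
      <remap from="gps/fix" to="/gps/fix"/>
      <remap from="odometry/filtered" to="/odometry/filtered"/>
    </node>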

Avoid driving straight into walls

Currently, the reactive steering algorithm in drive only outputs a nonzero steering angle when there are more obstacles on one side of the car. If the car is driving straight into a wall, it won't stop.

Add some logic to check whether the car is approaching a flat, wall-like obstacle ahead and, if so, slow down and turn.

Tasks

  • test RPLIDAR-based estop
  • implement laser filter with narrower angle for estop
  • modify gigabug code to allow estop testing in SEMIAUTOMATIC mode
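
A sketch of a narrower-angle filter chain for the estop scan, using the laser_filters scan_to_scan_filter_chain and the LaserScanAngularBoundsFilter discussed further below (the roughly +/- 15 degree cone and topic names are placeholders to be tuned):

    <node pkg="laser_filters" type="scan_to_scan_filter_chain" name="estop_scan_filter">
      <rosparam>
        scan_filter_chain:
          - name: estop_angle_filter
            type: laser_filters/LaserScanAngularBoundsFilter
            params:
              lower_angle: -0.26
              upper_angle: 0.26
      </rosparam>
      <remap from="scan" to="/scan"/>
      <remap from="scan_filtered" to="/scan_estop"/>
    </node>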

Account for motor RPM limits when publishing Arduino commands

Right now, the arduino_drive_controller node directly converts desired wheel velocities into motor RPM without accounting for motor RPM limits:

      cmd_msg_.rpm_left = msg->vel_left / (_rpm_to_vel * _gear_ratio);
      cmd_msg_.rpm_right = msg->vel_right / (_rpm_to_vel * _gear_ratio);

Tasks

  • determine maximum motor RPM
    • Kilotron
    • Gigatron
  • save maximum RPM values into gigatron_hardware/config/<CAR>_platform.yaml as a new max_motor_rpm parameter
  • modify arduino_drive_controller to read in the new parameter
  • add logic to clip the commands at the motor RPM range limits
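
A sketch of the clipping step, assuming the max RPM value has been read into a max_motor_rpm_ member as described in the tasks above (member and helper names are assumptions):

    #include <algorithm>

    // Clamp a commanded motor RPM to the platform limit from <CAR>_platform.yaml.
    static double clampRpm(double rpm, double max_motor_rpm)
    {
      return std::max(-max_motor_rpm, std::min(rpm, max_motor_rpm));
    }

    // in the Drive callback:
    // cmd_msg_.rpm_left  = clampRpm(msg->vel_left  / (_rpm_to_vel * _gear_ratio), max_motor_rpm_);
    // cmd_msg_.rpm_right = clampRpm(msg->vel_right / (_rpm_to_vel * _gear_ratio), max_motor_rpm_);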

Tune reactive controller

Current reactive controller behaves like a drunk driver and is a hazard to self and society at large. Needs some adjustments.

Create new LaserScan angle filter for inverted lasers

The laser_filters package LaserScanAngularBoundsFilter implementation allows filtering between a lower_angle and upper_angle value, but assumes lower_angle < upper_angle.

This doesn't work with the RPLIDAR if it is rotated 180 degrees, since angle_min = -pi and angle_max = pi, and the points we want to keep for the forward-facing 180 degrees lie in the two intervals [-pi, -pi/2] and [pi/2, pi].

Tasks

  • implement a new scan filter that throws out the middle points rather than the two side intervals
  • replace launch occurrences of the LaserScanAngularBoundsFilter with the new LaserScanAngleFilter implementation as needed
    • XV LIDAR on the front of Gigatron
    • RPLIDAR on the front of Gigatron (depending on final mounting configuration)
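
A sketch of the core of the new filter's update step (plugin boilerplate omitted; because the two kept intervals are discontiguous, the middle points are invalidated rather than the scan being trimmed):

    #include <limits>
    #include <sensor_msgs/LaserScan.h>

    // Drop the points between lower_angle and upper_angle and keep the two outer
    // intervals -- the opposite of LaserScanAngularBoundsFilter.
    bool filterOutMiddle(const sensor_msgs::LaserScan& input, sensor_msgs::LaserScan& output,
                         double lower_angle, double upper_angle)
    {
      output = input;
      double angle = input.angle_min;
      for (size_t i = 0; i < output.ranges.size(); ++i, angle += input.angle_increment) {
        if (angle > lower_angle && angle < upper_angle) {
          output.ranges[i] = std::numeric_limits<float>::quiet_NaN();
        }
      }
      return true;
    }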

Calibrate IMU on both cars

See the Calibration section of the README.md in rtimulib_ros:

Calibration

The calibration needs to be performed by the RTIMULibCal utility provided by the library.
In the case of a multi-device configuration, RTIMULibCal can be launched with an argument to change the name of the calibration file.
Example:

   $ RTIMULibCal toto

Will produce the toto.ini file in the current directory.

Then, the calibration .ini file needs to be placed in the config directory of the package.

If the calibration file has a custom name, it must be specified with the calibration_file_name parameter.

We probably want to save the .ini files as gigatron_hardware/config/<CAR_NAME>/imu_calibration.ini or something similar.

Tasks

  • calibrate IMU
    • Gigatron
      • accelerometer
      • magnetometer
    • Kilotron
      • accelerometer
      • magnetometer
        • maybe someone without a shitton of metal on both wrists?
  • check if calibration file can be loaded from custom path (i.e. from inside gigatron_hardware) using the calibration_file_path parameter
  • commit calibration results
    • Gigatron
    • Kilotron
  • update imu.launch to account for new calibration files on both cars

Update hare teleoperation script

Adapt the hare keyboard teleoperation script to publish gigatron/Drive messages instead of geometry_msgs/Vector3. The purpose is to allow testing motor control in autonomous mode without the confounding variable of the autonomous navigation code.
