pranay-pandey / self-driving-car
Algorithms for an autonomous, track-following car that can detect road signs and navigate to the end of the road.

gazebo-ros object-detection pixycam ros2 yolov5

Introduction

The task is to drive the car around the arena on the laid-out track using only the Pixy camera provided with the car. The track includes passes under and over the overbridge, zig-zag sections, and some speed bumps.

(Figure: Final Task Path on the Final Track)

The complete development was done in a ROS 2 environment with Gazebo simulation on Ubuntu 20.04.


For the installation of ROS 2 and all the required packages, please refer here

The script nxp_track_vision.py extracts two vectors (num_vectors) marking the edge lines of the track, using computer vision on the image captured from the Pixy camera.
The script aim_line_follow.py controls and steers the car along the road using the information published by nxp_track_vision.

Algorithm details:

  1. A variable 'state' checks whether the car is on the overbridge using the following condition, evaluated when state == 0 (the thresholds were experimentally observed):
    (msg.m0_x1 > 26 and msg.m1_x1 < 52 and msg.m0_x0 < msg.m0_x1 and msg.m1_x1 < msg.m1_x0 and (msg.m0_x0 - window_center)*(msg.m1_x0 - window_center) < 0)

  2. For num_vectors == 2, if both track edges are observed on the same side of the center, the car should turn away from that side (this was required in the initial track).

  3. For num_vectors == 2 and state == 1 (on the bridge), the steering angle is decided from the average slope of the two vectors, or from the difference in the y values of the vectors' tails (observed to steer in the correct direction over the bridge).

  4. The state changes back to 0 (meaning 'not on the overbridge') after a fixed interval, measured with datetime.now().timestamp().

  5. Speed is reduced at sharp steers using the exponential formula speed = speed × e^(−a·|steer|).
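The bridge-detection check and the exponential speed reduction described above can be sketched as follows. The thresholds, the window centre, and the decay constant a are placeholders based on the description above, not values taken from the repository.

```python
import math

WINDOW_CENTER = 39  # assumed centre of the image window in vector coordinates
DECAY_A = 1.5       # assumed decay constant 'a' for speed reduction

def on_bridge(msg, state):
    """Step 1: experimentally observed condition for entering the overbridge.

    msg carries the two track-edge vectors (m0, m1), each with head and
    tail x coordinates (m0_x0, m0_x1, m1_x0, m1_x1)."""
    return (state == 0
            and msg.m0_x1 > 26 and msg.m1_x1 < 52
            and msg.m0_x0 < msg.m0_x1 and msg.m1_x1 < msg.m1_x0
            and (msg.m0_x0 - WINDOW_CENTER) * (msg.m1_x0 - WINDOW_CENTER) < 0)

def reduced_speed(speed, steer):
    """Step 5: slow down exponentially at sharp steers:
    speed = speed * e^(-a * |steer|)."""
    return speed * math.exp(-DECAY_A * abs(steer))
```

At steer = 0 the speed is unchanged; the sharper the steer, the stronger the exponential cut.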

Improved steering mechanism of the car -
The steering is directly related to the white-pixel density on the right and left sides of the image window: a higher white-pixel density means that area is on the road. First we use colour segmentation to get a mask image of the road, then take a left slice and a right slice of the masked image window. The number of white pixels in each slice is divided by the total number of pixels in that slice to get the left and right pixel densities, each always between 0 and 1. We publish these values in an encoded message and receive them in the aim_line_follow.py algorithm. We then add the left pixel density to the net steering and subtract the right density from it: if the left density is high the car should move left, so steer goes positive; if the right density is high the car should go right, so steer goes negative. We also apply a PID to the density values so that they don't change too rapidly. This trick ensures that the car takes smooth turns.
self.steer_vector.z = self.steer_vector.z + (pid(L) - pid(R))/2.0
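A minimal sketch of the density computation and the steering update quoted above. The function names and the half-window slice geometry are assumptions for illustration, not the repository's actual code; the PID smoothing step is omitted and the raw densities are passed directly.

```python
import numpy as np

def pixel_densities(mask):
    """Given a binary road mask (0 = background, nonzero = road),
    return (left, right) white-pixel densities, each in [0, 1]."""
    h, w = mask.shape
    left_slice = mask[:, : w // 2]    # assumed slice geometry: left half
    right_slice = mask[:, w // 2 :]   # and right half of the window
    left = np.count_nonzero(left_slice) / left_slice.size
    right = np.count_nonzero(right_slice) / right_slice.size
    return left, right

def steer_update(steer, density_left, density_right):
    """Steering update matching the rule above: add left density,
    subtract right density (scaled by 1/2)."""
    return steer + (density_left - density_right) / 2.0
```

If the road mask lies mostly in the left half, the left density dominates and the steer correction comes out positive, turning the car left as described.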

The odom topic is used to access the car's velocity: going uphill the velocity drops below its commanded value, and going downhill it rises above it. To counter this we use a control loop for the car whenever it reaches the bridge, supplying the velocity as
self.speed_vector.x = self.speed_vector.x + kp*error

where error = self.speed_vector.x − (car velocity as determined by odom), and kp is a constant.
In this way we ensure that the car doesn't go too slow uphill or too fast downhill relative to the required value.
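The speed correction above is a proportional (P) controller on the odometry feedback. As a sketch (the gain kp is an assumed value, not the one used in the project):

```python
def speed_command(commanded, measured, kp=0.5):
    """Proportional speed correction using odometry feedback.

    error = commanded - measured: positive uphill (car is too slow),
    so the command is boosted; negative downhill, so it is cut."""
    error = commanded - measured
    return commanded + kp * error
```

With kp = 0.5, a car commanded to 2.0 but measured at 1.5 uphill gets its command raised to 2.25, while at 2.5 downhill it is cut to 1.75.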

Self_driving_car.mp4

or access a video demonstration of the run here

SDC.mp4

Final track demo

To run it on your own system, first install all the packages using the commands

and then replace the following files

aim_line_follow.py in ros2ws/src/aim_line/follow/aim_line_follow
nxp_track_vision.py in ros2ws/src/nxp_cup_vision/nxp_cup_vision

and then run the package using the command

ros2 launch sim_gazebo_bringup sim_gazebo.launch.py

self-driving-car's People

Contributors: pranay-pandey
