georgealexakis / factory_robot

Factory Robot is an implementation of Autonomous Navigated Robot with QR Code Detection and Visual Servoing. The implementation consists of different navigation approaches.

License: MIT License

Languages: CMake 24.79%, C++ 75.21%
Topics: ros, ros-kinetic, cpp, 3d-reconstruction, autonomous-navigation, tag-detection, visual-servoing, mobile-robotics

factory_robot's Introduction

Factory Robot - Autonomous Navigated Robot with QR Code Detection

Factory Robot is an implementation of an Autonomous Navigated Robot with QR Code Detection and Visual Servoing. The implementation includes several navigation approaches. ROS Joystick was used during the implementation as a remote controller.

Table of Contents

Requirements

Getting Started

Execution

Screenshots

Demos

License

Requirements

Below are the software and hardware with which this implementation has been tested.

Software

  • ROS Kinetic
  • Ubuntu 16.04

Hardware

Turtlebot 2 equipped with:

  • Kobuki base
  • Microsoft Kinect V1 Xbox 360
  • RpLidar
  • Generic Router

Getting Started

Download the Source of Project

Navigate to catkin_ws/src

$ cd ~/catkin_ws/src

Get a copy of the source and build it

$ git clone https://github.com/georgealexakis/factory_robot.git (master branch)
$ cd ~/catkin_ws
$ catkin_make

Required Packages Installation

Install the turtlebot, navigation, and 3d reconstruction packages with the commands below:

$ sudo apt-get install ros-kinetic-turtlebot-bringup
$ sudo apt-get install ros-kinetic-turtlebot-navigation
$ sudo apt-get install ros-kinetic-rtabmap-ros

Install the visp packages required for tag detection with the commands below:

$ sudo apt-get install ros-kinetic-visp-auto-tracker
$ sudo apt-get install ros-kinetic-vision-visp (complete stack)
$ sudo apt-get install ros-kinetic-visp (include all visp packages)

Install RGB-D camera sensor (Kinect V1) drivers and package with the commands below:

$ sudo apt-get install libfreenect-dev (Kinect V1 sensor drivers)
$ sudo apt-get install ros-kinetic-freenect-launch

Install the rosbridge package, which enables communication between the robot and a remote controller such as ROS Joystick, with the command below:

$ sudo apt-get install ros-kinetic-rosbridge-server
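rosbridge speaks a JSON protocol over WebSocket: a client such as ROS Joystick first advertises a topic, then publishes messages to it. The sketch below builds those two JSON messages with the Python standard library; the `/cmd_vel` topic and the velocity values are illustrative assumptions, not values fixed by this project.

```python
import json

def advertise(topic, msg_type):
    # "advertise" registers the topic with rosbridge before publishing
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def publish_twist(topic, linear_x, angular_z):
    # geometry_msgs/Twist carries linear and angular velocity commands
    msg = {
        "linear": {"x": linear_x, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": angular_z},
    }
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# A joystick-style client would send these strings over the WebSocket
# connection to ws://<robot>:9090 (rosbridge's default port).
print(advertise("/cmd_vel", "geometry_msgs/Twist"))
print(publish_twist("/cmd_vel", 0.2, 0.0))
```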

Execution

2d Mapping

Start mapping by running the command below, then drive the robot around the whole area you want to map.

$ roslaunch factory_robot map_building.launch

To view the map run:

$ roslaunch turtlebot_rviz_launchers view_navigation.launch (run on workstation for visualization only, not obligatory)

When the mapping process finishes, save the map files under /tmp with the file name "my_map" by running:

$ rosrun map_server map_saver -f /tmp/my_map
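map_saver writes two files: the occupancy grid image (my_map.pgm) and a metadata file (my_map.yaml). The metadata file typically looks like the sketch below; the exact values depend on your mapping run:

```yaml
# my_map.yaml as written by map_server's map_saver (values are illustrative)
image: my_map.pgm            # occupancy grid image
resolution: 0.050000         # meters per pixel
origin: [-10.0, -10.0, 0.0]  # pose of the lower-left pixel (x, y, yaw)
negate: 0
occupied_thresh: 0.65        # pixels darker than this count as occupied
free_thresh: 0.196           # pixels lighter than this count as free
```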

3d Mapping

Start 3d mapping by running the command below, then drive the robot around the whole area you want to map.

$ roslaunch factory_robot 3d_reconstruction_mapping.launch

To view the map run:

$ roslaunch rtabmap_ros demo_turtlebot_rviz.launch (run on workstation for visualization only, not obligatory)

When the mapping process finishes, the map database will be saved to "~/factory_robot/maps/ros/rtabmap.db".

Positioning and Orientation

Every QR code tag represents one position on the map. Put each QR code tag at the desired position. The position of every tag and the schematic of a demo simulation are presented below. You can create a different simulation by changing the tag positions on the map.

  1. qr1 is the initial position (number 1).
  2. qr2 is opposite qr1 (number 2).
  3. qr3 is to the left of qr1 (number 3).
  4. qr4 is to the right of qr1 (number 4).

Autonomous Navigation Schematic

Put the robot at the initial position before running any command; this is necessary for localization. The initial position and orientation are presented below, as the project was developed in the Robotics Lab:

Autonomous Navigation to Specific Positions

Run one of the commands below to start all robot procedures. The first command uses actionlib for autonomous navigation and the second does not. For these functionalities, 4 positions have been defined in the /config/coordinates.yaml and /config/3d_coordinates.yaml files for the 4 QR code tags "qr1, qr2, qr3, qr4". Edit /config/coordinates.yaml and /config/3d_coordinates.yaml for a different map.
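A coordinates file of this kind usually maps each tag ID to a goal pose on the map. The sketch below is hypothetical: the actual keys and structure of /config/coordinates.yaml are defined by this project's launch files, so adjust it to match what the repository ships.

```yaml
# Hypothetical sketch of /config/coordinates.yaml: one goal pose per tag.
# Key names and pose format are assumptions; check the file in the repo.
qr1: { x: 0.0, y: 0.0, theta: 0.0 }
qr2: { x: 4.0, y: 0.0, theta: 3.14 }
qr3: { x: 2.0, y: 2.0, theta: 1.57 }
qr4: { x: 2.0, y: -2.0, theta: -1.57 }
```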

$ roslaunch factory_robot factory.launch

or

$ roslaunch factory_robot factory_noactionlib.launch (without actionlib)

or for 3d map navigation

$ roslaunch factory_robot factory_3d_reconstruction.launch (with actionlib)

When everything is ready, the terminal will show the message "odom received" as presented below:

Odom Received

Visual Servoing

The visual servoing part does not require a specific map or a specific position, only a specific tag with "qr5" encoded information and a black boundary. You can use a second Turtlebot 2 to carry the QR code tag. Run the command below to enable remote control of the second Turtlebot 2 and attach the tag to it.

$ roslaunch factory_robot servoing_parent.launch
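At its core, visual servoing of this kind reduces to a control loop: read the tag pose from the detector, compute velocity commands that keep the tag centered at a fixed standoff distance, and publish them. The sketch below shows such a proportional control law in isolation; the gains, the 0.6 m standoff, the deadband, and the sign convention are illustrative assumptions, not the values used by this project.

```python
# Proportional control law in the spirit of a visual servoing node.
# All constants are illustrative assumptions.
STANDOFF = 0.6    # desired distance to the tag (m)
K_LINEAR = 0.5    # gain on the distance error
K_ANGULAR = 1.0   # gain on the lateral offset
DEADBAND = 0.02   # ignore tiny errors to avoid jitter

def servo_command(tag_x, tag_z):
    """Compute (linear, angular) velocities from the tag pose in the
    camera frame: tag_x is the lateral offset, tag_z the distance ahead."""
    dist_err = tag_z - STANDOFF
    linear = K_LINEAR * dist_err if abs(dist_err) > DEADBAND else 0.0
    # Tag to the right (positive x) -> negative yaw rate, i.e. turn right.
    angular = -K_ANGULAR * tag_x if abs(tag_x) > DEADBAND else 0.0
    return linear, angular

# Tag 1 m ahead and 0.1 m to the right: drive forward while turning right.
print(servo_command(0.1, 1.0))
```

In a real node these velocities would be packed into a geometry_msgs/Twist and published on the robot's velocity topic at the detector's frame rate.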

A demo scenario of visual servoing in the Robotics Lab is presented below.

Visual Servoing Schematic

QR Code Tags

The QR code tags used for the demo implementation are located in the /QRcodetags folder. Print them at A4 size.

Screenshots

2d_map

2d_navigation

3d_map

3d_navigation

Demos

Autonomous Navigation Demo with QR Code Tag Triggering.

Autonomous Navigation Demo with ROS Joystick Triggering.

Visual Servoing Demo.

License

This project is licensed under the MIT License - see the LICENSE file for details.

factory_robot's People

Contributors

georgealexakis

factory_robot's Issues

Can I see entire rqt_graph?

Hello. I'm a university student in Korea.
I'm trying to use an Astra camera for the autonomous navigated robot with QR detection.
The Astra camera driver and factory.launch work well individually.
However, I need to connect the Astra camera to factory.launch and could not figure out how, so I would like to see the rqt_graph that includes everything launched for the autonomous navigated robot with QR detection.

rosgraph

The image above is my rqt_graph when the Astra camera driver and factory.launch are running.
