
GestureCommand: A Simulated Camera-Based Gesture Recognition System for Autonomous Table-Specific Delivery Robot

Gesture-Localisation-Robot is a camera-based hand-gesture robot control package. This is the official repository for the final assignment of the Intelligent Robotics module at the University of Birmingham.
A demonstration can be viewed on our project website: https://winter7eaf.github.io/gesture_localisation_robot/

Demo: ordering the robot to go to Table 5 using MediaPipe hand gestures.

Contributors

Chit Lee (Github)
Juni Katsu (Github)
Cheuk Yu Lam (Github)
Abbas Mandasorwala (Github)
Kozerenko Elizaveta (Github)

Motivation

Imagine you are a barista in a coffee shop. You have just made a cup of coffee, and you want it delivered to a customer at a certain table. You have a robot, and you put the cup of coffee on top of it. Your hands are not clean, so pressing a touchscreen would be problematic. How can you tell the robot where to go?

Introducing GestureCommand, a camera-based gesture recognition system for an autonomous table-specific delivery robot. You gesture to the robot where to go, and it goes there.

Installation

Ideal Working Environment

  • Ubuntu 20.04
  • ROS Noetic (desktop-full)
  • Python 3.8

Install ROS Noetic

Install ROS Noetic.
Set up your catkin workspace.

Install dependencies

Before installing anything, run the update and upgrade commands so your system stays up to date.

  • sudo apt update && sudo apt upgrade
  • sudo apt install ros-$ROS_DISTRO-pr2-teleop ros-$ROS_DISTRO-map-server

This package mainly relies on two libraries for real-time hand detection: MediaPipe, a machine learning library developed by Google, and OpenCV, an open-source computer vision library.

  • pip install mediapipe
  • pip install opencv-python
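
As a quick sanity check that both libraries work, the following minimal sketch opens the default webcam and draws MediaPipe hand landmarks. It is illustrative only and is not this repository's pipeline; the real gesture logic lives in hand_track_control.py.

    # Minimal MediaPipe + OpenCV hand-tracking check (illustrative sketch only).
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands
    mp_draw = mp.solutions.drawing_utils

    cap = cv2.VideoCapture(0)  # default webcam
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR frames.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
            cv2.imshow("hand check", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()
    cv2.destroyAllWindows()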

Clone our repository and build

Git clone this repo into your <catkin_ws>/src directory.
Run catkin_make from the root of your catkin workspace.

Compile laser_trace

  • Compile laser_trace.cpp (which provides laser ray tracing) as follows if you are not on an ARM system (i.e. a typical x86 Windows/Linux/Unix machine):

      cd <catkin_ws>/src/gesture_localisation_robot/src/laser_trace
      ./compile.sh   # you may need to run 'chmod +x compile.sh' first
    
  • Replace ./compile.sh with ./compilearm.sh if you are on an ARM system (e.g. an Apple M1 Mac).

If it compiled correctly, you should find laser_trace.so in the directory <catkin_ws>/src/gesture_localisation_robot/src/pf_localisation. If the code does not compile, you need to install Boost.Python from https://github.com/boostorg/python, which requires downloading and compiling Boost and installing Faber.
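
As a quick load check (assuming the extension's module name matches the laser_trace.so filename, which is how Boost.Python extensions are normally imported), you can run the following from the pf_localisation directory:

    # Run from <catkin_ws>/src/gesture_localisation_robot/src/pf_localisation,
    # where laser_trace.so should have been placed by the compile script.
    import laser_trace  # module name assumed to match the shared-object filename
    print(laser_trace.__file__)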

Make scripts executable

You may need to make the various scripts executable by running chmod +x {filename} in the directory <catkin_ws>/src/gesture_localisation_robot/scripts.

Running the Code

Run roslaunch Gesture-Localisation-Robot everything.launch
This should start:

  • the map server
  • the robot simulator stage_ros
  • rviz for visualisation
  • pf_localisation for particle filter localisation
  • move_to_coords.py for the local planner which moves the robot to the goal coordinates
  • hand_track_control.py for the hand tracking and gesture recognition
  • initial_pose_publisher.py for publishing the initial pose of the robot

This will allow you to input a hand gesture from 0 to 5: 0 corresponds to the till, and 1 to 5 to Tables 1 to 5 respectively. Hold your hand still for about 3 seconds, and the robot should start heading to the ordered table.
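
To illustrate how a recognised finger count can be turned into a navigation goal, here is a minimal rospy sketch. The topic name /goal_pose and the table coordinates are hypothetical placeholders and are not taken from this repository; the actual goal handling is done by move_to_coords.py.

    # Illustrative sketch: map a counted gesture (0-5) to a map-frame goal pose.
    # Topic name and coordinates below are made up for demonstration purposes.
    import rospy
    from geometry_msgs.msg import PoseStamped

    # Hypothetical map coordinates for the till (0) and Tables 1-5.
    GOALS = {
        0: (0.0, 0.0),   # till
        1: (1.5, 2.0),
        2: (1.5, 4.0),
        3: (3.0, 2.0),
        4: (3.0, 4.0),
        5: (4.5, 3.0),
    }

    def publish_goal(finger_count):
        pub = rospy.Publisher("/goal_pose", PoseStamped, queue_size=1, latch=True)
        goal = PoseStamped()
        goal.header.frame_id = "map"
        goal.header.stamp = rospy.Time.now()
        goal.pose.position.x, goal.pose.position.y = GOALS[finger_count]
        goal.pose.orientation.w = 1.0  # identity orientation
        pub.publish(goal)

    if __name__ == "__main__":
        rospy.init_node("gesture_goal_demo")
        publish_goal(5)   # e.g. an open hand -> Table 5
        rospy.sleep(1.0)  # give the latched publisher time to deliver the message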

