GestureCommand: A Simulated Camera-Based Gesture Recognition System for Autonomous Table-Specific Delivery Robot
Gesture-Localisation-Robot is a camera-based hand-gesture robot control package. This is the official repository for the final assignment of the Intelligent Robotics module at the University of Birmingham.
A demonstration can be viewed on our Project website.
(Demo GIFs: the robot travelling to Table 5, and ordering the robot to Table 5 using MediaPipe.)
- Chit Lee (GitHub)
- Juni Katsu (GitHub)
- Cheuk Yu Lam (GitHub)
- Abbas Mandasorwala (GitHub)
- Kozerenko Elizaveta (GitHub)
Imagine you are a barista in a coffee shop. You have just made a cup of coffee, and you want it delivered to a customer at a particular table. You have a robot, and you put the cup of coffee on top of it. Your hands are not clean, so pressing a touchscreen would be a problem. How can you tell the robot where to go?
Introducing GestureCommand, a camera-based gesture recognition system for an autonomous table-specific delivery robot. You gesture where the robot should go, and it goes there.
- Ubuntu 20.04
- ROS Noetic (desktop-full)
- Python 3.8
Install ROS Noetic.
Set up your catkin workspace.
Before installing anything, run the update and upgrade commands so your system stays up to date:
sudo apt update && sudo apt upgrade
sudo apt install ros-$ROS_DISTRO-pr2-teleop ros-$ROS_DISTRO-map-server
This package mainly relies on two libraries for real-time hand detection: MediaPipe, a machine learning library developed by Google, and OpenCV, an open-source computer vision library.
pip install mediapipe
pip install opencv-python
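As a quick sanity check that both libraries installed correctly, the snippet below grabs one webcam frame and reports whether MediaPipe detects a hand. This is only an install check, not part of the package:

```python
import cv2
import mediapipe as mp

# Detect hands in a single frame from the default webcam.
hands = mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1)
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if ok:
    # MediaPipe expects RGB input; OpenCV captures BGR.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    print("hand detected" if result.multi_hand_landmarks else "no hand detected")
else:
    print("could not read from the webcam")
```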
Git clone this repo to your <catkin_ws>/src
Run catkin_make
- Compile laser_trace.cpp (which provides laser ray tracing) as follows if you are not on an ARM system (Windows, Unix, ...):

  ```bash
  cd <catkin_ws>/src/gesture_localisation_robot/src/laser_trace
  ./compile.sh  # you may have to run `chmod +x compile.sh` first
  ```

- If you are on an ARM system (e.g. an M1 chip Mac), replace ./compile.sh with ./compilearm.sh.
If correctly compiled, you should find laser_trace.so in the directory <catkin_ws>/src/gesture_localisation_robot/src/pf_localisation.
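To confirm the module built correctly, you can try importing it (a quick hypothetical check, assuming only that the compiled extension imports like a normal Python module):

```python
# Run from <catkin_ws>/src/gesture_localisation_robot/src/pf_localisation.
# If the compile step worked, the extension imports like any other module.
import laser_trace
print(laser_trace.__file__)  # should point at laser_trace.so
```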
If the code does not compile, you need to install Boost.Python from https://github.com/boostorg/python; this requires downloading and compiling Boost and installing Faber.
You may need to make the various scripts executable by running chmod +x {filename} in the directory <catkin_ws>/src/gesture_localisation_robot/scripts.
Run roslaunch gesture_localisation_robot everything.launch
This should start:
- the map server
- the robot simulator stage_ros
- rviz for visualisation
- pf_localisation for particle filter localisation (a minimal sketch of the idea follows this list)
- move_to_coords.py for the local planner which moves the robot to the goal coordinates
- hand_track_control.py for the hand tracking and gesture recognition
- initial_pose_publisher.py for publishing the initial pose of the robot
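To give a flavour of what pf_localisation does, here is a minimal, generic particle-filter update step (predict, weight, resample). This is an illustration of the algorithm only: measure_likelihood is a hypothetical stand-in for scoring a laser scan against the map (which the package does via laser_trace), and the real node will differ.

```python
import numpy as np

def pf_update(particles, weights, motion, measure_likelihood, motion_noise=0.02):
    """One particle-filter iteration. particles: (N, 3) array of [x, y, theta]."""
    # 1. Predict: apply the odometry delta to every particle, plus Gaussian
    #    noise (simplified: the delta is added directly in world coordinates).
    particles = particles + motion + np.random.normal(0.0, motion_noise, particles.shape)
    # 2. Weight: score each particle by how well the sensor reading matches
    #    what the map predicts from that pose (opaque callback here).
    weights = np.array([measure_likelihood(p) for p in particles])
    weights /= weights.sum()
    # 3. Resample: redraw particles in proportion to their weights.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Example: 200 particles at the origin, scored by a dummy likelihood.
if __name__ == "__main__":
    p, w = np.zeros((200, 3)), np.full(200, 1.0 / 200)
    p, w = pf_update(p, w, motion=np.array([0.1, 0.0, 0.01]),
                     measure_likelihood=lambda pose: 1.0)
```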
This will allow you to input a hand gesture from 0 to 5: 0 corresponds to the Till, and 1 to 5 to Tables 1 to 5 respectively. Hold your hand still for about 3 seconds, and the robot should start heading to the requested table.
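For illustration, here is a rough sketch of the kind of logic this implies: count raised fingers with MediaPipe and accept a gesture only once it has been held for about 3 seconds. The finger-counting heuristic and the dispatch print-out are illustrative assumptions, not the package's actual hand_track_control.py code.

```python
import time
import cv2
import mediapipe as mp

FINGER_TIPS = [8, 12, 16, 20]  # index, middle, ring, pinky tip landmark ids

def count_fingers(hand):
    """Rough raised-finger count: a tip above its PIP joint counts as 'up'
    (image y grows downward, so 'above' means a smaller y value)."""
    lm = hand.landmark
    up = sum(1 for tip in FINGER_TIPS if lm[tip].y < lm[tip - 2].y)
    # Thumb heuristic: tip further from the pinky-side palm edge than its joint.
    up += 1 if abs(lm[4].x - lm[17].x) > abs(lm[3].x - lm[17].x) else 0
    return up

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
current, since = None, time.time()
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        n = count_fingers(result.multi_hand_landmarks[0])
        if n != current:
            current, since = n, time.time()  # gesture changed: restart timer
        elif time.time() - since > 3.0:      # held still for ~3 seconds
            dest = "Till" if current == 0 else f"Table {current}"
            print(f"Gesture confirmed: dispatch to {dest}")
            since = time.time()
    else:
        current, since = None, time.time()   # no hand: reset
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```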