
Yet another repo for the Baxter collaboration task.

License: GNU Lesser General Public License v2.1

Topics: robot, baxter-robot, baxter-collaboration, cartesian-controller, ros, arm, hri, human-robot-interaction, advanced-manufacturing, safety

human_robot_collaboration's Introduction

Human Robot Collaboration

Repository that hosts software libraries and ROS packages for reproducing Human-Robot Collaboration experiments performed at the Social Robotics Lab at Yale University.

Setup

The software in this repo has been developed for, and tested on, a Baxter Research Robot---a widely used platform for research in HRC. Nonetheless, it is easy to customize to any robotic platform that shares similar hardware features and is provided with a kinematic model in URDF form. If you are using this software and/or one of its components, we warmly recommend that you cite the following paper:

[Roncone2017] Roncone, Alessandro; Mangin, Olivier; Scassellati, Brian. "Transparent Role Assignment and Task Allocation in Human-Robot Collaboration." IEEE International Conference on Robotics and Automation (ICRA 2017), Singapore. [PDF] [BIB]

Installation, Compilation and Testing

To install, compile and test this software, please refer to the installation tutorial.

Tested environments

| ROS distro | Ubuntu distro | Compiler  | Status   |
|------------|---------------|-----------|----------|
| indigo     | 14.04 Trusty  | gcc-4.9   | working  |
| indigo     | 14.04 Trusty  | clang-3.6 | wontfix  |
| kinetic    | 16.04 Xenial  | gcc       | disabled |
| kinetic    | 16.04 Xenial  | clang     | disabled |

NOTE 1: Apparently, it is very difficult to make clang work with Ubuntu 14.04 Trusty, C++14 enabled, and all of the old ROS libraries (indigo, OpenCV 2.4, etc.). I disabled clang compilation because I would rather keep using the latest C++14 features.

NOTE 2: As of today (June 7th, 2017), kinetic builds are disabled, since Travis does not support Ubuntu 16.04 Xenial Xerus yet. 16.04 support should come in Q2 2017.

Execution on the robot

Initial steps (mainly for ScazLab students)

  1. Turn on the robot. Wait for the robot to finish its start-up phase.
  2. Be sure that the system you're running the code on has access to the Baxter robot. This is usually done by running the baxter.sh script that should be provided in your Baxter installation. See here for more info. @ScazLab students → as far as the Baxter robot in the ScazLab is concerned, this means that every time you need to run some ROS software on the robot you should open a new terminal and do the following: cd ros_devel_ws && ./baxter.sh. A change in the terminal prompt should confirm that you now have access to baxter.local. Please be aware of this issue when you operate the robot.
  3. Untuck the robot. @ScazLab students → we have an alias for this, so you just have to type untuck

This repository currently allows for two modes of operation:

  1. Mode A: Cartesian Controller server → allows for controlling each of the arms in operational space.
  2. Mode B: High-level actions → enables high-level actions to be requested of the robot, such as hold or pick object.

These two modes can be enabled concurrently, but this feature is disabled by default: in order to communicate with the robot through both the high-level interface and the low-level controller, you need to create your own action_provider. See the src folder for more information on that.

Mode A. Cartesian Controller Server

In this mode, the user can ask the robot to go to a specific 3D Position or 6D Pose (position + orientation), and the robot will simply go there (if physically possible). To guarantee safety, the robot still has the standard safety systems enabled by default. More advanced uses are allowed, but not exposed to the user: if you want to tinker with advanced features, we recommend specializing the RobotInterface class.

In order to use the Cartesian Controller Server, you have to launch it with:

roslaunch human_robot_collaboration baxter_controller.launch

This should create two topics to which the user can publish operational space requests: /baxter_controller/left/go_to_pose for the left arm, and /baxter_controller/right/go_to_pose for the right arm. In the following, there are some examples of how to request them from the terminal (e.g. for the left arm):

  • [6D Pose] : rostopic pub /baxter_controller/left/go_to_pose human_robot_collaboration_msgs/GoToPose "{pose_stamp: {pose:{position:{ x: 0.55, y: 0.55, z: 0.2}, orientation:{ x: 0, y: 1, z: 0, w: 0}}}, ctrl_mode: 0}" --once
  • [3D Position] : rostopic pub /baxter_controller/left/go_to_pose human_robot_collaboration_msgs/GoToPose "{pose_stamp: {pose:{position:{ x: 0.55, y: 0.55, z: 0.2}, orientation:{ x: -100, y: -100, z: -100, w: -100}}}, ctrl_mode: 0}" --once. This differs from the previous case in that every value of the orientation quaternion is set to -100. This tells the Cartesian Controller to reach the desired position while maintaining the current orientation.

Obviously, these same messages can be sent directly within your code. Please take a look at the GoToPose.msg file for further info.
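For reference, below is a minimal, untested sketch of how such a request might look from Python with rospy. It assumes the GoToPose message is exported by the human_robot_collaboration_msgs package with the pose_stamp and ctrl_mode fields shown in the terminal examples above; check GoToPose.msg for the authoritative definition.

```python
#!/usr/bin/env python
# Hypothetical sketch (not part of the repo): publish a 6D pose request to the
# left arm's go_to_pose topic. Field names follow the rostopic examples above.
import rospy
from human_robot_collaboration_msgs.msg import GoToPose

rospy.init_node('go_to_pose_example')
pub = rospy.Publisher('/baxter_controller/left/go_to_pose', GoToPose, queue_size=1)
rospy.sleep(1.0)  # give the publisher time to connect to the controller

msg = GoToPose()
msg.pose_stamp.pose.position.x = 0.55
msg.pose_stamp.pose.position.y = 0.55
msg.pose_stamp.pose.position.z = 0.20
# An orientation of [0, 1, 0, 0] matches the 6D Pose example above;
# set all four components to -100 to keep the current orientation instead.
msg.pose_stamp.pose.orientation.x = 0.0
msg.pose_stamp.pose.orientation.y = 1.0
msg.pose_stamp.pose.orientation.z = 0.0
msg.pose_stamp.pose.orientation.w = 0.0
msg.ctrl_mode = 0
pub.publish(msg)
```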

Mode B. High-Level Actions

We implemented a low-level, state-less controller able to operate each of the arms independently (and communicate with the other one if needed). A library of high-level predefined actions (in the form of ROS services) is available for the user to choose from; such actions range from the simple, single arm pick object to the more complex hold object (which requires a physical collaboration with the human partner) or hand over (which demands a bi-manual interaction between the two arms).

To enable this mode, run roslaunch human_robot_collaboration flatpack_furniture.launch or roslaunch human_robot_collaboration modular_furniture.launch. These are two predefined launch files that we use for two different experiments we ran in the lab. If you would like to create and use your own high-level actions, we suggest specializing the ArmCtrl class. See the flatpack_furniture library for inspiration on how to do it.

Now, the user should be able to request actions from either of the two arms by using the proper service (/action_provider/service_left for the left arm, /action_provider/service_right for the right arm). Here are some examples to make the demo work from the terminal:

  • rosservice call /action_provider/service_right "{action: 'hold'}"
  • rosservice call /action_provider/service_left "{action: 'get', objects: [17]}"

As in Mode A, these same services can be called directly within your code. Please take a look at the DoAction.srv file for further info.
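Here is a minimal, untested Python sketch of such a call. It assumes DoAction is exported by the human_robot_collaboration_msgs package and that its request carries the action string and objects integer array used in the rosservice examples above; check DoAction.srv for the authoritative definition.

```python
#!/usr/bin/env python
# Hypothetical sketch (not part of the repo): ask the left arm to 'get' object 17.
import rospy
from human_robot_collaboration_msgs.srv import DoAction

rospy.init_node('do_action_example')
rospy.wait_for_service('/action_provider/service_left')
do_action = rospy.ServiceProxy('/action_provider/service_left', DoAction)

# rospy lets you fill request fields via keyword arguments.
resp = do_action(action='get', objects=[17])
print(resp)
```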

Non-exhaustive list of supported actions

  • list_actions (both arms): it returns a list of the available actions for the specific arm.
  • home (both arms): moves the arm to a specific joint configuration (i.e. it does not use IK).
  • release (both arms): opens the gripper (or releases the vacuum gripper).
  • hand_over (both arms): performs a handover from the left to the right hand. The left arm picks an object up at a specific orientation (for now it works only with the central frame, ID number 24), and passes it to the right arm, which then holds it until further notice.
  • get (left arm): the robot gets an object at a random orientation, and moves it on the table (without releasing it).
  • pass (left arm): if the robot has an object in its vacuum gripper, it moves it toward the human partner and waits for force interaction in order to release the gripper. The force filter we apply is a low-pass filter, so in order to trigger the release action the human should apply a high-frequency, spiky interaction.
  • hold (right arm): the robot moves into a configuration in which it is optimal to hold an object as a supportive action for the human partner. After that, it waits for force interaction to close the gripper, and again to open the gripper when the human has finished. See above for a description of the proper force interaction that should be applied to the arm in order to trigger the behaviors.

Python controller

The BaseController class abstracts Python access to the robot resources, including the high-level action_provider service. It is based on a few Python helpers for

Misc

  • To kill an action from the terminal, you can simulate a button press on the arm's cuff: rostopic pub --once /robot/digital_io/left_lower_button/state baxter_core_msgs/DigitalIOState "{state: 1, isInputOnly: true}".
  • You can also kill an action from the web interface, by pressing the ERROR button. It writes to the same topic and achieves the same behavior.
  • To go robot-less (that is, to run the software without the robot, for testing purposes), you can choose one of the following options:
    • Call the action_provider with the argument --no_robot, e.g. rosrun human_robot_collaboration baxter_controller --no_robot. In this mode, only the service to request actions is enabled. It will always return after a 2s delay, and it will always succeed.
    • Change the use_robot flag to false in the launch file.
    • Launch the launch file with the argument use_robot:=false, e.g. roslaunch human_robot_collaboration baxter_controller.launch use_robot:=false

Experiments

ICRA 2017

To reproduce the experiment from the following paper on the Baxter Research Robot, we provide the script human_robot_collaboration/scripts/icra_experiment.

[Roncone2017] Roncone, Alessandro; Mangin, Olivier; Scassellati, Brian. "Transparent Role Assignment and Task Allocation in Human-Robot Collaboration." IEEE International Conference on Robotics and Automation (ICRA 2017), Singapore.

It needs to be given the path to an offline-computed policy, such as one produced with github.com/ScazLab/task-models.

human_robot_collaboration's People

Contributors

alecive, brahmgardner, jakebrawer, jhong17, kalebishop, meiyingqin, omangin, sarah-widder, sarimabbas


human_robot_collaboration's Issues

Improve the gripper class

Right now, we have a semi-working situation with the vacuum gripper (i.e. it works, but we didn't have time to code it properly). It would be nice to fully port the software to a robust implementation.

[ROSThreadImage] Accept any type of encoding (not only `bgr8`)

As per title. Right now ROSThreadImage works only with bgr8 images (i.e. three channels, BGR). It would be nice to expand the class to work also with other types, most notably mono8 (i.e. one channel only).

The class should be extended with a new member that encapsulates the type of encoding. This should be passed in the constructor (bgr8 will still be the default) and used throughout the class. A new test should be added that exercises the class with a mono image as well.

Which TTS to choose?

Fake issue to list all of our options:

  1. Google TTS → It's the best, but it is very difficult to implement without a proper Google Cloud subscription. The best semi-legal way I found is here https://gist.github.com/alotaiba/1728771, specifically in oaeide's comment: wget -q -U Mozilla -O output.mp3 "http://translate.google.com/translate_tts?ie=UTF-8&total=1&idx=0&textlen=32&client=tw-ob&q=Test&tl=En-gb". For now it works, but it may break at any minute.
  2. SVOX PICO → It seems the best choice because it is open source, available in C++, and I know a repo that already uses it (this one https://github.com/robotology/speech/tree/master/svox-speech from my previous lab). It should be pretty easy to port to ROS.
  3. https://github.com/UbiquityRobotics/speech_commands has a pretty basic TTS (included with the STT). It is more complex and web-based, but it might also work.
  4. Festival is already there, it's free to use, but it is pretty bad.

Citation

Add a citation to the README.md after April 30th that references our ICRA paper.

Improve chooseObjectID()

As per 52c9dbc, the ArmCtrl class now accepts an array of objects. The software then randomly chooses one of the available objects, but it should also restrict its choice to those that are set in the launch file.

[armCtrl] Improve choice of available objects

If the arm controller is tasked with acting upon a list of objects, the choice currently happens right inside the service call. It would be more robust to move the list of objects out of the service call and directly into the class, so that the controller can choose which object to act on with more flexibility. That would improve the success rate of the service during action cleanup, for example.

Improve object_db for the Cartesian Estimator

Right now the object database is a rosparam encoded as an array of arrays of arrays, which is far from being the best solution.

In the branch feature/new_object_db I already changed the launch file to define them as YAML structures, but the backend still needs to be coded.

ImportError: cannot import name speech

When I run the "Compile tests and run them" step, I run into the following problem: ImportError: cannot import name speech. I cannot resolve this problem, can you help me?

Segmentation fault on hsv_detector

Sometimes, seemingly for no reason at all, the hsv_detector segfaults. I traced the issue back to some ROS internals. Here is some gdb information:

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7fffd9d63700 (LWP 8265)]
0x00007ffff78a4366 in ros::Subscription::pubUpdate(std::vector<std::string, std::allocator<std::string> > const&) ()
   from /opt/ros/indigo/lib/libroscpp.so
(gdb) where
#0  0x00007ffff78a4366 in ros::Subscription::pubUpdate(std::vector<std::string, std::allocator<std::string> > const&) ()
   from /opt/ros/indigo/lib/libroscpp.so
#1  0x00007ffff7849126 in ros::TopicManager::pubUpdate(std::string const&, std::vector<std::string, std::allocator<std::string> > const&) ()
   from /opt/ros/indigo/lib/libroscpp.so
#2  0x00007ffff7850b44 in ros::TopicManager::pubUpdateCallback(XmlRpc::XmlRpcValue&, XmlRpc::XmlRpcValue&) () from /opt/ros/indigo/lib/libroscpp.so
#3  0x00007ffff783f7ba in ros::XMLRPCCallWrapper::execute(XmlRpc::XmlRpcValue&, XmlRpc::XmlRpcValue&) () from /opt/ros/indigo/lib/libroscpp.so
#4  0x00007ffff540987f in XmlRpc::XmlRpcServerConnection::executeMethod(std::string const&, XmlRpc::XmlRpcValue&, XmlRpc::XmlRpcValue&) ()
   from /opt/ros/indigo/lib/libxmlrpcpp.so
#5  0x00007ffff540c22c in XmlRpc::XmlRpcServerConnection::executeRequest() () from /opt/ros/indigo/lib/libxmlrpcpp.so
#6  0x00007ffff540950c in XmlRpc::XmlRpcServerConnection::writeResponse() () from /opt/ros/indigo/lib/libxmlrpcpp.so
#7  0x00007ffff54096d0 in XmlRpc::XmlRpcServerConnection::handleEvent(unsigned int) () from /opt/ros/indigo/lib/libxmlrpcpp.so
#8  0x00007ffff540761e in XmlRpc::XmlRpcDispatch::work(double) () from /opt/ros/indigo/lib/libxmlrpcpp.so
#9  0x00007ffff783c5aa in ros::XMLRPCManager::serverThreadFunc() () from /opt/ros/indigo/lib/libroscpp.so
#10 0x00007ffff421ca4a in ?? () from /usr/lib/x86_64-linux-gnu/libboost_thread.so.1.54.0
#11 0x00007ffff4db9184 in start_thread (arg=0x7fffd9d63700) at pthread_create.c:312
#12 0x00007ffff6b7337d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:111
(gdb) bt full
#0  0x00007ffff78a4366 in ros::Subscription::pubUpdate(std::vector<std::string, std::allocator<std::string> > const&) ()
   from /opt/ros/indigo/lib/libroscpp.so
No symbol table info available.
#1  0x00007ffff7849126 in ros::TopicManager::pubUpdate(std::string const&, std::vector<std::string, std::allocator<std::string> > const&) ()
   from /opt/ros/indigo/lib/libroscpp.so
No symbol table info available.
#2  0x00007ffff7850b44 in ros::TopicManager::pubUpdateCallback(XmlRpc::XmlRpcValue&, XmlRpc::XmlRpcValue&) () from /opt/ros/indigo/lib/libroscpp.so
No symbol table info available.
#3  0x00007ffff783f7ba in ros::XMLRPCCallWrapper::execute(XmlRpc::XmlRpcValue&, XmlRpc::XmlRpcValue&) () from /opt/ros/indigo/lib/libroscpp.so
No symbol table info available.
#4  0x00007ffff540987f in XmlRpc::XmlRpcServerConnection::executeMethod(std::string const&, XmlRpc::XmlRpcValue&, XmlRpc::XmlRpcValue&) ()
   from /opt/ros/indigo/lib/libxmlrpcpp.so
No symbol table info available.
#5  0x00007ffff540c22c in XmlRpc::XmlRpcServerConnection::executeRequest() () from /opt/ros/indigo/lib/libxmlrpcpp.so
No symbol table info available.
#6  0x00007ffff540950c in XmlRpc::XmlRpcServerConnection::writeResponse() () from /opt/ros/indigo/lib/libxmlrpcpp.so
No symbol table info available.
#7  0x00007ffff54096d0 in XmlRpc::XmlRpcServerConnection::handleEvent(unsigned int) () from /opt/ros/indigo/lib/libxmlrpcpp.so
No symbol table info available.
#8  0x00007ffff540761e in XmlRpc::XmlRpcDispatch::work(double) () from /opt/ros/indigo/lib/libxmlrpcpp.so
No symbol table info available.
#9  0x00007ffff783c5aa in ros::XMLRPCManager::serverThreadFunc() () from /opt/ros/indigo/lib/libroscpp.so
No symbol table info available.
#10 0x00007ffff421ca4a in ?? () from /usr/lib/x86_64-linux-gnu/libboost_thread.so.1.54.0
No symbol table info available.
#11 0x00007ffff4db9184 in start_thread (arg=0x7fffd9d63700) at pthread_create.c:312
        __res = <optimized out>
        pd = 0x7fffd9d63700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {140736848082688, 5626924391969783832, 1, 0, 140736848083392, 140736848082688, -5626849403075100648, 
                -5626913084039954408}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
        pagesize_m1 = <optimized out>
        sp = <optimized out>
        freesize = <optimized out>
        __PRETTY_FUNCTION__ = "start_thread"
#12 0x00007ffff6b7337d in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:111
No locals.
(gdb) list
1	#include <stdio.h>
2	
3	#include <ros/ros.h>
4	#include "robot_perception/cartesian_estimator_hsv.h"
5	
6	int main(int argc, char ** argv)
7	{
8	    ros::init(argc, argv, "hsv_detector");
9	    ros::NodeHandle _n("hsv_detector");
10	

Unfortunately, the only thing I found online talking about this is here. I would need to manually compile ROS and debug it in order to fix it, but for now there is not enough time.

Fake issue

Ignore this. I created it to be able to upload a picture to use in the README.md file of this repository.

Improving and creating unit tests

Checklist

  • /baxter_collaboration_lib/src/robot_utils

    • utils.cpp
    • baxter_trac_ik.cpp
    • ros_thread_image.cpp
  • /baxter_collaboration_lib/src/robot_interface

    • robot_interface.cpp
    • arm_ctrl.cpp
    • gripper.cpp
  • /baxter_collaboration_lib/src/robot_perception

    • aruco_client.cpp
    • cartesian_estimator.cpp
    • cartesian_estimator_client.cpp
    • cartesian_estimator_hsv.cpp

Force interaction is broken (again)

The force interaction seems to behave inconsistently. @omangin are you sure that the approach you suggested to @sarah-widder is the way to go?

She fixed the right arm for passing the screwdriver, but now it does not work anymore for holding. Same thing for the left arm. I am starting to wonder if it is worth reverting to the previous version of the software.

Add speech interaction

Theoretically, svox-tts can be added to the repository. Also, it would be nice to show the spoken text on Baxter's display.

Restructure Repository

After the deadline, the repository will be restructured to improve modularity, dependency management, and ease of use. Similarly to aruco_ros, we will have three different packages:

  • baxter_collaboration_lib → with the libraries
  • baxter_collaboration_msgs → with messages and services
  • baxter_collaboration → with the source code used in our experiments

Compilation would be faster as well, at least in some cases.
