
Comments (8)

imankgoyal commented on June 2, 2024

Hi @FinnJob,

Thanks for your kind words.

In our real-world experiments we use just one camera. We calibrate the robot-camera extrinsics and transform the perceived point clouds into the robot base frame. In the robot base frame, the robotic arm is at the front.
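The transform described above is a standard homogeneous-coordinate change of frame. A minimal sketch (the extrinsic matrix and point values here are made up for illustration, not from the RVT setup):

```python
import numpy as np

# Hypothetical calibration result: a 4x4 homogeneous transform mapping
# camera-frame coordinates into robot-base-frame coordinates.
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.5, 0.0, 0.8]  # assumed camera position in the base frame

def to_base_frame(points_cam, T_base_cam):
    """Transform an (N, 3) point cloud from the camera frame to the robot base frame."""
    n = points_cam.shape[0]
    homo = np.hstack([points_cam, np.ones((n, 1))])  # (N, 4) homogeneous points
    return (T_base_cam @ homo.T).T[:, :3]

# A point 1 m in front of the camera lands at the expected base-frame location.
points_cam = np.array([[0.0, 0.0, 1.0]])
points_base = to_base_frame(points_cam, T_base_cam)
print(points_base)  # [[0.5 0.  1.8]]
```

With two cameras, the same function applies per camera, each with its own extrinsic matrix, and the resulting clouds can simply be concatenated in the shared base frame.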

For two cameras, you could potentially calibrate each of them with respect to the robot base frame.

For questions about the software we use for calibrating with respect to the robot base frame, @ychao-nvidia might be the best person to ask.

Best,
Ankit

from rvt.

FinnJob commented on June 2, 2024

Thank you for your reply. It helps a lot!


FinnJob commented on June 2, 2024

May I ask what camera you used for your implementation? We are currently using a RealSense camera, but its accuracy degrades when the distance exceeds 50 cm. Do you have any suggestions?


imankgoyal commented on June 2, 2024

We used an Azure Kinect camera. Every camera will have a range where the point cloud is reasonable. We found Azure Kinect to be good for our setup.


FinnJob commented on June 2, 2024

OK. Thank you for your reply.


FinnJob commented on June 2, 2024

When running real-robot experiments with a UR3, we have encountered an issue where commanding the arm to traverse a list of TCP poses sometimes produces unexpected IK solutions. This results in large joint changes that deviate from the motion seen in the simulation environment. Could you share how you addressed or mitigated this problem in your setup? We are looking for more consistent and predictable execution of actions on the real robot, similar to the behavior observed in simulation. Any advice or recommendations would be greatly appreciated. Thank you.


imankgoyal commented on June 2, 2024

Unfortunately, I have not used a UR3, so I am unable to help here. For our experiments we used a Franka arm, and we used Frankapy to move the robot to specific poses.
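As a general note on the joint-jump problem (this is a common mitigation, not something described in this thread): when an IK solver returns multiple candidate joint configurations for a TCP pose, selecting the candidate closest to the current configuration suppresses sudden branch switches. A minimal sketch with made-up candidate solutions for a 6-DoF arm:

```python
import numpy as np

def pick_closest_solution(candidates, q_current):
    """Among candidate joint configurations (each shape (6,) for a 6-DoF arm),
    return the one with the smallest joint-space distance to q_current."""
    candidates = np.asarray(candidates)
    dists = np.linalg.norm(candidates - q_current, axis=1)
    return candidates[np.argmin(dists)]

# Made-up example: two IK branches reaching the same TCP pose.
q_now = np.zeros(6)
branch_a = np.array([0.1, -0.2, 0.3, 0.0, 0.1, 0.0])   # small motion
branch_b = np.array([3.0, -2.8, 2.9, 3.1, -3.0, 3.1])  # large joint flip
chosen = pick_closest_solution([branch_a, branch_b], q_now)
print(chosen)  # selects branch_a, the branch with the smaller joint change
```

Applying this greedily along a list of TCP poses (always measuring distance from the previously chosen configuration) keeps the executed trajectory close to the smooth motion seen in simulation.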


imankgoyal commented on June 2, 2024

Closing because of inactivity.

