Comments (19)

alexwhittemore commented on July 29, 2024

As I'm sitting here trying the setup with the IMU filter enabled, I realize the reason we hadn't tried this before: the filter launch stomps on our URDF and makes "imu"'s parent link "odom" instead of "base_link", as described in our URDF. I'm trying to figure out how to rectify that now.

EDIT FYI: the solution here is to set the "publish_tf" parameter to "false" in the ImuFilterNodelet section of your imu launch file.
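
For reference, a minimal sketch of that change (the nodelet manager name "imu_manager" is an assumption - match it to whatever your launch file actually uses):

<node pkg="nodelet" type="nodelet" name="ImuFilterNodelet"
      args="load imu_filter_madgwick/ImuFilterNodelet imu_manager"
      output="screen">
  <!-- don't let the filter publish its own odom->imu transform,
       which would fight the base_link->imu transform from the URDF -->
  <param name="publish_tf" value="false" />
</node>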


mintar commented on July 29, 2024

No, everything in this repo uses right-handed coordinates.

I'm assuming you're talking about the phidgets_imu package. Given the way the Phidgets Spatial 3/3/3 IMU reports its sensor values, you either need to turn the sensor on its head (i.e., with the chips on the board facing downwards) or specify an upside-down orientation in the URDF to get sensible values.

If you can give me some more details, I can probably better help you.


alexwhittemore commented on July 29, 2024

Thanks for the response! I'm working with Ryan on the project that uses the 3/3/3. It might take a bit more effort to accurately characterize what we found, but here's the gist. What I certainly KNOW to be true is that our robot functions perfectly right now, having negated the sign of the y acceleration axis (or rather, un-negated it, on line 203 of imu_ros_i.cpp). When I tried simply altering the physical orientation of the IMU, or compensating for the physical orientation by changing the URDF, we could always get the angular rates to work, but then the robot was oriented upside down (and thus flipping it right-side up with a URDF correction would reverse one of the rotation axes).

The IMU is oriented on the robot such that the USB jack faces back and the top of the board faces right (-x and -y in ROS coordinates). The Phidgets Y axis is therefore aligned with the ROS Z axis, hence negating Y and not Z. The relevant transform in our URDF is

<parent link="base_link" />
<child link="imu" />
<origin xyz="0.017563 -0.05386 0.081079" rpy="1.570796 -1.570796 3.141593" />

I guess the ultimate question here is: if the IMU's (linear accelerometer) coordinate system coming out of the driver is right-handed, why does Cartographer fail, and why does negating one linear axis, with an otherwise proper URDF definition, fix the problem? (Oh, yeah - this IMU is feeding a cartographer_ros-based mapping system.)

I don't mean that to sound accusatory or anything - rather, I don't have a lot of hands-on, low-level experience developing with IMUs, so it's entirely plausible that there's something simple I'm just missing. But given how little work I otherwise need to do to fit these pieces together, I'm not sure what that might be. If I'd written my own handler or filter or something like that, there'd of course be a whole lot more room for user error on my part!


mintar commented on July 29, 2024

On one of our robots, "Calvin", the IMU is oriented such that the chips on the board are facing up. I don't remember which way the USB port faces. Anyhow, the orientation wrt. base_footprint that we use is rpy="${M_PI} 0 ${M_PI/2}":

https://github.com/uos/calvin_robot/blob/93b506a10c6fb8000517ebd09c9f7041e2ab9cbf/calvin_description/robots/calvin.urdf.xacro#L72
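
Paraphrased, the mounting joint looks something like this (the joint name and xyz offset are placeholders, not copied from the linked file):

<joint name="imu_joint" type="fixed">
  <parent link="base_footprint" />
  <child link="imu" />
  <!-- flipped upside down (roll = pi), then yawed a quarter turn -->
  <origin xyz="0 0 0" rpy="${M_PI} 0 ${M_PI/2}" />
</joint>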

We feed the data from phidgets_imu to imu_filter_madgwick, and from there to robot_pose_ekf. With the orientation above, everything works as expected. When we use a neutral orientation (`rpy="0 0 0"`) in the URDF, the robot appears upside down.

https://github.com/uos/calvin_robot/blob/93b506a10c6fb8000517ebd09c9f7041e2ab9cbf/calvin_bringup/launch/imu.launch

That's why I'm relatively confident that things should work. I'm not ruling out that you've found a bug, but then it must be canceling out with some other bug, because everything works in our case. Could you try again, bearing in mind that "up" (from the software's point of view) is the side of the IMU without the chips?


alexwhittemore commented on July 29, 2024

I haven't tested this yet - it's on my to-do list, but at a lower priority. From what I recall of trying it before, the outcome is that, yes, the robot comes into the world right-side up, but once it has rotated about the X axis, the Z angular rate flips sign and the robot rotates the wrong direction about that axis. I'll double-check this to be quite certain, but like I said, it'll probably be a few days.

What I wonder is whether the madgwick filter package intelligently corrects for this, and whether that's why it's flown under the radar.


mintar commented on July 29, 2024

Just to be clear: you're feeding the output of the phidgets_imu package directly to cartographer? I don't think you should do that. For one, the orientation field of the IMU message isn't filled in. Also, the raw output of an IMU is quite noisy, so it's expected that the acceleration suddenly flips. It's the job of a filter like imu_filter_madgwick to reduce that noise and provide an orientation estimate.


alexwhittemore commented on July 29, 2024

That's right. I believe Cartographer explicitly ignores the orientation field, since it uses scan matching from the LIDAR, along with the IMU data, to determine orientation. I'll probably experiment today with giving it the post-filter topic, but unless the filter also modifies the raw acceleration data, I expect identical performance.


mintar commented on July 29, 2024

Came here to tell you about publish_tf = false. Yes, that's what you should do.

"unless the filter also modifies the raw acceleration data, I expect identical performance."

It doesn't modify the raw data; it only adds the orientation.


mintar commented on July 29, 2024

Maybe it's still a worthwhile exercise to set up the IMU the way we did: put it into the URDF, run imu_filter_madgwick to add orientation, run robot_pose_ekf to fuse IMU and odometry data. Then drive the robot around and visualize it in the "odom" frame in RViz. If that all works, you know that you've set up your transforms etc. correctly, and then you can feed the stuff to cartographer.
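
A rough sketch of that pipeline as a launch file (manager name, topic remappings, and parameter values are assumptions you'd adapt to your robot):

<launch>
  <node pkg="nodelet" type="nodelet" name="imu_manager" args="manager" output="screen" />

  <!-- raw accelerometer/gyro/compass readings, published on imu/data_raw -->
  <node pkg="nodelet" type="nodelet" name="PhidgetsImuNodelet"
        args="load phidgets_imu/PhidgetsImuNodelet imu_manager" />

  <!-- adds the orientation estimate, republishes on imu/data -->
  <node pkg="nodelet" type="nodelet" name="ImuFilterNodelet"
        args="load imu_filter_madgwick/ImuFilterNodelet imu_manager">
    <param name="publish_tf" value="false" />
  </node>

  <!-- fuses the IMU orientation with wheel odometry -->
  <node pkg="robot_pose_ekf" type="robot_pose_ekf" name="robot_pose_ekf">
    <param name="imu_used" value="true" />
    <param name="odom_used" value="true" />
    <param name="vo_used" value="false" />
    <remap from="imu_data" to="imu/data" />
  </node>
</launch>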


alexwhittemore commented on July 29, 2024

That doesn't directly apply to our "robot," but I think what I've done to this point is functionally equivalent in terms of the data produced. This robot's only purpose is to map an area in 3D, so we're much more interested in Cartographer's mapping capabilities than its localization capabilities - the opposite of most robots, I realize, but necessary either way for a scan head that's moving through large spaces. So the mobility platform here is a person, walking, holding the "robot" in their hand.

So on the one hand, I HAVE enabled imu_filter_madgwick, and confirmed that Cartographer doesn't care - it ignores the orientation data entirely and only processes the raw input. Fusing odometry data is irrelevant in the sense that Cartographer itself would be providing the odometry output.

On the other hand, what I CAN confirm is that, after getting that far, everything does actually work - the robot moves inside the 'map' frame to match angular and translational movements in reality, as you would expect. And without the sign inversion, it fails miserably (with either linear acceleration off in one axis or rotation off in one axis, depending on how I've oriented the IMU in the URDF vs. reality). The final thing that leads me to suspect that imu_filter_madgwick is covering for left-handed source coordinates is that the Phidgets manual itself, at https://www.phidgets.com/?tier=3&catid=10&pcid=8&prodid=32#Orientation, describes the expected linear acceleration measurements, which, as best I can tell, are left-handed. And the raw output from phidgets_imu matches those expectations.

Assuming I'm right about all of that, and that madgwick is covering it up in most applications, I still don't actually know whether it's a bug or a feature. Spitting out exactly what the manual says to expect seems like a reasonable call to me. But then, how should a user expect to use the raw data in an application such as mine, and what am I missing there?

I wouldn't submit my sign change as a patch unless it's significantly more correct for all the other use cases. But it seems to me that it should be, no?

EDIT: FWIW, I'm testing a different IMU in ROS today, so I suppose it'll be informative to see how that HW/SW stack behaves.


mintar commented on July 29, 2024

I've dug out our PhidgetSpatial 1044 3/3/3 IMU, and I've done some tests myself. You can see the results here:

https://vimeo.com/238773722

The tests show that the IMU is indeed publishing a right-handed coordinate system, and that linear acceleration and angular velocity are published correctly. Does your IMU behave differently? If so, could you record a video / rosbag of it?


alexwhittemore commented on July 29, 2024

Thank you for taking the time to look! In the interest of minimizing confusion, I'm going to wait to comment at all until I can run the same test with my IMU here, but that'll have to wait until next week.


mintar commented on July 29, 2024

Sure! When you do, please open a new comment (I don't get notifications on edits).


alexwhittemore commented on July 29, 2024

@mintar I'm sorry I never followed up on this; the precise issue at hand isn't a current priority (though we may have to revisit it). But I'm curious: when using this IMU, what do you specify as the origin? It seems to me that the board itself doesn't specify one, so I wonder whether the extrinsics between the different sensor axes are handled at the ROS driver level, or what?


mintar commented on July 29, 2024

Sorry for not responding earlier. I don't quite understand your question. Could you elaborate?


mintar commented on July 29, 2024

Closing due to inactivity.


doisyg commented on July 29, 2024

Same question as @alexwhittemore: I cannot find information about where the origin of the measurement frame is relative to the board.
Does anybody here know? What are you using for the imu frame - the center of the board?


mintar commented on July 29, 2024

I cannot find this information either; Phidgets' documentation doesn't say (or I didn't find it). This ROS driver simply publishes the accelerometer/gyro/compass readings as reported by the device. You'll have to specify the imu frame yourself via TF (for example, using a URDF).

My guess is that the device doesn't do any transformations, so the imu frame is somewhere on the accelerometer chip. However, since the device is so small, I don't think the exact position of the imu frame matters so much. If you place it in the center of the board, the distance to the true center will be much less than 2.5 cm. Also, note that this only matters for the accelerometer anyway; the compass and gyro are unaffected by a fixed translation error.
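
To make that last point concrete: on a rigid body the gyro measures the same angular velocity everywhere, and the magnetic field is essentially uniform over a few centimeters, but an accelerometer offset by r from the declared frame picks up extra lever-arm terms:

\[ a_{\text{meas}} = a_{\text{frame}} + \omega \times (\omega \times r) + \dot{\omega} \times r \]

With |r| < 2.5 cm, even a brisk 2 rad/s turn contributes only about (2 rad/s)^2 x 0.025 m = 0.1 m/s^2 of centripetal error - small next to the 9.8 m/s^2 gravity vector the orientation filter keys on.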


doisyg commented on July 29, 2024

Without more accurate information, I'm also using the center of the board. As you say, the error is at most 2.5 cm and only matters for the accelerometers - not such a big deal.

