Comments (5)
Hello @MengshiLi, that is correct, each LiDAR point cloud is provided in the egovehicle coordinate frame.
We do not provide the corresponding up/down LiDAR sensor from which each LiDAR return originates, since we treat the LiDAR as one 64-beam sensor rather than two separate 32-beam LiDARs. We've already placed all points into a common reference frame to make the data easier for users to work with, and we've motion-compensated the sweeps for vehicle ego-motion.
However, we do provide the LiDAR laser index for each point. Another user has also been working on disambiguating the two LiDAR sweeps; their code can be found here and may be of interest to you.
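Since each return carries a laser index, partitioning a merged sweep back into two halves can be sketched as a mask over that index. This is a hypothetical illustration, not the documented argoverse-api interface: the function name and the 0-63 index convention (0-31 for one 32-beam LiDAR, 32-63 for the other) are assumptions; if the dataset instead stores 0-31 for both sensors, a further disambiguation step would be required.

```python
import numpy as np

def split_by_laser_index(points: np.ndarray, laser_index: np.ndarray):
    """Split a merged sweep into two halves by per-point laser index.

    points: (N, 3) xyz coordinates in the egovehicle frame.
    laser_index: (N,) integer beam index for each return.
    """
    # Assumed convention: indices 0-31 belong to one 32-beam LiDAR,
    # 32-63 to the other.
    mask = laser_index < 32
    return points[mask], points[~mask]
```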
from argoverse-api.
Thank you so much for the quick reply, @johnwlambert. Following up on your clarification: when you combine the two LiDARs' output into a single 64-beam LiDAR, how do you handle the timing difference between them? Do you assume the two LiDARs are perfectly synchronized in time, or is the timing difference between them so small that it can be ignored?
Also, in the paper (http://openaccess.thecvf.com/content_CVPR_2019/html/Chang_Argoverse_3D_Tracking_and_Forecasting_With_Rich_Maps_CVPR_2019_paper.html), you mention supplemental material several times, but we can't find it online. Could you please share a link to it? Thanks in advance!
Hi @MengshiLi
"how do you handle the timing difference between these two Lidars? Do you assume that these two Lidars are perfectly matched in time? Or is the timing difference between them so tiny that it can be ignored?"
As John said, the points from the two LiDARs are motion-compensated separately. Motion compensation is based entirely on the perceived movement of the AV, so if the AV is static, no compensation happens and we do nothing different between the two LiDAR sweeps. If the AV IS moving, then we take into account that the LiDAR moved perhaps 0.5 m between sweeps and adjust the locations of the LiDAR returns accordingly.

In practice, this works quite well for static objects in the world. However, when both the AV and an object are moving, the motion compensation won't help. So you can see cases where a moving object is "smeared" out a bit, because the returns from that object came back 50 ms apart and the object had moved half a meter in that time. It may be advantageous to process the two LiDARs independently in some cases.

This situation isn't really unique to having two LiDARs, by the way. The entire notion of a LiDAR "frame" is artificial, because the points are being acquired continuously. Even with a single LiDAR, you can get a frame that starts and stops on the same object, and the same "smearing" is observed at the "seam". So even with a single LiDAR you might want to reason about finer time granularities than "frames".
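The adjustment described above amounts to a pair of rigid transforms: map points from the egovehicle frame at their capture time into the city frame using the ego-pose at that instant, then into the egovehicle frame at the reference timestamp. A minimal numpy sketch, assuming ego-poses are given as rotation matrices and translations (egovehicle-to-city); the function and argument names are illustrative, not argoverse-api calls:

```python
import numpy as np

def motion_compensate(points_ego_t, R_t, t_t, R_ref, t_ref):
    """Move points from the egovehicle frame at capture time t into the
    egovehicle frame at a reference time.

    points_ego_t: (N, 3) points in the egovehicle frame at capture time.
    (R_t, t_t):     egovehicle->city pose at the capture time.
    (R_ref, t_ref): egovehicle->city pose at the reference time.
    """
    points_city = points_ego_t @ R_t.T + t_t  # ego(t) -> city
    return (points_city - t_ref) @ R_ref      # city -> ego(ref)
```

If the AV is static, the two poses coincide and the points come back unchanged, matching the behavior described above.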
Regarding supplemental material, CVPR does host that. It's on the page you linked to :). See the "Supp" link at the very bottom.
Hi @James-Hays
Thanks so much for the detailed explanation. So, with a small enough time granularity, it is reasonable to align the two LiDARs' returns that arrive at approximately the same time and store them in a combined file named by the nearest timestamp. Is this understanding correct?
I don't think there's a simple answer to your question. For some applications, such as deep LiDAR object detection, I would imagine the merged, 10 Hz point clouds are sufficient. If you are instead trying to recover very precise shape models for dynamic objects, then I think you want to maintain an understanding of the time at which each LiDAR point was detected.
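Maintaining per-point times can be sketched as interpolating the ego-pose for each return rather than using one pose per frame. A toy example, interpolating only the translation between two bracketing poses (names are hypothetical; rotation interpolation via slerp is omitted for brevity):

```python
import numpy as np

def interp_ego_translation(t0, t1, trans0, trans1, point_times):
    """Linearly interpolate the ego translation at each point's timestamp.

    t0, t1: bracketing pose timestamps (seconds), t0 <= point_times <= t1.
    trans0, trans1: (3,) ego translations at t0 and t1.
    point_times: (N,) per-point capture timestamps.
    Returns an (N, 3) translation, one per point.
    """
    alpha = (point_times - t0) / (t1 - t0)
    return trans0[None, :] + alpha[:, None] * (trans1 - trans0)[None, :]
```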