Comments (5)
For stereoBM, maxDisparity=64; the other parameters are the same as OpenCV's defaults. No speckle filter is used.
Yes, the output grayscale images are already calibrated. That's why we don't provide distortion coefficients.
from guidance-sdk.
The default parameters for OpenCV's StereoBM, via StereoBM::create, are numDisparities=0 and blockSize=21.
maxDisparity = minDisparity + numDisparities
correct?
By default the blockSize is much larger than what the Guidance system outputs. I had to tune the blockSize down to 9 to get it to resemble the Guidance depth output.
Ptr<StereoBM> sbm = StereoBM::create(64, 9);
sbm->compute(left, right, depth);
Also, the output of StereoBM is a disparity map, so its colors should be the inverse of the depth map's. To generate the depth map from the disparity map I need to perform the following calculation:
depth = baseline * focal_length / disparity
depth should be an int16, so what units should baseline and focal_length be in for this calculation? And what are the units of the depth image (mm?).
If the units are in mm, will my equation look like this (247.35 mm focal length and 15 cm baseline)?
depth = 150 * 247 / disparity
Anyways, here is the depth image I generated with the following code:
Ptr<StereoBM> sbm = StereoBM::create(64, 9);
sbm->compute(g_greyscale_image_left, g_greyscale_image_right, g_depth);
g_depth = (247.35 * 150)/g_depth;
g_depth.convertTo(depth8, CV_8UC1);
imshow(frame_id[CAMERA_PAIR_INDEX] + " depth opencv", depth8);
Seems like the scaling is just off a bit.
from guidance-sdk.
When I just compare the disparity maps, the images look almost exactly the same. Now the issue is: how is the Guidance core computing the depth map from the disparity map?
Also, when I pull the disparity (generated either by OpenCV or by the Guidance core) and the depth from OpenCV (calculated as depth = 150 * 247 / disparity) from the published topic, the image is completely blown out white. This isn't the case with the depth generated by the Guidance core.
The following images were published by guidanceNode and then pulled from the topic via guidanceNodeTest:
These depth images were generated by guidance core
These depth images were generated by opencv
from guidance-sdk.
Just an update.
I'm an idiot and completely forgot about the fractional bits. Both the disparity map generated by the Guidance core and the one from OpenCV's StereoBM are 16-bit with 4 fractional bits. Thus, one has to divide the raw disparity data by 16 to convert it to floating point.
Also, the units of the depth map are meters, not mm.
Now, according to the Guidance documentation, the depth image also has fractional bits, but 7 of them. Thus, to match the Guidance core's depth image output, you have to multiply the floating-point depth image by 128.
For anyone using ROS, the standard for depth images is floating-point meters. Thus, you'll have to divide the Guidance core's depth by 128 first to convert it to floating point.
from guidance-sdk.
Thanks @uwleahcim for the explanations. Yes, Guidance has 7-bit fractions. We have given an example of displaying the real depth value at the center of the image in usb_example/main.cpp.
from guidance-sdk.