Conversion between disparity map and depth map #7431

Closed

anguyen216 opened this issue Sep 28, 2020 · 7 comments

@anguyen216


Required Info
Camera Model: { D400 }
Operating System & Version: { Win (8.1/10) }
Kernel Version (Linux Only): (e.g. 4.14.13)
SDK Version: { legacy / 2.. }
Language: { python3 }

Issue Description

How do I convert a frame's depth map (extracted using the following code) to a disparity map? In particular, how do I do this when I only have access to the depth frame data and not the composite depth frame object?

import pyrealsense2 as rs
import numpy as np
# get depth map from a started pipeline
pipe = rs.pipeline()
pipe.start()
frame = pipe.wait_for_frames()
depth = frame.get_depth_frame()
depth_data = np.asanyarray(depth.get_data())

I followed a tutorial to transform the depth frame using the disparity transform and got the following values (first matrix in the image below). What is the meaning of these values? The tutorial mentioned casting the values into floats, but that doesn't make much sense to me. Can somebody elaborate on the casting part?

dp_filter = rs.disparity_transform()
dp = dp_filter.process(depth)
dp_data = np.asanyarray(dp.get_data())

Also, the documentation on the D400 camera mentions that the depth information is in meters. However, the values in the depth matrix are all over 1000 (see the second matrix in the image below). I don't suppose it is really over 1000 meters. How do I get the depth values in meters?

[Image: screenshot showing the disparity matrix (first) and the raw depth matrix (second)]

@MartyG-RealSense (Collaborator) commented Sep 28, 2020

Hi @anguyen216 Depth values displayed as a large 16-bit uint value (in a range between 0 and 65535) can be converted to meters by multiplying the value by the depth unit scale. You can retrieve the current depth unit scale from the camera, though it is easier just to multiply by 0.001 (the default depth unit scale of the 400 Series cameras). For example, a uint value of 65535 becomes 65.535 meters when multiplied by 0.001. This is discussed further in this link:

#3508
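A minimal sketch of that conversion, assuming a started pipeline (the variable names below are illustrative, not from the original post):

import pyrealsense2 as rs
import numpy as np

pipe = rs.pipeline()
profile = pipe.start()

# query the device's depth unit scale rather than hard-coding 0.001
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()

frames = pipe.wait_for_frames()
depth_raw = np.asanyarray(frames.get_depth_frame().get_data())  # uint16, unitless
depth_m = depth_raw.astype(np.float32) * depth_scale            # depth in meters

pipe.stop()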

Regarding displaying a disparity map with Python, the tutorial in the link below for doing so with Python and OpenCV may be a useful alternate reference to the guide that you already found:

https://docs.opencv.org/3.4/dd/d53/tutorial_py_depthmap.html
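As a hedged illustration of that OpenCV approach on a D400 (the stream indices, resolution, and block-matching parameters below are assumptions, not taken from the tutorial):

import numpy as np
import cv2
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.infrared, 1, 640, 480, rs.format.y8, 30)  # left IR
cfg.enable_stream(rs.stream.infrared, 2, 640, 480, rs.format.y8, 30)  # right IR
pipe.start(cfg)

frames = pipe.wait_for_frames()
left = np.asanyarray(frames.get_infrared_frame(1).get_data())
right = np.asanyarray(frames.get_infrared_frame(2).get_data())

# block-matching disparity, as in the OpenCV tutorial
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)  # fixed-point result, scaled by 16

pipe.stop()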

The subject of disparity map generation is also discussed in detail at this link:

https://community.intel.com/t5/Items-with-no-label/Issues-with-custom-disparity-generation/td-p/625261

I will refer your question about casting values into floats to @dorodnic, as he was the RealSense team member who discussed it originally:

#1778 (comment)

@ev-mp (Collaborator) commented Sep 29, 2020

@anguyen216, to clarify:
1.

I followed a tutorial to transform the depth frame using the disparity transform and got the following values (first matrix in the image below). What is the meaning of these values?

dp_filter = rs.disparity_transform()
dp = dp_filter.process(depth)
dp_data = np.asanyarray(dp.get_data())

The values obtained with disparity_transform are in RS2_FORMAT_DISPARITY32 format, with the Depth<=>Disparity conversion defined as:
depth = (baseline * focal_length * 32) / disparity (all units in meters).
You may think of them as the actual disparity multiplied by a constant.
The values are floats, as expected from the above equation, but because the actual numbers are huge, the number of post-decimal digits is expected to be zero or close to it.
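To make the relation concrete, here is a small sketch applying the formula above in both directions (the baseline and focal-length arguments are placeholders; in practice they come from the camera's calibration):

import numpy as np

def depth_to_disparity(depth_m, baseline_m, focal_length_px):
    # disparity = (baseline * focal_length * 32) / depth, per the formula above
    disparity = np.zeros_like(depth_m, dtype=np.float32)
    valid = depth_m > 0  # skip holes to avoid division by zero
    disparity[valid] = (baseline_m * focal_length_px * 32.0) / depth_m[valid]
    return disparity

def disparity_to_depth(disparity, baseline_m, focal_length_px):
    depth_m = np.zeros_like(disparity, dtype=np.float32)
    valid = disparity > 0
    depth_m[valid] = (baseline_m * focal_length_px * 32.0) / disparity[valid]
    return depth_m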

Duplicate of #3344, #2139

Also, the documentation on the D400 camera mentions that the depth information is in meters. However, the values in the depth matrix are all over 1000 (see the second matrix in the image below). I don't suppose it is really over 1000 meters. How do I get the depth values in meters?

The depth information that is retrieved using the Librealsense API is in meters, e.g. frame.get_distance(x, y), rs2_deproject_pixel_to_point(...).
However, the "raw" depth frame is unitless according to the established Z16 frame format and, as mentioned by @MartyG-RealSense, must be converted to actual depth (in meters) by multiplying each "raw" value by the depth units value.

@MartyG-RealSense (Collaborator)

Hi @anguyen216 Do you require further assistance with this case, please? Thanks!

@anguyen216 (Author)

@MartyG-RealSense @ev-mp Thank you both for helping.
I'm still exploring this issue. Please leave the thread open for a few more days; I'll try to update as soon as I can. Thank you!

@anguyen216 (Author)

@MartyG-RealSense Thank you for all the help! I have resolved all my issues so far.

@MartyG-RealSense (Collaborator)

Great news - thanks very much @anguyen216 for the update!

@fenil25 commented Nov 22, 2020

[quotes @ev-mp's reply above in full]

What is the value of this constant that the disparity is multiplied with?
