unpack_yuy2_cuda_helper assert when running d415 color max resolution #7415
Comments
Hi @jeffeDurand I'm not sure that the CUDA assert points to the exact cause of the problem; it may be more of a general "CUDA failed" message. Virtually the same error occurred in the past with a seemingly unrelated issue, where CUDA was not working properly if no internet connection was present. You also may not need to define that full list of CMake flags in your build command, since a number of them are false by default anyway. The link below describes the default state (true or false) of each flag. This likely has no influence on your CUDA problem, but a shorter list is more convenient. https://github.com/IntelRealSense/librealsense/wiki/Build-Configuration As a starting point, may I ask whether you encounter the same problem if you run color and depth at 30 FPS instead of 15, please?
Hi MartyG. Thanks for replying. I already saw that previous issue and didn't find anything useful in it. I'm aware that some CMake flags are already false/true by default, but it helps me to have them defined explicitly so I can turn them on or off without having to dig through the documentation again for their names. I included them here just so we don't lose time on the question "what flags do you pass to CMake". I've already tried with the FPS at 30; same issue. Is there anything else I could try?
May I ask which Jetson board you are using, please (e.g. Xavier)?
Xavier NX
And do you start with 1280x720 color and switch it later to 1920x1080 whilst the program is running, or is color always set to 1920x1080 from the start? |
It starts with the wanted resolution; we never switch resolution at runtime.
I recall a situation on Jetson somewhat similar to this, where the RealSense user had no problems when using 1280x720 for both color and depth, but the program froze when using 1920x1080 for color. In their case, they found that disabling USB auto-suspend in the usbcore parameters solved the problem.
I tried disabling auto-suspend, but it didn't yield results. However, when I run RS2_STREAM_COLOR without RS2_STREAM_DEPTH, the problem disappears. Setting RS2_OPTION_EMITTER_ENABLED to 1 (laser) while running RS2_STREAM_DEPTH also seems to bypass the issue, whereas setting it to 2 (laser auto) or 3 (LED) seems to recreate it.
Good to hear that you had positive results. Regarding the laser settings for the IR emitter: 0 is off, 1 (laser) is the normal on mode, 2 (laser auto) is deprecated and was related to on/off toggling, I believe; and 3 (LED) I have never seen a description of its function.
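For reference, a minimal sketch of toggling the emitter option from C++, assuming an rs2::pipeline named pipe is already streaming (the variable names are placeholders, not code from this thread):

// Hedged sketch: fetch the depth sensor behind the running pipeline and
// switch the IR emitter off (0) or on (1).
rs2::device dev = pipe.get_active_profile().get_device();
rs2::depth_sensor depth = dev.first<rs2::depth_sensor>();
if (depth.supports(RS2_OPTION_EMITTER_ENABLED))
    depth.set_option(RS2_OPTION_EMITTER_ENABLED, 0.f); // 0 = off, 1 = laser on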
I replied too fast yesterday. We still get the issue when the emitter is on, but never when it is turned off. Could it be related to the align block in the CUDA implementation when depth and color are not the same resolution?
Perhaps you could put an error catch into your program that calls hardware_reset() to perform a hardware reset of the camera when the error occurs (a minimal sketch follows below). If the program works fine without CUDA except for the FPS drop, another approach would be to drop CUDA and use the SDK's GLSL processing blocks instead. These perform a similar function to CUDA, offloading work from the CPU to the GPU. They are vendor-neutral, meaning they should work with any GPU brand, though they may not provide a noticeable improvement on low-end devices. There is an example program called rs-gl here: https://github.com/IntelRealSense/librealsense/tree/master/examples/gl
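A minimal sketch of the catch-and-reset idea, assuming the stream settings discussed in this issue; note that a failed C assert() inside the SDK aborts the process, so this only helps for errors that surface as rs2::error exceptions:

#include <librealsense2/rs.hpp>
#include <iostream>

// Hedged sketch: stream color and depth, and hardware-reset the camera if a
// librealsense error is thrown while waiting for frames.
int main() try
{
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_COLOR, 1920, 1080, RS2_FORMAT_RGB8, 15);
    cfg.enable_stream(RS2_STREAM_DEPTH, 1280, 720, RS2_FORMAT_Z16, 15);
    pipe.start(cfg);

    while (true)
    {
        try
        {
            rs2::frameset frames = pipe.wait_for_frames();
            // ... process frames ...
        }
        catch (const rs2::error& e)
        {
            std::cerr << "RealSense error: " << e.what() << " - resetting camera" << std::endl;
            rs2::device dev = pipe.get_active_profile().get_device();
            pipe.stop();
            dev.hardware_reset();
            // The camera re-enumerates after the reset; wait a few seconds,
            // then call pipe.start(cfg) again before resuming.
            break;
        }
    }
    return 0;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return 1;
}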
Thanks MartyG, I will try that. |
Hi @jeffeDurand Do you require further assistance with this case, please? Thanks! |
Hi MartyG, at the moment we have switched back to the CPU version of the SDK with the lowest depth resolution, until I have more time to investigate the assert problem. I've looked at the GLSL processing blocks, but I guess they would not help in this case, since the assert happens while the RS2 SDK is doing the CUDA conversion, not afterwards. My next attempt will be to grab the raw input from the camera and do the conversion using the Jetson's hardware YUV-to-RGB conversion instead.
Thanks very much for the update @jeffeDurand - please do update here when you have some more news. Good luck! |
Hi @jeffeDurand Do you have an update for me about this case please? Thanks! |
We have switched to the CPU version instead of CUDA. We have also used the OpenCV YUV-to-RGB conversion, which is faster than the RS2 implementation or the Jetson hardware conversion. I guess we will stick with this for now.
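For reference, a minimal sketch of such a conversion (an assumption, not the poster's actual code); it presumes the color stream is enabled with RS2_FORMAT_YUYV so the SDK hands over the raw YUY2 buffer instead of converting it itself:

#include <librealsense2/rs.hpp>
#include <opencv2/imgproc.hpp>

// Hedged sketch: wrap the raw YUYV buffer from librealsense in a cv::Mat
// (no copy) and let OpenCV perform the YUY2 -> RGB conversion on the CPU.
cv::Mat yuyv_frame_to_rgb(const rs2::video_frame& f)
{
    cv::Mat yuyv(f.get_height(), f.get_width(), CV_8UC2,
                 const_cast<void*>(f.get_data()), f.get_stride_in_bytes());
    cv::Mat rgb;
    cv::cvtColor(yuyv, rgb, cv::COLOR_YUV2RGB_YUY2);
    return rgb;
}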
Okay, thank you very much for the update! |
Hi @jeffeDurand Do you still require assistance with this case, please? Thanks!
Issue Description
We are using the CUDA implementation of the librealsense library to read frames for our project. When we use the color sensor at the default resolution (1280x720), everything is fine. However, the problem appears when we try to switch to the high-resolution color mode; a sketch of that configuration is shown below.
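A hedged sketch of that configuration (1920x1080 is the D415's maximum color resolution and is confirmed elsewhere in this thread; the depth stream parameters are assumed to match the working configuration shown further down):

cfg.enable_stream(RS2_STREAM_COLOR, 1920, 1080, RS2_FORMAT_RGB8, 15);
cfg.enable_stream(RS2_STREAM_DEPTH, 1280, 720, RS2_FORMAT_Z16, 15);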
We sometimes end up with this assert in the librealsense SDK:
../src/cuda/cuda-conversion.cu:240: void rscuda::unpack_yuy2_cuda_helper(const uint8_t*, uint8_t*, int, rs2_format): Assertion `result == cudaSuccess' failed.
While debugging, I found that the reported CUDA error code was 700 (an out-of-bounds memory access).
It is not a 100% reproducible case on my end when starting from a new instance of my app. It sometimes works at first and everything is fine, but once this error occurs, it fails 100% of the time on every subsequent restart of my app. The interesting part: when I unplug the USB-C cable, it might start to work fine again.
I already call the hardware_reset method on the devices when my application starts, to no avail. I've also tried, without success, an lsusb_reset of the camera before starting the application.
We are not lacking CPU/GPU memory on the device.
I never see this problem while running with:
cfg.enable_stream(RS2_STREAM_COLOR, 1280, 720, RS2_FORMAT_RGB8, 15);
cfg.enable_stream(RS2_STREAM_DEPTH, 1280, 720, RS2_FORMAT_Z16, 15);
Here are my CMake flags:
cmake -GNinja \
-DBUILD_SHARED_LIBS=ON \
-DBUILD_EXAMPLES=OFF \
-DBUILD_UNIT_TESTS=OFF \
-DBUILD_PYTHON_BINDINGS=OFF \
-DBUILD_MATLAB_BINDINGS=OFF \
-DBUILD_OPENNI2_BINDINGS=OFF \
-DBUILD_NODEJS_BINDINGS=OFF \
-DBUILD_CSHARP_BINDINGS=OFF \
-DBUILD_UNITY_BINDINGS=OFF \
-DBUILD_WITH_TM2=OFF \
-DBUILD_WITH_CUDA=ON \
-DBUILD_PYTHON_DOCS=OFF \
-DBUILD_NETWORK_DEVICE=OFF \
-DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc \
-DANDROID_USB_HOST_UVC=OFF \
..
I've also tried running without the CUDA implementation. It runs fine, but our overall FPS drops by about half without CUDA, and we are then below our FPS target.
We are blocked in our development and can't ship our product without using the full resolution of the camera. This is a critical moment for us, and we really need this to be resolved.