Loading massive (10GB) ply files #2145
Comments
Thanks for submitting. This is gonna be fun to debug 😅.
OK, the first thing to try is to compile the current version of master and see if you have the same problem with it. I assume it is going to be the same thing, but we need to rule that out early enough.
I can confirm that it still fails in the same way after building pcl from master. I will continue to follow any further instructions with this build. |
What is the approximate number of points stored in this file?
From the header:
And what is the type of the point cloud you are reading into? If it is more than 23 bytes, then this overflows: Line 98 in c1e395b
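To illustrate the overflow mode being described, here is a minimal, self-contained sketch with hypothetical numbers (the real fields live in pcl::PCLPointCloud2; 48 bytes is just a plausible stride for a normal-bearing point):

```cpp
#include <cstdint>
#include <cstdio>

int main ()
{
  // Hypothetical values for a ~10 GB cloud.
  std::uint32_t point_step = 48;        // bytes per point (illustrative)
  std::uint32_t width = 200000000;      // ~200M points (illustrative)
  std::uint32_t height = 1;

  // All operands are 32-bit, so the product is computed modulo 2^32
  // and silently wraps: ~9.6e9 becomes ~1.01e9 here.
  std::uint32_t wrapped = point_step * width * height;

  // Widening one operand first forces the whole product into 64-bit
  // arithmetic and gives the intended size.
  std::uint64_t correct =
      static_cast<std::uint64_t> (point_step) * width * height;

  std::printf ("wrapped: %u\ncorrect: %llu\n",
               static_cast<unsigned> (wrapped),
               static_cast<unsigned long long> (correct));
}
```

In this example the buffer ends up roughly 9x too small, so later writes into it run past the end, which matches the reported segfault.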
By type, do you mean pcl::PointNormal, or the type of the numbers in the point cloud? Here's the rest of the header:
I mean the point type, e.g. pcl::PointNormal.
Actually, having a closer look at what is going on, the data is first read into the pcl::PCLPointCloud2 structure.
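For readers unfamiliar with the reader's internals, here is a sketch of that two-step flow, assuming the templated overload (the file path and point type are placeholders):

```cpp
#include <pcl/PCLPointCloud2.h>
#include <pcl/conversions.h>
#include <pcl/io/ply_io.h>
#include <pcl/point_types.h>

int main ()
{
  // Step 1: the PLY parser fills an untyped PCLPointCloud2 blob; this is
  // where blob.data gets resized (and where the overflow bites).
  pcl::PCLPointCloud2 blob;
  if (pcl::io::loadPLYFile ("huge_cloud.ply", blob) < 0)
    return 1;

  // Step 2: the blob is converted to the requested typed cloud.
  pcl::PointCloud<pcl::PointNormal> cloud;
  pcl::fromPCLPointCloud2 (blob, cloud);
}
```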
I think this confirms:
I apologize if I misunderstood your directive. I'm new to C++ and gdb in general. I'm still trying to get the debug print near that line.
Just print the result of the multiplication point_step * height * width to the terminal, i.e. the same expression that is used to resize the data vector.
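For anyone following along, one way to do this from gdb once stopped near the resize call (member names follow the resize statement quoted below; this is a sketch, not verbatim session output):

```
(gdb) print cloud_->point_step * cloud_->width * cloud_->height
(gdb) print (unsigned long long) cloud_->point_step * cloud_->width * cloud_->height
```

If the two printed values differ, the 32-bit product has wrapped.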
Can you also print point_step and cloud width?
So the problem is indeed here. We need to change the allocation to `cloud_->data.resize (static_cast<uint64_t>(cloud_->point_step) * cloud_->width * cloud_->height);`
It did load correctly! I ran into other problems, but those don't seem to be related to this issue. I will implement the change as a patch in my own software, but will this be added to master as well?
The idea was for you to submit a PR here so we can commit the change. @taketwo, shouldn't we cast to size_t?
I can do that. Let me know which type is best to use. I'll test both. |
Yes, `size_t`.
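To make the agreed change concrete, a before/after sketch of the resize line (the "before" is the overflow pattern discussed above; the member layout is pcl::PCLPointCloud2's):

```cpp
// Before: all three operands are 32-bit, so the product is computed
// modulo 2^32 and the vector is resized far too small for clouds > 4 GiB.
cloud_->data.resize (cloud_->point_step * cloud_->width * cloud_->height);

// After: widening the first operand to size_t forces the whole product
// into 64-bit arithmetic (on 64-bit platforms).
cloud_->data.resize (static_cast<std::size_t> (cloud_->point_step) *
                     cloud_->width * cloud_->height);
```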
This change is not working for me. All of the points I read into memory are zero-valued, which means the actual parsing is being skipped somewhere because of the size.
Your Environment
Expected Behavior
pcl::io::loadPLYFile should be able to handle large files without crashing.
Current Behavior
While trying to load a 10ish GB .ply point cloud using the following:
I get a seg fault. Here's the gdb output:
You can find the whole code source here: https://github.com/OpenDroneMap/OpenDroneMap/blob/master/modules/odm_meshing/src/OdmMeshing.cpp#L219
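A minimal stand-in for the relevant call (hypothetical path, simplified relative to the linked source), assuming a pcl::PointNormal cloud as discussed in the comments above:

```cpp
#include <pcl/io/ply_io.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

int main ()
{
  // Loading a ~10 GB binary PLY; the crash happens inside the PLY reader.
  pcl::PointCloud<pcl::PointNormal>::Ptr cloud (new pcl::PointCloud<pcl::PointNormal>);
  if (pcl::io::loadPLYFile<pcl::PointNormal> ("huge_cloud.ply", *cloud) < 0)
    return 1;
  return 0;
}
```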
Possible Solution
I'm not exactly sure what the problem is; hoping y'all more experienced PCL devs can help.
Code to Reproduce
You can find the isolated project here: https://github.com/dakotabenjamin/odm_meshing and follow the build instructions. I can share the large dataset privately.
Context
We are trying to scale up our photogrammetry software to be able to handle very large (5000+ image) datasets.