
Memory error with numpy's linear algebra with very large datasets #64

Closed · SimonMolinsky opened this issue Dec 31, 2020 · 2 comments

Labels: bug (Something isn't working) · help needed (Let's work on this together!)

Comments

@SimonMolinsky
Member

If the dataset has more than 10,000 points, the semivariogram calculation crashes with a memory error.
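The crash is consistent with allocating the full pairwise distance matrix: for n points that matrix holds n² float64 values, so 10,000 points already need roughly 800 MB for distances alone, before squared value differences are stored. A minimal sketch of a workaround (not the package's actual implementation; the function name, lag definition, and chunking scheme here are illustrative assumptions) is to compute the experimental semivariogram in row blocks, so only a `chunk_size × n` slab is in memory at a time:

```python
import numpy as np

def semivariogram_chunked(points, values, lags, step, chunk_size=1000):
    """Experimental semivariogram computed in row chunks.

    Avoids allocating the full n x n distance matrix: only a
    chunk_size x n block exists in memory at any time.
    """
    n = len(points)
    sums = np.zeros(len(lags))
    counts = np.zeros(len(lags))
    for start in range(0, n, chunk_size):
        block = points[start:start + chunk_size]
        # pairwise distances between this block and all points
        d = np.linalg.norm(block[:, None, :] - points[None, :, :], axis=-1)
        # squared differences of the measured values for the same pairs
        sq = (values[start:start + chunk_size, None] - values[None, :]) ** 2
        for i, lag in enumerate(lags):
            mask = (d >= lag) & (d < lag + step)
            sums[i] += sq[mask].sum()
            counts[i] += mask.sum()
    # semivariance per lag; lags with no pairs yield NaN
    with np.errstate(invalid="ignore", divide="ignore"):
        return sums / (2 * counts)
```

The result is independent of `chunk_size`, so the block size can be tuned to the available RAM without changing the output.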

@SimonMolinsky SimonMolinsky added the bug Something isn't working label Dec 31, 2020
@SimonMolinsky SimonMolinsky self-assigned this Dec 31, 2020
@SimonMolinsky SimonMolinsky changed the title Memoty error with numpy's linear algebra with very large datasets Memory error with numpy's linear algebra with very large datasets Dec 31, 2020
@SimonMolinsky
Member Author

The problem is too complex to solve without large changes to the package structure. It remains open and will be reviewed in the future. If you, dear reader, are an expert in computational algebra and memory management, we need you!

@SimonMolinsky SimonMolinsky added the help needed Let's work on this together! label Jan 2, 2021
@SimonMolinsky
Member Author

OK, I am closing this issue for now, but if it occurs again at some point in the future, I will rewrite more chunks of the code with Dask.
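For reference, one possible shape of such a Dask rewrite (a sketch under assumed data, not the package's code) is to wrap the coordinates in a blocked `dask.array`, so the pairwise distance matrix is only ever materialized block by block and reductions over it stay within a bounded memory footprint:

```python
import numpy as np
import dask.array as da

# Hypothetical coordinates; in practice these come from the dataset.
rng = np.random.default_rng(0)
coords = rng.random((5000, 2))

# Blocked array: the 5000 x 5000 distance matrix is never held whole,
# only 1000 x 1000 blocks exist in memory during the reduction.
d = da.from_array(coords, chunks=(1000, 2))
diff = d[:, None, :] - d[None, :, :]        # lazy (5000, 5000, 2)
dist = da.sqrt((diff ** 2).sum(axis=-1))    # lazy pairwise distances
mean_distance = dist.mean().compute()       # blockwise reduction
```

Binned reductions for the semivariogram lags could be expressed the same way, with masked sums replacing the mean.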
