
[FEA] Possible improvements for sparse KNN #1187

Closed
viclafargue opened this issue Jan 26, 2023 · 6 comments
Labels
feature request New feature or request

Comments


viclafargue commented Jan 26, 2023

Possible improvements for sparse KNN, some of which might be necessary to implement MNMG sparse KNN:


cjnolet commented Jan 26, 2023

Hey @viclafargue, these improvements look useful. Are you also planning to implement them as part of the sparse KNN update work?


viclafargue commented Jan 26, 2023

Hey @cjnolet, I've already been working on making sparse KNN compile with int64_t, and I've also looked into ways to use different types for input and output indices. However, sparse KNN is quite complex, and I feel the RAFT/KNN experts would be better suited to work on at least the last two points mentioned above. That said, if no one is available I can work on these.
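For context on the input/output index-type question: host-side CSR matrices commonly carry int32 index arrays (SciPy's default when the matrix is small enough), while the dense KNN algorithm returns int64 neighbor indices. A minimal sketch of the mismatch, assuming SciPy's usual index-dtype selection:

```python
import numpy as np
from scipy.sparse import csr_matrix

# SciPy picks int32 for CSR index arrays when sizes fit in 32 bits
X = csr_matrix(np.eye(4, dtype=np.float32))
print(X.indices.dtype, X.indptr.dtype)  # int32 int32

# Dense KNN output indices are int64, so the input index dtype and the
# output neighbor-index dtype can legitimately differ
neighbors = np.empty((4, 3), dtype=np.int64)
```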


cjnolet commented Jan 26, 2023

@viclafargue yes, it would be very helpful if you have cycles to take these on. As you probably know, we are short on resources across the board, but when issues like this arise in cuML it's somewhat expected that the corresponding RAFT pieces will be updated as well.


cjnolet commented Feb 6, 2023

@viclafargue, I'm reviewing your PR and I'm trying to think through why this is necessary:

Provide the possibility to specify different types for indices of sparse matrix inputs, and indices of dense nearest neighbors output

Can you explain the need for this?


viclafargue commented Mar 16, 2023

Sorry @cjnolet, I missed your question. This is actually optional: I was thinking that we could systematically return indices as int64, as the dense algorithm does. But, on reflection, compiling more templates is probably not the best option. In addition, input indices could be int32 or int64, and it gets complicated to create comms for MNMG KNN that send both datatypes. We should probably just convert input indices to int64 in the MNMG KNN Python layer and only allow int64 inputs in the MNMG KNN C++ API.
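The Python-layer promotion described above could be a small helper; the function below is an illustrative sketch (not actual cuML API), operating on raw CSR index arrays with NumPy:

```python
import numpy as np

def promote_csr_indices_to_int64(indptr, indices):
    """Upcast CSR index arrays to int64 so the MNMG KNN C++ layer can
    assume a single input index type; arrays already in int64 are
    returned as-is (astype with copy=False avoids a redundant copy)."""
    return (indptr.astype(np.int64, copy=False),
            indices.astype(np.int64, copy=False))

# Hypothetical 2x4 CSR structure with int32 indices, as SciPy would build it
indptr32 = np.array([0, 2, 4], dtype=np.int32)
indices32 = np.array([0, 3, 1, 2], dtype=np.int32)
indptr64, indices64 = promote_csr_indices_to_int64(indptr32, indices32)
```

With a promotion step like this in the Python layer, the C++ API would only need int64 instantiations, keeping the template count (and compile time) down.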

rapids-bot bot pushed a commit that referenced this issue Jul 26, 2023
Answers #1187

Authors:
  - Victor Lafargue (https://github.com/viclafargue)

Approvers:
  - Tamas Bela Feher (https://github.com/tfeher)

URL: #1640
@viclafargue

Closing this issue as it does not seem to be relevant anymore.
