Running the NLP models requires many more dependencies, in addition to downloading the models (both are gigabytes in size) and longer loading times.
Based on the analysis performed, there should be two install options to make packaging and installation easier:

- `default`
- `deep`

The `default` option does issue detection and issue type classification with minimal dependencies (only scancode-toolkit dependencies) and is faster. The `deep` option additionally uses the NLP models. Also, GPU computation should be enabled based on whether the required libraries (CUDA) are installed and a GPU is available.
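The two install options could be expressed as a packaging extra. A minimal sketch of a `setup.cfg`, assuming a setuptools-based project; the package name and the dependency lists are illustrative, not the project's actual requirements:

```ini
# Hypothetical setup.cfg sketch -- names and dependencies are
# placeholders, not the project's real dependency list.
[metadata]
name = scancode-analyzer

[options]
# Default install: only what issue detection and issue type
# classification need (scancode-toolkit dependencies).
install_requires =
    scancode-toolkit

[options.extras_require]
# `pip install scancode-analyzer[deep]` pulls in the heavy NLP
# stack on top of the default install.
deep =
    torch
    transformers
```

With this layout, `pip install scancode-analyzer` gives the fast default install, and `pip install scancode-analyzer[deep]` opts into the NLP models.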
Related - #21
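The GPU check at runtime could look like the sketch below. This is a minimal illustration, assuming the `deep` install brings in PyTorch; `select_device` is a hypothetical helper, not part of the project:

```python
def select_device():
    """Pick 'cuda' when PyTorch is installed and a CUDA GPU is
    available; otherwise fall back to 'cpu'."""
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        # torch is absent under the default install: stay on CPU.
        pass
    return "cpu"

print(select_device())
```

Because the check degrades gracefully, the same code path works for both the `default` install (no torch, CPU only) and the `deep` install with or without a GPU.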