I have a very big dataset. When I execute: bin/lightlda -num_vocabs 200000 -num_topics 2000 -num_iterations 100 -alpha 0.1 -beta 0.01 -mh_steps 2 -num_local_workers 6 -num_blocks 1 -max_num_document 15000000 -input_dir /mnt/data -data_capacity 192000
I get this error: Bad Alloc caught: failed memory allocation for documents_buffer in DataBlock.
I think it is because my server doesn't have enough memory. How can I deal with it?
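For context on the numbers involved: if -data_capacity is interpreted in megabytes (an assumption based on the flag name; check the LightLDA README for the exact unit), then 192000 asks for a single contiguous documents buffer of roughly 188 GB, which would exhaust RAM on most servers and produce exactly this kind of std::bad_alloc. The following is a minimal C++ sketch of that failing allocation, not LightLDA's actual DataBlock code:

```cpp
// Minimal sketch (NOT LightLDA's actual DataBlock source) of the kind of
// allocation that fails here. Assumption: -data_capacity is given in MB,
// so 192000 requests one contiguous buffer of roughly 188 GB.
#include <cstdint>
#include <cstddef>
#include <iostream>
#include <memory>
#include <new>

int main() {
    const std::size_t data_capacity_mb = 192000;               // value passed on the command line
    const std::size_t bytes = data_capacity_mb * 1024 * 1024;  // ~188 GB if the unit is MB

    std::cout << "Requested documents_buffer size: "
              << bytes / (1024.0 * 1024.0 * 1024.0) << " GB" << std::endl;

    try {
        // One contiguous allocation, similar in spirit to the documents_buffer
        // named in the error. If the machine (or its overcommit policy) cannot
        // provide this much memory, operator new throws std::bad_alloc.
        std::unique_ptr<int32_t[]> documents_buffer(
            new int32_t[bytes / sizeof(int32_t)]);
        std::cout << "Allocation succeeded" << std::endl;
    } catch (const std::bad_alloc&) {
        std::cout << "Bad Alloc caught: failed memory allocation for documents_buffer"
                  << std::endl;
    }
    return 0;
}
```

If that arithmetic holds, one likely workaround is to lower -data_capacity and split the corpus into more blocks (-num_blocks) so that each block fits within the server's physical RAM.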