This repository has been archived by the owner on Jun 24, 2024. It is now read-only.
The newly added perplexity calculation will segfault with sufficiently large input. This is likely because the current implementation of the LLaMA model isn't really set up to handle going past the context window, and updating it (#210) should fix this.
The other models segfault too (from a quick test), but perplexity is primarily useful for comparing against llama.cpp, so I'm not as fussed about them. It would be nice to figure this out at some point, though.
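For context on what the comparison against llama.cpp measures: perplexity is just the exponential of the average negative log-likelihood the model assigns to each actual next token. A minimal sketch in Rust, assuming you already have per-token natural-log probabilities (the function name and inputs here are illustrative, not the repository's actual API):

```rust
// Hedged sketch: perplexity from per-token log-probabilities.
// `log_probs` holds the natural-log probability the model assigned to
// each observed next token; this is not the repo's real interface.
fn perplexity(log_probs: &[f64]) -> f64 {
    let n = log_probs.len() as f64;
    // Average negative log-likelihood over the evaluated tokens.
    let avg_nll = -log_probs.iter().sum::<f64>() / n;
    avg_nll.exp()
}

fn main() {
    // Sanity check: a uniform model over a 4-token vocabulary assigns
    // p = 0.25 to every token, so perplexity is exactly 4.
    let lp = (0.25f64).ln();
    let pp = perplexity(&[lp; 8]);
    println!("{pp:.3}"); // prints 4.000
}
```

Two implementations that disagree on tokenization or on how they slide the evaluation window will report different perplexities even with identical weights, which is why matching llama.cpp's numbers is the useful signal here.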
Just a heads-up: updating LLaMA did not, in fact, fix this. We have other bugs we need to fix here, but the segfault is within ggml itself, so it's harder to track down.