Issues: ggerganov/llama.cpp
Eval bug: input is too large to process. increase the physical batch size
bug-unconfirmed · #11105 · opened Jan 6, 2025 by Tian14267
Feature Request: RuntimeError: Unsloth: The file 'llama.cpp/llama-quantize' or 'llama.cpp/quantize' does not exist. But we expect this file to exist! Maybe the llama.cpp developers changed the name?
enhancement · #11100 · opened Jan 6, 2025 by San-Jain18
Vulkan related question: what's the difference between server and cli?
#11099 · opened Jan 6, 2025 by FNsi
Misc. bug: gguf PyPI package dumps /scripts/ and /examples/ into site-packages
bug-unconfirmed · #11089 · opened Jan 5, 2025 by Ph0rk0z
Misc. bug: ggml_backend_sycl_graph_compute: error: op not supported node_1586 (FLASH_ATTN_EXT)
bug-unconfirmed · #11084 · opened Jan 5, 2025 by alfrentgen
Misc. bug: [Mac M4] llama-server cannot run in release 4409 but can run in 4406
bug-unconfirmed · #11083 · opened Jan 5, 2025 by bobleer
Compile bug: compilation fails on a Raspberry Pi W v2
bug-unconfirmed · #11079 · opened Jan 4, 2025 by RichNeese
Eval bug: ggml_sycl_cpy: unsupported type combination (q8_0 to f32)
bug-unconfirmed · #11078 · opened Jan 4, 2025 by paoletto
Feature Request: Add a Parameters section to change parameters in llama-android
enhancement · #11073 · opened Jan 4, 2025 by Dhruvanand24
Compile bug: Error Domain=MTLLibraryErrorDomain Code=3
bug-unconfirmed · #11071 · opened Jan 3, 2025 by p-w-rs
Misc. bug: Package llama-b4409-bin-ubuntu-x64.zip is defective
bug-unconfirmed · #11068 · opened Jan 3, 2025 by T-Shilov
Feature Request: Top-nσ sampler
enhancement · #11057 · opened Jan 3, 2025 by FellowTraveler
Feature Request: Add Chat Template In Llama-Android
enhancement · #11056 · opened Jan 3, 2025 by Dhruvanand24
Feature Request: Add support for Kokoro TTS
enhancement · #11050 · opened Jan 3, 2025 by broke-end-dev
Feature Request: Snap package of llama.cpp, for testing and for cases where compilation is not possible on the target machine
enhancement · #11048 · opened Jan 2, 2025 by QuickHare
Eval bug: <|end_of_text|> token at the end of output
bug-unconfirmed · #11043 · opened Jan 2, 2025 by gnusupport
Misc. bug: RISC-V output bug when using RVV with VLEN > 256 bits
bug-unconfirmed · #11041 · opened Jan 2, 2025 by grigohas