llama : remove check flash_attn with lora (#11104)
ngxson authored Jan 6, 2025
1 parent 96a1dc2 commit 09186fa
Showing 1 changed file with 0 additions and 6 deletions: src/llama.cpp
@@ -11519,13 +11519,7 @@ int32_t llama_lora_adapter_set(
             struct llama_context * ctx,
             struct llama_lora_adapter * adapter,
             float scale) {
-    if (ctx->cparams.flash_attn) {
-        LLAMA_LOG_ERROR("%s: flash_attn is not compatible with LoRA\n", __func__);
-        return -1;
-    }
-
     ctx->lora_adapters[adapter] = scale;
-
     return 0;
 }

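For illustration, a minimal usage sketch (not part of the commit) of the call path this change affects: with the guard removed, applying a LoRA adapter to a context created with flash_attn enabled no longer fails with -1. It assumes the llama_lora_adapter_* C API as named in the diff above; the model and adapter file paths are placeholders.

#include "llama.h"
#include <stdio.h>

int main(void) {
    llama_backend_init();

    struct llama_model_params mparams = llama_model_default_params();
    struct llama_model * model = llama_load_model_from_file("model.gguf", mparams);    // placeholder path

    struct llama_context_params cparams = llama_context_default_params();
    cparams.flash_attn = true;    // previously made llama_lora_adapter_set() fail
    struct llama_context * ctx = llama_new_context_with_model(model, cparams);

    struct llama_lora_adapter * adapter = llama_lora_adapter_init(model, "lora.gguf"); // placeholder path

    // Before this commit: returned -1 whenever flash_attn was enabled.
    // After this commit: registers the adapter and returns 0.
    int32_t ret = llama_lora_adapter_set(ctx, adapter, 1.0f);
    printf("llama_lora_adapter_set returned %d\n", ret);

    llama_free(ctx);
    llama_free_model(model);
    llama_backend_free();
    return 0;
}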
