Commit d8a7dd8
fixes flairNLP#3623: save PEFT config to transformer embeddings and export as param so that it loads correctly.
MattGPT-ai committed Mar 4, 2025
1 parent c6b053d commit d8a7dd8
Showing 1 changed file with 2 additions and 0 deletions: flair/embeddings/transformer.py
@@ -1128,6 +1128,7 @@ def is_supported_t5_model(config: PretrainedConfig) -> bool:
             if "Please use the model as it is" not in str(e):
                 raise e
 
+        self.peft_config = peft_config
         if peft_config is not None:
             # add adapters for finetuning
             try:
@@ -1376,6 +1377,7 @@ def to_params(self):
             "subtoken_pooling": self.subtoken_pooling,
             "cls_pooling": self.cls_pooling,
             "config_state_dict": config_dict,
+            "peft_config": self.peft_config,
         }
 
         return model_state
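
Usage sketch (not part of the commit): flair's transformer embeddings serialize themselves through to_params(), so before this change the adapter configuration was dropped when the embeddings were reloaded. The example below is a minimal illustration of the round trip the fix enables; it assumes a flair build with PEFT support plus the peft package installed, and the model name and LoRA hyperparameters are illustrative only.

import pickle

from flair.embeddings import TransformerDocumentEmbeddings
from peft import LoraConfig, TaskType

# Wrap a BERT encoder with LoRA adapters via the peft_config parameter
# (illustrative settings; target_modules are BERT's attention projections).
embeddings = TransformerDocumentEmbeddings(
    "bert-base-uncased",
    fine_tune=True,
    peft_config=LoraConfig(
        task_type=TaskType.FEATURE_EXTRACTION,
        r=8,
        lora_alpha=16,
        target_modules=["query", "value"],
    ),
)

# Serialization goes through to_params() (the method patched above); with the
# exported "peft_config" entry, the adapter setup survives a save/load round trip.
restored = pickle.loads(pickle.dumps(embeddings))
assert restored.peft_config is not None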
