Add config_utils tests #262
Conversation
Force-pushed from 5495027 to 168c9a3
Thank you for the PR! One doubt about the test function: since this check and raising of the error is already done elsewhere, does the test add coverage? Apart from this doubt, everything looks good to me! Thanks!
Yea, you're right @Abhishek-TAMU. That test doesn't increase test coverage. Additional testing wasn't necessary, but I'm sure it wouldn't hurt.
Amazing tests Angel, thank you so much for adding them! I have a few small comments on things to change. Your test methods cover good and varied use cases, but since they test different functionality, you should separate out each test case: each test method should test a specific aspect or use case of the function under test. Use long, descriptive names for test functions to show what you are testing.
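As a rough sketch of that split (the test names and import paths here are assumptions, not taken from this PR), it might look like:

# Sketch only: import paths assumed; adjust to the repo layout.
from tuning.config import peft_config
from tuning.utils import config_utils

def test_get_hf_peft_config_returns_lora_config_with_overrides():
    # Covers only the override behavior.
    tuning_config = peft_config.LoraConfig(r=3, lora_alpha=3)
    config = config_utils.get_hf_peft_config("CAUSAL_LM", tuning_config, "")
    assert config.r == 3
    assert config.lora_alpha == 3

def test_get_hf_peft_config_keeps_lora_defaults_for_unset_fields():
    # Covers only the default-propagation behavior.
    tuning_config = peft_config.LoraConfig(r=3, lora_alpha=3)
    config = config_utils.get_hf_peft_config("CAUSAL_LM", tuning_config, "")
    assert config.lora_dropout == 0.05  # default per the PR's comment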
# Default values: r=8, lora_alpha=32, lora_dropout=0.05, target_modules=["q_proj", "v_proj"]
tuning_config = peft_config.LoraConfig(r=3, lora_alpha=3)

config = config_utils.get_hf_peft_config("CAUSAL_LM", tuning_config, "")
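For the quoted snippet, the returned object could be checked directly (a sketch; LoraConfig here is HF peft's class, assuming it is imported in the test module):

assert isinstance(config, LoraConfig)  # HF peft class, not the repo's peft_config.LoraConfig
assert config.r == 3 and config.lora_alpha == 3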
Another test could be: what happens if a tokenizer is passed? Since tokenizer_name_or_path defaults to model_name_or_path, this value will always get passed when it's run in sft_trainer.py. I believe the tokenizer is used in PromptTuning but would be ignored for LoRA?
This is the only comment that I think wasn't addressed; what do you think about adding a test where you run with LoRA and pass in a tokenizer (which we expect to be ignored)?
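A sketch of that suggested test, continuing the imports from the earlier sketch (the assumption that LoRA ignores the tokenizer argument is the reviewer's, not verified here):

def test_get_hf_peft_config_ignores_tokenizer_for_lora():
    tuning_config = peft_config.LoraConfig(r=3, lora_alpha=3)
    # Pass a tokenizer path even though LoRA should not use it.
    config = config_utils.get_hf_peft_config("CAUSAL_LM", tuning_config, "some/tokenizer/path")
    # HF peft's LoraConfig has no tokenizer field, so the argument should leave no trace.
    assert not hasattr(config, "tokenizer_name_or_path")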
tuning_config = peft_config.PromptTuningConfig(prompt_tuning_init="RANDOM")
config = config_utils.get_hf_peft_config(None, tuning_config, None)
assert isinstance(config, PromptTuningConfig)
assert config.tokenizer_name_or_path is None
Interesting! So that means other fields like prompt_tuning_init_text, which is also only used if prompt_tuning_init is TEXT, would also become None?
Updated the function call to be more readable. The function call is now:

config = config_utils.get_hf_peft_config(task_type=None, tuning_config=tuning_config, tokenizer_name_or_path=None)

So tokenizer_name_or_path is allowed to be None only when prompt_tuning_init is RANDOM. When prompt_tuning_init is TEXT, you cannot pass None for the tokenizer; it raises an exception (Reference). Other fields are not affected.
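A minimal sketch of that failure case, assuming pytest and that the check raises a ValueError (the exact exception type is an assumption):

import pytest

def test_get_hf_peft_config_raises_for_text_init_without_tokenizer():
    tuning_config = peft_config.PromptTuningConfig(prompt_tuning_init="TEXT")
    with pytest.raises(ValueError):
        config_utils.get_hf_peft_config(None, tuning_config, None)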
Signed-off-by: Angel Luu <[email protected]>
Force-pushed from 19e39f9 to 906ce02
Signed-off-by: Angel Luu <[email protected]>
LGTM, many thanks Angel! Love having more test coverage.
Description of the change
Add unit tests for config_utils, achieving 100% test coverage.
Related issue number
Part of #75
How to verify the PR
Was the PR tested