
args has no fp16 attribute #17

Open
King-Jullien opened this issue Apr 20, 2024 · 0 comments

In the provided run_text2text_csl.py script, every reference to args.fp16 raises an error. I manually added --fp16 true on the command line, but that also fails: the error message says args has no --fp16 attribute, and only the following options are recognized:
[-h] [--pretrained_model_path PRETRAINED_MODEL_PATH]
[--output_model_path OUTPUT_MODEL_PATH] --train_path TRAIN_PATH
--dev_path DEV_PATH [--test_path TEST_PATH]
[--config_path CONFIG_PATH]
[--embedding {word,pos,seg,sinusoidalpos,dual} [{word,pos,seg,sinusoidalpos,dual} ...]]
[--tgt_embedding {word,pos,seg,sinusoidalpos,dual} [{word,pos,seg,sinusoidalpos,dual} ...]]
[--max_seq_length MAX_SEQ_LENGTH] [--relative_position_embedding]
[--share_embedding] [--remove_embedding_layernorm]
[--factorized_embedding_parameterization]
[--encoder {transformer,rnn,lstm,gru,birnn,bilstm,bigru,gatedcnn,dual}]
[--decoder {None,transformer}]
[--mask {fully_visible,causal,causal_with_prefix}]
[--layernorm_positioning {pre,post}]
[--feed_forward {dense,gated}]
[--relative_attention_buckets_num RELATIVE_ATTENTION_BUCKETS_NUM]
[--remove_attention_scale] [--remove_transformer_bias]
[--layernorm {normal,t5}] [--bidirectional] [--parameter_sharing]
[--has_residual_attention] [--has_lmtarget_bias]
[--target {sp,lm,mlm,bilm,cls} [{sp,lm,mlm,bilm,cls} ...]]
[--tie_weights] [--pooling {mean,max,first,last}]
[--prefix_lm_loss] [--learning_rate LEARNING_RATE]
[--warmup WARMUP] [--lr_decay LR_DECAY]
[--optimizer {adamw,adafactor}]
[--scheduler {linear,cosine,cosine_with_restarts,polynomial,constant,constant_with_warmup,inverse_sqrt,tri_stage}]
[--batch_size BATCH_SIZE] [--seq_length SEQ_LENGTH]
[--dropout DROPOUT] [--epochs_num EPOCHS_NUM]
[--report_steps REPORT_STEPS] [--seed SEED] [--log_path LOG_PATH]
[--log_level {ERROR,INFO,DEBUG,NOTSET}]
[--log_file_level {ERROR,INFO,DEBUG,NOTSET}]
[--tokenizer {bert,bpe,char,space,xlmroberta}]
[--vocab_path VOCAB_PATH] [--merges_path MERGES_PATH]
[--spm_model_path SPM_MODEL_PATH] [--do_lower_case {true,false}]
[--tgt_seq_length TGT_SEQ_LENGTH] [--metrics METRICS]
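A minimal sketch of a workaround, assuming run_text2text_csl.py builds its options with a standard argparse.ArgumentParser (the parser variable name and the store_true style below are assumptions, not the repository's actual code): register the missing flag before parse_args() is called, and guard each use of args.fp16 so the script still runs when the flag was never registered.

```python
import argparse

parser = argparse.ArgumentParser()
# ... the existing add_argument calls from run_text2text_csl.py go here ...

# Register the flag that the script later reads as args.fp16.
parser.add_argument("--fp16", action="store_true",
                    help="Enable mixed-precision (FP16) training.")

args = parser.parse_args()

# getattr keeps older argument lists working: it falls back to False
# when --fp16 was never added to the parser.
if getattr(args, "fp16", False):
    print("FP16 training enabled.")
```

Note that with action="store_true" the flag is passed bare (--fp16), not as --fp16 true; argparse would treat the trailing "true" as an unexpected positional argument.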
