Uncomment removal of exp_dir (NVIDIA#7198)
The removal of the results dir after the test 'Megatron GPT with KERPLE Pretraining and Resume Training TP=2' was commented out by mistake, which led to the next test reusing the checkpoint from this test. Hence rolling it back.

Signed-off-by: Abhishree <[email protected]>
Signed-off-by: dorotat <[email protected]>
athitten authored and dorotat-nv committed Aug 24, 2023
1 parent: 8aa94b9 · commit: ea82ef0
Showing 1 changed file with 4 additions and 4 deletions.
Jenkinsfile: 8 changes (4 additions, 4 deletions)
@@ -3321,8 +3321,8 @@ assert_frame_equal(training_curve, gt_curve, rtol=1e-3, atol=1e-3)"'''
//model.activations_checkpoint_num_layers=1 \
//model.data.data_prefix=[.5,/home/TestData/nlp/megatron_gpt/data/gpt/simple_wiki_gpt_preproc_text_document,.5,/home/TestData/nlp/megatron_gpt/data/gpt/simple_wiki_gpt_preproc_text_document] \
//model.data.index_mapping_dir=examples/nlp/language_modeling/gpt_index_mappings"
sh "rm -rf examples/nlp/language_modeling/gpt_pretrain_results"
sh "rm -rf examples/nlp/language_modeling/gpt_index_mappings"
sh "rm -rf examples/nlp/language_modeling/gpt_pretrain_results"
sh "rm -rf examples/nlp/language_modeling/gpt_index_mappings"
}
}
stage('L2: Megatron GPT with Rope Pretraining using Flash Attention and Resume Training TP=2') {
@@ -3580,8 +3580,8 @@ assert_frame_equal(training_curve, gt_curve, rtol=1e-3, atol=1e-3)"'''
//model.activations_checkpoint_num_layers=1 \
//model.data.data_prefix=[.5,/home/TestData/nlp/megatron_gpt/data/gpt/simple_wiki_gpt_preproc_text_document,.5,/home/TestData/nlp/megatron_gpt/data/gpt/simple_wiki_gpt_preproc_text_document] \
//model.data.index_mapping_dir=examples/nlp/language_modeling/gpt_index_mappings"
//sh "rm -rf examples/nlp/language_modeling/gpt_pretrain_results"
//sh "rm -rf examples/nlp/language_modeling/gpt_index_mappings"
sh "rm -rf examples/nlp/language_modeling/gpt_pretrain_results"
sh "rm -rf examples/nlp/language_modeling/gpt_index_mappings"
}
}
stage('L2: Megatron GPT Pretraining and Resume Training PP=2') {
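For reference, a minimal sketch of the cleanup pattern these hunks restore, written against a hypothetical Jenkins stage (the stage name and training command below are placeholders, not the actual NeMo CI entries): each pretraining-and-resume stage removes its own results and index-mapping directories when it finishes, so the following stage cannot silently resume from this stage's checkpoint.

stage('L2: Example Megatron GPT Pretraining and Resume Training TP=2') {
  steps {
    // Hypothetical training command; the real stages pass many more Hydra overrides.
    sh "python examples/nlp/language_modeling/megatron_gpt_pretraining.py \
        exp_manager.exp_dir=examples/nlp/language_modeling/gpt_pretrain_results \
        model.data.index_mapping_dir=examples/nlp/language_modeling/gpt_index_mappings"
    // The cleanup must stay uncommented: if these two lines are commented out,
    // the directories survive into the next stage, and its resume logic picks up
    // this stage's checkpoint instead of starting from scratch.
    sh "rm -rf examples/nlp/language_modeling/gpt_pretrain_results"
    sh "rm -rf examples/nlp/language_modeling/gpt_index_mappings"
  }
}

Deleting exp_dir at the end of the producing stage, as this commit re-enables, is simpler than teaching every downstream stage to ignore stale checkpoints.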
