We visualize training details via wandb (https://wandb.ai/site).
You'll need to log in first:
$ wandb login
You can find your API key at https://wandb.ai/authorize; copy and paste it into the terminal when prompted.
Optionally, for server use you can add the key to "code/config/config.py" via
C.wandb_key = ""
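As a sketch, the relevant config entry might look like the excerpt below; the `SimpleNamespace` container is an assumption standing in for however the repo actually builds its `C` config object:

```python
# Sketch of the relevant part of code/config/config.py.
# The SimpleNamespace container is an assumption; the repo's real
# config object may be a different attribute-style container.
from types import SimpleNamespace

C = SimpleNamespace()
C.wandb_key = ""  # paste your wandb API key here for non-interactive (server) runs
```

Leaving the key empty means wandb will fall back to the interactive `wandb login` credentials.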
Our models are trained on a single NVIDIA 3090 GPU, but the code also supports PyTorch distributed data parallel (DDP) mode. We set batch_size=8 for all experiments, with a learning rate of 1e-5 and a 900 x 900 input resolution.
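To illustrate how the batch size interacts with DDP, the sketch below splits a global batch across processes; note it is an assumption here that batch_size=8 is the global batch (the repo may instead use it per GPU):

```python
import os

GLOBAL_BATCH_SIZE = 8  # batch size used for all experiments

def per_process_batch(world_size: int) -> int:
    # Under PyTorch DDP, a DistributedSampler shards the data so each of
    # the world_size processes sees an equal slice of every global batch.
    if GLOBAL_BATCH_SIZE % world_size != 0:
        raise ValueError("world_size must divide the global batch size")
    return GLOBAL_BATCH_SIZE // world_size

# WORLD_SIZE is set by PyTorch's DDP launcher (e.g. torchrun); it
# defaults to 1 for a plain single-GPU run.
world_size = int(os.environ.get("WORLD_SIZE", "1"))
print(per_process_batch(world_size))
```

On a single GPU this reduces to the full batch of 8; with two processes each one loads 4 samples per step.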
We follow Meta-OoD and use the DeepLabV3+ checkpoint from here. Put it in the "ckpts/pretrained_ckpts" directory; please note that the checkpoint must be downloaded before running the code.
For training, simply execute
$ python code/main.py
For testing, please download our checkpoint from here, specify the checkpoint path ("ckpts/pebal_weight_path") in the config file, and then run
$ python code/test.py