generate_adv_images.py - Modifications for CIFAR-10 Compatibility #3
Hi @TalRub104, we followed the adversarial fine-tuning recipe provided in https://github.com/dedeswim/vits-robustness-torch for adversarial fine-tuning of vision models, using the same hyperparameters for the number of epochs, learning rate, etc.
Hi, I wanted to clarify a few points:
After 100 epochs with trades_beta=6, I achieved the following results: Top-1 Accuracy: 67.29. Could you confirm whether these results align with your expectations, or whether additional adjustments are required to reproduce the results reported in the paper?
Hi,
When training Vision Mamba (VMamba) on CIFAR-10 from scratch using TRADES, I would like to evaluate its robust accuracy. What changes should I make to the get_val_loader function from classification/generate_adv_images.py?
The current function is as follows:
```python
def get_val_loader(data_path, batch_size):
    transform = transforms.Compose([
        transforms.Resize(256, interpolation=transforms.InterpolationMode.BICUBIC),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])
    # Load ImageNet validation dataset
    val_dataset = ImageNet5k(root=os.path.join(data_path, "val"), transform=transform)
    val_loader = DataLoader(val_dataset, batch_size=batch_size, shuffle=False)
    return val_loader, val_dataset
```
Additionally, it would be great if you could share your fine-tuning setup for CIFAR-10, both with and without TRADES, including details such as the number of epochs, learning rate, etc.