Commit 20a2be1: Add gMLP-S weights, 79.6 top-1

rwightman committed Jun 23, 2021
1 parent 85f894e
Showing 2 changed files with 6 additions and 1 deletion.
3 changes: 3 additions & 0 deletions README.md
@@ -23,6 +23,9 @@

 ## What's New
 
+### June 23, 2021
+* Reproduce gMLP model training, `gmlp_s16_224` trained to 79.6 top-1, matching [paper](https://arxiv.org/abs/2105.08050).
+
 ### June 20, 2021
 * Release Vision Transformer 'AugReg' weights from [How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers](https://arxiv.org/abs/2106.10270)
 * .npz weight loading support added, can load any of the 50K+ weights from the [AugReg series](https://console.cloud.google.com/storage/browser/vit_models/augreg)
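The `.npz` checkpoints referenced in that note are plain NumPy archives of named arrays. A minimal, hypothetical sketch of how such a file can be read back as a name-to-tensor mapping (the parameter names and shapes below are invented for illustration, not the actual AugReg layout):

```python
import io

import numpy as np

# Write a tiny fake .npz checkpoint to an in-memory buffer. A real loader
# would open a downloaded file path instead of a BytesIO object.
buf = io.BytesIO()
np.savez(
    buf,
    **{
        'embedding/kernel': np.zeros((16, 8), dtype=np.float32),  # invented name
        'embedding/bias': np.zeros(8, dtype=np.float32),          # invented name
    },
)
buf.seek(0)

# Read it back: np.load on an .npz returns a lazy archive whose .files
# attribute lists the stored array names.
with np.load(buf) as ckpt:
    params = {name: ckpt[name] for name in ckpt.files}
```

From here a framework-specific loader would match each named array to the corresponding model parameter and copy the values in.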
4 changes: 3 additions & 1 deletion timm/models/mlp_mixer.py
@@ -129,7 +129,9 @@ def _cfg(url='', **kwargs):
         mean=IMAGENET_DEFAULT_MEAN, std=IMAGENET_DEFAULT_STD),
 
     gmlp_ti16_224=_cfg(),
-    gmlp_s16_224=_cfg(),
+    gmlp_s16_224=_cfg(
+        url='https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth',
+    ),
     gmlp_b16_224=_cfg(),
 )

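For context, the `_cfg(url='', **kwargs)` helper changed above follows timm's default-config pattern: each model name maps to a dict of shared defaults overlaid with per-model overrides such as a pretrained-weight URL. A minimal sketch of the idea (the default keys shown are assumptions for illustration, not timm's exact set):

```python
def _cfg(url='', **kwargs):
    # Start from shared defaults, then overlay any per-model overrides.
    return {
        'url': url,
        'num_classes': 1000,
        'input_size': (3, 224, 224),
        'interpolation': 'bicubic',
        **kwargs,
    }


default_cfgs = {
    'gmlp_ti16_224': _cfg(),  # no released weights: url stays ''
    'gmlp_s16_224': _cfg(
        url='https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth',
    ),
    'gmlp_b16_224': _cfg(),
}
```

A model factory can then check `default_cfgs[name]['url']` to decide whether pretrained weights are available for download.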
