This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Request for Integrating the new NAS algorithm: Cream #2705

Merged: 74 commits, Nov 27, 2020. The changes below are shown from 1 commit.
Commits (74)
a426e1a
integrate CREAM NAS algorithm
penghouwen Jul 20, 2020
57a2c40
Update README.md
penghouwen Jul 20, 2020
c15832f
Update Cream.md
penghouwen Jul 31, 2020
806937c
Update Cream.md
penghouwen Jul 31, 2020
b13fed0
Update Cream.md
penghouwen Jul 31, 2020
d7c3217
Update requirements.txt
penghouwen Jul 31, 2020
1951db0
Update Cream.md
penghouwen Jul 31, 2020
bce9cf2
Update requirements.txt
penghouwen Jul 31, 2020
cda252b
Update Cream.md
penghouwen Jul 31, 2020
29b6d5d
Update Cream.md
penghouwen Jul 31, 2020
e548cbc
Update Cream.md
penghouwen Jul 31, 2020
d5c95c6
Update Cream.md
penghouwen Jul 31, 2020
c73b95c
Update trainer.py
penghouwen Aug 1, 2020
0adaf7c
Update mutator.py
penghouwen Aug 1, 2020
85f01e9
Update Cream.md
penghouwen Aug 3, 2020
047fd86
Update Cream.md
penghouwen Aug 3, 2020
b22f92c
Update Cream.md
penghouwen Aug 3, 2020
3ee7591
Update Cream.md
penghouwen Aug 3, 2020
be81d53
Update Cream.md
penghouwen Aug 3, 2020
9535466
Fix pipeline for merging into NNI
ultmaster Aug 4, 2020
0892e66
Fix typo
ultmaster Aug 4, 2020
999d18c
Merge pull request #1 from ultmaster/fix-cream-before-merge
penghouwen Aug 4, 2020
2e13a23
Fix pipeline
ultmaster Aug 5, 2020
22a3f46
Add files via upload
penghouwen Aug 7, 2020
b1777f0
Update Cream.md
penghouwen Aug 7, 2020
4fcbaa9
Update CDARTS.md
penghouwen Aug 7, 2020
2205433
Update Cream.md
penghouwen Aug 7, 2020
b277b96
Update CDARTS.md
penghouwen Aug 7, 2020
8d413bf
Update CDARTS.md
penghouwen Aug 7, 2020
cc9f336
Update distributed_train.sh
penghouwen Sep 3, 2020
9494493
Update distributed_test.sh
penghouwen Sep 3, 2020
6a332ff
Update Cream.md
penghouwen Sep 3, 2020
b35ccac
init
Sep 27, 2020
c872739
Merge pull request #2 from mapleam/master
penghouwen Sep 27, 2020
9289614
Update supernet.py
Z7zuqer Sep 27, 2020
ab9d398
1)remove timm
Sep 27, 2020
82eee8d
Merge pull request #3 from mapleam/master
penghouwen Sep 27, 2020
2559697
Delete cream.jpg
penghouwen Oct 22, 2020
e48d293
Add files via upload
penghouwen Oct 22, 2020
a71563b
Update Cream.md
penghouwen Oct 22, 2020
c00c58e
version 1.0
Z7zuqer Nov 18, 2020
4d72a70
version 2.0
Z7zuqer Nov 21, 2020
60e5197
Merge pull request #4 from mapleam/master
penghouwen Nov 23, 2020
37518fa
Update Cream.md
penghouwen Nov 23, 2020
e04200c
Update Cream.md
penghouwen Nov 23, 2020
47dce8c
Update Cream.md
penghouwen Nov 23, 2020
5231d3b
Update Cream.md
penghouwen Nov 23, 2020
ce698c3
Update Cream.md
penghouwen Nov 23, 2020
0d63ceb
Update Cream.md
penghouwen Nov 23, 2020
274fb23
version 3.0
Z7zuqer Nov 23, 2020
931c47b
Merge branch 'master' into master
Z7zuqer Nov 23, 2020
59b1339
Merge pull request #5 from mapleam/master
penghouwen Nov 23, 2020
c162f39
Update Cream.md
penghouwen Nov 23, 2020
de8c261
Update Cream.md
penghouwen Nov 23, 2020
ae45787
Update Cream.md
Z7zuqer Nov 23, 2020
43101c1
Update retrain.py
Z7zuqer Nov 23, 2020
36ddeaf
Update test.py
Z7zuqer Nov 23, 2020
97451af
Update retrain.py
Z7zuqer Nov 23, 2020
96cfb17
Merge branch 'master' into master
Z7zuqer Nov 23, 2020
d735a25
Merge pull request #6 from mapleam/master
penghouwen Nov 23, 2020
8d24833
version 4.0
Z7zuqer Nov 23, 2020
a53cc5f
Merge remote-tracking branch 'origin/master'
Z7zuqer Nov 23, 2020
cce57e5
version 4.0
Z7zuqer Nov 23, 2020
d9cfd2f
Merge pull request #7 from mapleam/master
penghouwen Nov 24, 2020
0f8f8bf
Update Cream.md
penghouwen Nov 24, 2020
879bfeb
Update Cream.md
penghouwen Nov 24, 2020
85b17b4
Merge branch 'master' into master
penghouwen Nov 24, 2020
0cf817b
Move code dir
ultmaster Nov 24, 2020
fdeb0b9
Fix trainer and retrain optimizer
ultmaster Nov 25, 2020
d11e4cf
Update Cream.md
penghouwen Nov 25, 2020
06af2cb
Fix syntax warning
ultmaster Nov 26, 2020
6cb3b97
Fix syntax warning (again)
ultmaster Nov 26, 2020
9996098
Fix docs build warnings
ultmaster Nov 26, 2020
02b8e72
Merge branch 'master' of github.com:penghouwen/nni into cream-master
ultmaster Nov 26, 2020
The changes below are from commit 85b17b48a082ff9d2300b75b0f54a921e0fc00bd ("Merge branch 'master' into master"), authored by penghouwen on Nov 24, 2020.
docs/en_US/NAS/CDARTS.md: 116 changes (59 additions, 57 deletions)
# CDARTS

## Introduction

[CDARTS](https://arxiv.org/pdf/2006.10724.pdf) builds a cyclic feedback mechanism between the search and evaluation networks. First, the search network generates an initial topology for evaluation, so that the weights of the evaluation network can be optimized. Second, the architecture topology in the search network is further optimized by the label supervision in classification, as well as the regularization from the evaluation network through feature distillation. Repeating the above cycle results in a joint optimization of the search and evaluation networks, and thus enables the evolution of the topology to fit the final evaluation network.

In the implementation of `CdartsTrainer`, two models and two mutators are instantiated first (one mutator for each model). The first model is the so-called "search network", which is mutated with a `RegularizedDartsMutator`, a mutator with subtle differences from `DartsMutator`. The second model is the "evaluation network", which is mutated with a discrete mutator that leverages the search network's mutator to sample a single path each time. The trainer optimizes the models and mutators alternately. Users interested in more details on these trainers and mutators can refer to the [paper](https://arxiv.org/pdf/2006.10724.pdf).

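The sketch below is a minimal conceptual illustration of this alternating, cyclic optimization in plain PyTorch; it is not the NNI implementation. The two linear networks, the optimizers, and the logit-level KL distillation term are placeholders standing in for the search/evaluation networks and the feature distillation described above.

```python
# Conceptual sketch of CDARTS-style alternating optimization (not the NNI implementation).
# All module and variable names here are illustrative placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder "search" and "evaluation" networks; in CDARTS the evaluation
# network is a discretized (single-path) counterpart of the search network.
search_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
eval_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))

w_optim = torch.optim.SGD(eval_net.parameters(), lr=0.1, momentum=0.9)
arch_optim = torch.optim.Adam(search_net.parameters(), lr=3e-4)
criterion = nn.CrossEntropyLoss()

def distillation(student_logits, teacher_logits, temperature=2.0):
    """Soft-label (KL) distillation used as the feedback signal between the two networks."""
    return F.kl_div(F.log_softmax(student_logits / temperature, dim=1),
                    F.softmax(teacher_logits / temperature, dim=1),
                    reduction="batchmean") * temperature * temperature

for step in range(100):
    x = torch.randn(8, 3, 32, 32)           # dummy CIFAR-like batch
    y = torch.randint(0, 10, (8,))

    # 1) Train the evaluation network: classification loss plus distillation
    #    from the search network's predictions.
    w_optim.zero_grad()
    eval_logits = eval_net(x)
    with torch.no_grad():
        search_logits = search_net(x)
    (criterion(eval_logits, y) + distillation(eval_logits, search_logits)).backward()
    w_optim.step()

    # 2) Update the search network: label supervision plus regularization from
    #    the evaluation network, closing the feedback cycle.
    arch_optim.zero_grad()
    search_logits = search_net(x)
    with torch.no_grad():
        eval_logits = eval_net(x)
    (criterion(search_logits, y) + distillation(search_logits, eval_logits)).backward()
    arch_optim.step()
```

In `CdartsTrainer` itself, the second step updates the architecture choices held by the mutator rather than ordinary weights, and the feedback signal comes from feature distillation as described above; see the reference section below for the actual classes.
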
## Reproduction Results

This is the CDARTS implementation on the NNI platform, which currently supports search and retraining on CIFAR10. ImageNet search and retraining are expected to work as well, and the corresponding interfaces are provided. Our reproduced results on NNI are slightly lower than those reported in the paper, but considerably higher than those of the original DARTS. Below are the results of three independent runs on CIFAR10.

| Run | Paper (accuracy %) | NNI (accuracy %) |
| ---- |:-------------:| :-----:|
| 1 | 97.52 | 97.44 |
| 2 | 97.53 | 97.48 |
| 3 | 97.58 | 97.56 |


## Examples

[Example code](https://github.com/microsoft/nni/tree/master/examples/nas/cdarts)

```bash
# Clone NNI if it has not been cloned yet; otherwise skip this step and enter the existing code folder.
git clone https://github.com/Microsoft/nni.git
cd nni

# Install apex for distributed training.
git clone https://github.com/NVIDIA/apex
cd apex
python setup.py install --cpp_ext --cuda_ext
cd ..

# Search for the best architecture.
cd examples/nas/cdarts
bash run_search_cifar.sh

# Retrain the best architecture.
bash run_retrain_cifar.sh
```

## Reference

### PyTorch

```eval_rst
.. autoclass:: nni.nas.pytorch.cdarts.CdartsTrainer
:members:

.. autoclass:: nni.nas.pytorch.cdarts.RegularizedDartsMutator
:members:

.. autoclass:: nni.nas.pytorch.cdarts.DartsDiscreteMutator
:members:

.. autoclass:: nni.nas.pytorch.cdarts.RegularizedMutatorParallel
:members:
```

You are viewing a condensed version of this merge commit; the full set of changes is not shown here.