S3NAS: Fast NPU-aware Neural Architecture Search

We conduct NAS following three steps: supernet design, Single-Path NAS with modification, and scaling with post-processing.

Results

We depict our search results on MIDAP below. The height of each block in the figure is proportional to its expansion ratio, and SE-applied blocks are drawn with dotted outlines.

Requirements

Usage

  1. Set up ImageNet dataset

    To set up ImageNet, follow the instructions from here

    Alternatively, you can copy the dataset from another bucket using gsutil -m cp -r.

  2. Set up the profiled latency files

    latency_folder
    |-- Conv2D
    |-- Dense
    |-- GlobalAvgPool
    |-- MBConvBlock
    |-- MixConvBlock
        |-- r1_k3,5_s22_e2,4_i32_o32_c100_noskip_relu_imgsize112
        |-- ...
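The leaf file names encode the block configuration. As a rough illustration, here is a decoder sketch; the field meanings below are our reading of common MnasNet/EfficientNet-style block strings and are not documented by this repository (in particular, the `c` field is a guess):

```python
def parse_latency_filename(name):
    """Split an encoded block name such as
    'r1_k3,5_s22_e2,4_i32_o32_c100_noskip_relu_imgsize112'
    into a dict of fields.

    Field meanings are ASSUMPTIONS based on MnasNet/EfficientNet-style
    block strings, not confirmed by the S3NAS repository.
    """
    spec = {}
    for part in name.split("_"):
        if part == "noskip":                      # block has no skip connection
            spec["skip"] = False
        elif part in ("relu", "swish"):           # activation token ('relu' in the example)
            spec["activation"] = part
        elif part.startswith("imgsize"):          # input image size
            spec["image_size"] = int(part[len("imgsize"):])
        elif part.startswith("r"):                # number of repeats
            spec["repeats"] = int(part[1:])
        elif part.startswith("k"):                # kernel sizes (MixConv may have several)
            spec["kernel_sizes"] = [int(k) for k in part[1:].split(",")]
        elif part.startswith("s"):                # strides, one digit per axis
            spec["strides"] = [int(c) for c in part[1:]]
        elif part.startswith("e"):                # expansion ratios
            spec["expand_ratios"] = [int(e) for e in part[1:].split(",")]
        elif part.startswith("i"):                # input filters
            spec["input_filters"] = int(part[1:])
        elif part.startswith("o"):                # output filters
            spec["output_filters"] = int(part[1:])
        elif part.startswith("c"):                # ASSUMPTION: channel percentage
            spec["channel_percent"] = int(part[1:])
    return spec
```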
    

    Each latency file contains a dictionary with the latency value. For example, the content of r1_k3,5_s22_e2,4_i32_o32_c100_noskip_relu_imgsize112 may be {"latency": 364425}
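A minimal sketch of how such a folder could be read into a lookup table (the function name `load_latency_table` is ours, not part of the repository; it only assumes the layout and file format described above):

```python
import ast
import os

def load_latency_table(latency_folder):
    """Walk the latency folder and build {op_type: {config_name: latency}}.

    Each leaf file is expected to hold a dict literal such as
    {"latency": 364425}, as described above. This is an illustrative
    sketch, not the repository's own loader.
    """
    table = {}
    for op_type in os.listdir(latency_folder):        # e.g. Conv2D, MixConvBlock
        op_dir = os.path.join(latency_folder, op_type)
        if not os.path.isdir(op_dir):
            continue
        table[op_type] = {}
        for config_name in os.listdir(op_dir):        # encoded block name
            path = os.path.join(op_dir, config_name)
            with open(path) as f:
                table[op_type][config_name] = ast.literal_eval(f.read())["latency"]
    return table
```

A search procedure could then look up the profiled latency of a candidate block by its encoded name, e.g. `table["MixConvBlock"]["r1_k3,5_s22_e2,4_i32_o32_c100_noskip_relu_imgsize112"]`.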

    To use our profiled latency files for MIDAP, run

    git submodule update --init --recursive
    
  3. Set up flags and run

    Refer to the script files in base_experiment_scripts, or set the flags up yourself. When you use scripts in base_experiment_scripts, please MODIFY:

    • Google Cloud Storage Bucket
    • Model file name
    • Google Cloud TPU name
    • Target latency
    • Latency folder name

    We provide script templates for NAS / train / post_process.

  4. Run the script file.

Citation

If this work helps your research, please cite:

@misc{lee2020s3nas,
    title={S3NAS: Fast NPU-aware Neural Architecture Search Methodology},
    author={Jaeseong Lee and Duseok Kang and Soonhoi Ha},
    year={2020},
    eprint={2009.02009},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
