From 178f73971bf18944091f29594e9c08aa285b2091 Mon Sep 17 00:00:00 2001
From: Scarlett Li <39592018+scarlett2018@users.noreply.github.com>
Date: Fri, 6 Sep 2019 13:33:58 +0800
Subject: [PATCH] minor changes on title and description.

---
 docs/en_US/AdvancedFeature/GeneralNasInterfaces.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/en_US/AdvancedFeature/GeneralNasInterfaces.md b/docs/en_US/AdvancedFeature/GeneralNasInterfaces.md
index 81ff43535c..f3850e6188 100644
--- a/docs/en_US/AdvancedFeature/GeneralNasInterfaces.md
+++ b/docs/en_US/AdvancedFeature/GeneralNasInterfaces.md
@@ -1,6 +1,6 @@
-# General Programming Interface for Neural Architecture Search (experimental feature)
+# NNI Programming Interface for Neural Architecture Search (NAS)
 
-_*This is an experimental feature, currently, we only implemented the general NAS programming interface. Weight sharing will be supported in the following releases._
+_*This is an **experimental feature**. Currently, we only implemented the general NAS programming interface. Weight sharing will be supported in the following releases._
 
 Automatic neural architecture search is taking an increasingly important role on finding better models. Recent research works have proved the feasibility of automatic NAS, and also found some models that could beat manually designed and tuned models. Some of representative works are [NASNet][2], [ENAS][1], [DARTS][3], [Network Morphism][4], and [Evolution][5]. There are new innovations keeping emerging. However, it takes great efforts to implement those algorithms, and it is hard to reuse code base of one algorithm for implementing another.