From edd167cb6f41ee0f8be2b6e6cfee41fe5f26d3c6 Mon Sep 17 00:00:00 2001
From: liuzhe
Date: Mon, 13 Apr 2020 11:56:54 +0800
Subject: [PATCH 1/5] prepare release v1.5

---
 README.md                                |  4 +--
 docs/en_US/Release.md                    | 41 ++++++++++++++++++++++++
 docs/en_US/Tutorial/InstallationLinux.md |  4 +--
 docs/en_US/Tutorial/InstallationWin.md   |  6 ++--
 docs/en_US/conf.py                       |  2 +-
 5 files changed, 49 insertions(+), 8 deletions(-)

diff --git a/README.md b/README.md
index c56522fe01..da21f88334 100644
--- a/README.md
+++ b/README.md
@@ -25,7 +25,7 @@ The tool manages automated machine learning (AutoML) experiments, **dispatches a
 * Researchers and data scientists who want to easily **implement and experiment new AutoML algorithms**, may it be: hyperparameter tuning algorithm, neural architect search algorithm or model compression algorithm.
 * ML Platform owners who want to **support AutoML in their platform**.

-### **NNI v1.4 has been released!**
+### **NNI v1.5 has been released!**

 ## **NNI capabilities in a glance**
@@ -236,7 +236,7 @@ The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is
 * Download the examples via clone the source code.

   ```bash
-  git clone -b v1.4 https://github.com/Microsoft/nni.git
+  git clone -b v1.5 https://github.com/Microsoft/nni.git
   ```

 * Run the MNIST example.
diff --git a/docs/en_US/Release.md b/docs/en_US/Release.md
index 6644d03da8..6ffdeeb38d 100644
--- a/docs/en_US/Release.md
+++ b/docs/en_US/Release.md
@@ -1,5 +1,46 @@
 # ChangeLog

+## Release 1.5 - 4/13/2020
+
+### New Features and Documentation
+
+#### Hyper-Parameter Optimizing
+
+* New tuner: [Population Based Training (PBT)](https://github.com/microsoft/nni/blob/master/docs/en_US/Tuner/PBTTuner.md)
+* Trials can now report infinity, NaN, and string metrics
+
+#### Neural Architecture Search
+
+* New NAS algorithm: [TextNAS](https://github.com/microsoft/nni/blob/master/docs/en_US/NAS/TextNAS.md)
+* ENAS and DARTS now support [visualization](https://github.com/microsoft/nni/blob/master/docs/en_US/NAS/Visualization.md) through web UI.
+
+#### Model Compression
+
+* New Pruner: [GradientRankFilterPruner](https://github.com/microsoft/nni/blob/master/docs/en_US/Compressor/Pruner.md#gradientrankfilterpruner)
+* Compressors will validate configuration by default
+* Model compression examples are refactored and improved
+* Added documentation for [implementing compressing algorithm](https://github.com/microsoft/nni/blob/master/docs/en_US/Compressor/Framework.md)
+
+#### Training Service
+
+* Kubeflow now supports pytorchjob crd v1 (thanks @jiapinai)
+* Experimental DLTS support
+
+#### Overall Documentation Improvement
+
+* Documentation is significantly improved on grammar, spelling, and wording (thanks @AHartNtkn)
+
+### Fixed Bugs
+
+* ENAS cannot have more than one LSTM layer (thanks @marsggbo)
+* NNI manager's timers will never unsubscribe (thanks @guilhermehn)
+* NNI manager may exhaust heap memory (thanks @Sundrops)
+* Batch tuner does not support customized trials (#2075)
+* Experiment cannot be killed if it failed on start (#2080)
+* A bug in lottery ticket pruner
+* Some glitches in web UI
+* And more
+
 ## Release 1.4 - 2/19/2020

 ### Major Features
diff --git a/docs/en_US/Tutorial/InstallationLinux.md b/docs/en_US/Tutorial/InstallationLinux.md
index adc475fce8..e41cf5bf22 100644
--- a/docs/en_US/Tutorial/InstallationLinux.md
+++ b/docs/en_US/Tutorial/InstallationLinux.md
@@ -19,7 +19,7 @@ Installation on Linux and macOS follow the same instructions, given below.
 Prerequisites: `python 64-bit >=3.5`, `git`, `wget`

 ```bash
-  git clone -b v1.4 https://github.com/Microsoft/nni.git
+  git clone -b v1.5 https://github.com/Microsoft/nni.git
   cd nni
   ./install.sh
 ```
@@ -35,7 +35,7 @@ The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is
 * Download the examples via cloning the source code.

   ```bash
-  git clone -b v1.4 https://github.com/Microsoft/nni.git
+  git clone -b v1.5 https://github.com/Microsoft/nni.git
   ```

 * Run the MNIST example.
diff --git a/docs/en_US/Tutorial/InstallationWin.md b/docs/en_US/Tutorial/InstallationWin.md
index 9180492164..a9f3deb3e4 100644
--- a/docs/en_US/Tutorial/InstallationWin.md
+++ b/docs/en_US/Tutorial/InstallationWin.md
@@ -19,7 +19,7 @@ Anaconda or Miniconda is highly recommended to manage multiple Python environmen
 Prerequisites: `python 64-bit >=3.5`, `git`, `PowerShell`.

 ```bash
-  git clone -b v1.4 https://github.com/Microsoft/nni.git
+  git clone -b v1.5 https://github.com/Microsoft/nni.git
   cd nni
   powershell -ExecutionPolicy Bypass -file install.ps1
 ```
@@ -31,7 +31,7 @@ The following example is built on TensorFlow 1.x. Make sure **TensorFlow 1.x is
 * Download the examples via clone the source code.

   ```bash
-  git clone -b v1.4 https://github.com/Microsoft/nni.git
+  git clone -b v1.5 https://github.com/Microsoft/nni.git
   ```

 * Run the MNIST example.
@@ -136,4 +136,4 @@ Note:
 * [How to run an experiment on multiple machines?](../TrainingService/RemoteMachineMode.md)
 * [How to run an experiment on OpenPAI?](../TrainingService/PaiMode.md)
 * [How to run an experiment on Kubernetes through Kubeflow?](../TrainingService/KubeflowMode.md)
-* [How to run an experiment on Kubernetes through FrameworkController?](../TrainingService/FrameworkControllerMode.md)
\ No newline at end of file
+* [How to run an experiment on Kubernetes through FrameworkController?](../TrainingService/FrameworkControllerMode.md)
diff --git a/docs/en_US/conf.py b/docs/en_US/conf.py
index e657f2c274..89d0aabcdc 100644
--- a/docs/en_US/conf.py
+++ b/docs/en_US/conf.py
@@ -28,7 +28,7 @@
 # The short X.Y version
 version = ''
 # The full version, including alpha/beta/rc tags
-release = 'v1.4'
+release = 'v1.5'

 # -- General configuration ---------------------------------------------------

From 1bda4a1a383019dbd5c9cbfe09f2711d31417710 Mon Sep 17 00:00:00 2001
From: liuzhe
Date: Mon, 13 Apr 2020 13:16:35 +0800
Subject: [PATCH 2/5] update

---
 docs/en_US/Release.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/en_US/Release.md b/docs/en_US/Release.md
index 6ffdeeb38d..c783b190b4 100644
--- a/docs/en_US/Release.md
+++ b/docs/en_US/Release.md
@@ -7,7 +7,7 @@
 #### Hyper-Parameter Optimizing

 * New tuner: [Population Based Training (PBT)](https://github.com/microsoft/nni/blob/master/docs/en_US/Tuner/PBTTuner.md)
-* Trials can now report infinity, NaN, and string metrics
+* Trials can now report infinity and NaN as result, as well as metrics in string format

 #### Neural Architecture Search
@@ -24,7 +24,7 @@
 #### Training Service

 * Kubeflow now supports pytorchjob crd v1 (thanks @jiapinai)
-* Experimental DLTS support
+* Experimental [DLTS](https://github.com/microsoft/nni/blob/master/docs/en_US/TrainingService/DLTSMode.md) support

 #### Overall Documentation Improvement
@@ -39,7 +39,7 @@
 * Experiment cannot be killed if it failed on start (#2080)
 * A bug in lottery ticket pruner
 * Some glitches in web UI
-* And more
+* And more!

 ## Release 1.4 - 2/19/2020

From 80294a33944f3bc3990218a0974d68d5a9900bcd Mon Sep 17 00:00:00 2001
From: liuzhe
Date: Mon, 13 Apr 2020 14:02:14 +0800
Subject: [PATCH 3/5] update

---
 docs/en_US/Release.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/en_US/Release.md b/docs/en_US/Release.md
index c783b190b4..562c7c0b5c 100644
--- a/docs/en_US/Release.md
+++ b/docs/en_US/Release.md
@@ -7,7 +7,7 @@
 #### Hyper-Parameter Optimizing

 * New tuner: [Population Based Training (PBT)](https://github.com/microsoft/nni/blob/master/docs/en_US/Tuner/PBTTuner.md)
-* Trials can now report infinity and NaN as result, as well as metrics in string format
+* Trials can now report infinity and NaN as result

 #### Neural Architecture Search
@@ -37,9 +37,9 @@
 * NNI manager may exhaust heap memory (thanks @Sundrops)
 * Batch tuner does not support customized trials (#2075)
 * Experiment cannot be killed if it failed on start (#2080)
+* Non-number type metrics break web UI (#2278)
 * A bug in lottery ticket pruner
-* Some glitches in web UI
-* And more!
+* Other minor glitches

 ## Release 1.4 - 2/19/2020

From 68beb0b947ad6f8cd6617d8a258ac1cafc06f63b Mon Sep 17 00:00:00 2001
From: liuzhe
Date: Tue, 14 Apr 2020 10:32:34 +0800
Subject: [PATCH 4/5] add "external contributor"

---
 docs/en_US/Release.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/en_US/Release.md b/docs/en_US/Release.md
index 562c7c0b5c..cd419c7531 100644
--- a/docs/en_US/Release.md
+++ b/docs/en_US/Release.md
@@ -23,18 +23,18 @@
 #### Training Service

-* Kubeflow now supports pytorchjob crd v1 (thanks @jiapinai)
+* Kubeflow now supports pytorchjob crd v1 (thanks external contributor @jiapinai)
 * Experimental [DLTS](https://github.com/microsoft/nni/blob/master/docs/en_US/TrainingService/DLTSMode.md) support

 #### Overall Documentation Improvement

-* Documentation is significantly improved on grammar, spelling, and wording (thanks @AHartNtkn)
+* Documentation is significantly improved on grammar, spelling, and wording (thanks external contributor @AHartNtkn)

 ### Fixed Bugs

-* ENAS cannot have more than one LSTM layer (thanks @marsggbo)
-* NNI manager's timers will never unsubscribe (thanks @guilhermehn)
-* NNI manager may exhaust heap memory (thanks @Sundrops)
+* ENAS cannot have more than one LSTM layer (thanks external contributor @marsggbo)
+* NNI manager's timers will never unsubscribe (thanks external contributor @guilhermehn)
+* NNI manager may exhaust heap memory (thanks external contributor @Sundrops)
 * Batch tuner does not support customized trials (#2075)
 * Experiment cannot be killed if it failed on start (#2080)
 * Non-number type metrics break web UI (#2278)

From 2c55bad9c5317a4ff5885b93673f98675449fe23 Mon Sep 17 00:00:00 2001
From: liuzhe
Date: Tue, 14 Apr 2020 11:14:51 +0800
Subject: [PATCH 5/5] update by quanlu

---
 docs/en_US/Release.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/en_US/Release.md b/docs/en_US/Release.md
index cd419c7531..8d44d8616b 100644
--- a/docs/en_US/Release.md
+++ b/docs/en_US/Release.md
@@ -18,6 +18,7 @@
 * New Pruner: [GradientRankFilterPruner](https://github.com/microsoft/nni/blob/master/docs/en_US/Compressor/Pruner.md#gradientrankfilterpruner)
 * Compressors will validate configuration by default
+* Refactor: added optimizer as an input argument of pruners, for easy support of DataParallel and more efficient iterative pruning. This is a breaking change to the usage of iterative pruning algorithms.
 * Model compression examples are refactored and improved
 * Added documentation for [implementing compressing algorithm](https://github.com/microsoft/nni/blob/master/docs/en_US/Compressor/Framework.md)
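One release note above — "Compressors will validate configuration by default" — lends itself to a short sketch. The helper below is a hypothetical illustration of that kind of up-front check; it is not NNI's actual validator. The field names (`sparsity`, `op_types`) follow the `config_list` convention used in NNI's compression documentation.

```python
# Illustrative sketch (not NNI's real code): validate a compressor
# config_list up front, so a malformed entry fails fast instead of
# surfacing as a confusing error deep inside pruning.

def validate_config_list(config_list):
    """Raise ValueError on the first invalid entry; return the list otherwise."""
    if not isinstance(config_list, list) or not config_list:
        raise ValueError("config_list must be a non-empty list")
    for i, cfg in enumerate(config_list):
        sparsity = cfg.get("sparsity")
        # A sparsity ratio only makes sense strictly between 0 and 1.
        if not isinstance(sparsity, (int, float)) or not 0.0 < sparsity < 1.0:
            raise ValueError(f"entry {i}: sparsity must be in (0, 1), got {sparsity!r}")
        op_types = cfg.get("op_types", [])
        if not all(isinstance(t, str) for t in op_types):
            raise ValueError(f"entry {i}: op_types must be a list of strings")
    return config_list

# A well-formed config passes through unchanged...
validate_config_list([{"sparsity": 0.5, "op_types": ["Conv2d"]}])

# ...while an out-of-range sparsity is rejected before any pruning runs.
try:
    validate_config_list([{"sparsity": 1.5, "op_types": ["Conv2d"]}])
except ValueError as e:
    print(e)  # entry 0: sparsity must be in (0, 1), got 1.5
```

The design choice mirrored here is simply that validation happens by default at construction time rather than being opt-in, which is what the changelog entry describes.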