Disruption prediction using Auton-Survival
Note
Tested to work with Python 3.10.7 on Windows and Linux
- Clone this repo
cd ~/projects
git clone https://github.com/MIT-PSFC/disruption-survival-analysis.git
- Make a new virtual environment and verify it is using the correct Python
cd disruption-survival-analysis
python -m venv .venv
source .venv/bin/activate   # Linux; on Windows use: source .venv/Scripts/activate
which python
python --version
- Install dependencies
pip install -r requirements.txt
- Run tests to ensure the install is working properly
python -m unittest tests/test_*
Datasets used in this repo take the following form:
shot, time, time_until_disrupt, feature1, feature2, ...
The first three columns are required.
- shot: shot number
- time: measurement time of features (in seconds)
- time_until_disrupt:
- disruptive shots: Time until disruption
- non-disruptive shots: NaN
The ordering of the columns and rows makes no difference; the library should handle it as long as all the data is there.
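As a quick illustration (not part of the repo), a dataset in this form can be loaded and inspected with pandas; the file name below is a placeholder.

```python
import pandas as pd

# Placeholder file name; any CSV with the columns described above works.
data = pd.read_csv("example_dataset.csv")

# The three required columns; every other column is treated as a feature.
required = ["shot", "time", "time_until_disrupt"]
assert all(col in data.columns for col in required), "missing required columns"
feature_columns = [c for c in data.columns if c not in required]

# A shot is non-disruptive when all of its time_until_disrupt values are NaN.
is_disruptive = data.groupby("shot")["time_until_disrupt"].apply(lambda s: s.notna().any())
print(is_disruptive.sum(), "disruptive shots,", (~is_disruptive).sum(), "non-disruptive shots")
```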
To add a new dataset, follow the instructions in Make Datasets.ipynb
Follow the instructions in Write Sweep Configs.ipynb, or use write_sweep_configs.py
After sweep configs are generated, run the following command to start a hyperparameter tuning session:
python optuna_job.py models/[device]/[dataset_path]/[sweep].yaml
Open multiple terminals and execute the command in each to have several jobs performing sweeps at once (a generic sketch of how parallel workers share one study follows after this list).
- you will probably need to re-activate the virtual environment in each terminal
- scripts to execute many jobs via SLURM coming soon
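For reference, parallel sweeps work because Optuna lets every worker attach to the same study through a shared storage URL. The sketch below is generic and not the contents of optuna_job.py; the study name, storage path, objective, and search space are placeholders.

```python
import optuna

def objective(trial):
    # Placeholder objective; the real sweep trains a survival model with
    # hyperparameters drawn from the sweep config and returns a validation metric.
    lr = trial.suggest_float("learning_rate", 1e-4, 1e-1, log=True)
    return lr

# Every process that uses the same study_name and storage URL contributes
# trials to the same study, which is what lets several terminals run the
# sweep command at once.
study = optuna.create_study(
    study_name="example_sweep",
    storage="sqlite:///models/example_device/example_dataset/example_sweep.db",
    load_if_exists=True,
)
study.optimize(objective, n_trials=10)
```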
To view the results of the hyperparameter tuning trials, execute the following command:
optuna-dashboard sqlite:///models/[device]/[dataset]/[study].db
Warning
The present implementation uses SQLite3, which may suffer performance issues when doing many parallel runs. This shouldn't be the bottleneck, as I expect model training to take significantly longer than database writes, but to avoid the possibility of deadlock we may switch to MySQL in the future.
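If that switch happens, the change on the Optuna side would typically just be the storage URL, since Optuna and optuna-dashboard both accept any storage URL supported by Optuna's RDB storage. The URLs below are hypothetical; the MySQL form assumes a PyMySQL driver and an existing database.

```python
# SQLite backend (current approach): a single file on disk, simple but lock-prone.
sqlite_url = "sqlite:///models/example_device/example_dataset/example_sweep.db"

# MySQL backend (possible future approach): better suited to many parallel writers.
mysql_url = "mysql+pymysql://user:password@localhost/optuna_studies"
```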
Open the Run Experiments.ipynb notebook.
Set the properties of the models whose hyperparameters have been tuned.
Run all cells and look at the graphs.