
Releases: deeperlearner/pytorch-template

v5.0.0

28 Oct 06:23

This release is not backward compatible.
The main change is support for multiple trainers: the "trainers" section of the config can now specify more than one trainer.
This enables workflows such as a pretrain-then-finetune pipeline.
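A minimal sketch of what a multi-trainer config might look like; the entry names ("pretrainer", "finetuner") and fields here are hypothetical, not taken from the template:

```json
{
    "trainers": {
        "pretrainer": {
            "module": "trainers.trainer",
            "type": "Trainer",
            "kwargs": { "epochs": 50, "monitor": "min val_loss" }
        },
        "finetuner": {
            "module": "trainers.trainer",
            "type": "Trainer",
            "kwargs": { "epochs": 20, "monitor": "min val_loss" }
        }
    }
}
```

The idea is that the trainers run in order, so a finetuning stage can start from the pretraining stage's checkpoint.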

v4.1.0

12 Sep 08:00

This release supports multi-process training for k-fold cross-validation, and the multiprocessing can be combined with Optuna.
Multiprocessing over the k folds is especially effective for recurrent neural networks (RNNs).
I tested on my own data with a GRU model: multiprocessing cut Optuna training time by 54%!
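As a rough illustration of the idea (not the template's actual code), the k folds can be trained concurrently in a pool of worker processes; `make_folds` and `train_one_fold` below are made-up stand-ins for the real data split and training loop:

```python
from multiprocessing import Pool

def make_folds(n_samples, k):
    """Assign each sample index to one of k validation folds (round-robin)."""
    folds = [[] for _ in range(k)]
    for i in range(n_samples):
        folds[i % k].append(i)
    return folds

def train_one_fold(args):
    """Stand-in for training on one fold: returns (fold_id, dummy metric).
    In a real setup this would build the model, train on the non-validation
    indices, and evaluate on val_idx."""
    fold_id, val_idx = args
    return fold_id, len(val_idx)

if __name__ == "__main__":
    folds = make_folds(100, 5)
    # One worker per fold, so the folds train in parallel instead of one by one.
    with Pool(processes=len(folds)) as pool:
        results = pool.map(train_one_fold, list(enumerate(folds)))
    print(results)
```

Since each fold's training is independent of the others, this parallelism changes nothing about the results, only the wall-clock time.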

v3.0.1

16 Aug 09:24

This version provides more complete support for Optuna and the tester.
Different kinds of tasks can now be defined in tune/objective.py and testers/tester.py.

v1.0.0

22 Jul 06:35

First release of pytorch-template.