Trainer and Transformers wrapper package for simplified sequence classification fine-tuning
looptune

This package wraps a Transformers Trainer fine-tuning pipeline to further simplify the process of iterated fine-tuning.

Installation:

  1. Install the repository:
    • from a local copy: pip install -e <path to looptune repo main dir>/. (for example, if running from a notebook in the notebooks dir, that would be ../.)
    • from git: pip install git+https://github.com/japarty/looptune
  2. If you want to run on a GPU (as far as I know, only Nvidia is supported), install torch according to the instructions at https://pytorch.org/
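After step 2, a quick way to confirm that a CUDA-enabled torch build was picked up is a small sanity check. This is a sketch, not part of looptune itself; it only imports torch if it is actually installed:

```python
import importlib.util

def cuda_available():
    """Return True only if torch is installed and detects a CUDA device."""
    if importlib.util.find_spec("torch") is None:
        return False  # torch is not installed at all
    import torch
    return torch.cuda.is_available()

print("CUDA ready:", cuda_available())
```

If this prints False even though you have an Nvidia GPU, you most likely installed a CPU-only torch wheel and should reinstall following the selector on https://pytorch.org/.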

Notebooks

  1. finetune_example - simple fine-tuning of a model for multilabel emotion classification; actively changed, as it is also used for testing features to keep the example up to date
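Multilabel emotion classification, as in the finetune_example notebook, needs each sample's labels encoded as a multi-hot vector rather than a single class id. A minimal sketch of that encoding; the function and label names here are illustrative, not part of looptune's API:

```python
def multi_hot(active, label_list):
    """Encode the active labels as a multi-hot float vector over label_list."""
    index = {name: i for i, name in enumerate(label_list)}
    vec = [0.0] * len(label_list)
    for name in active:
        vec[index[name]] = 1.0
    return vec

EMOTIONS = ["anger", "fear", "joy", "sadness"]
multi_hot(["joy", "sadness"], EMOTIONS)  # [0.0, 0.0, 1.0, 1.0]
```

Float values are used because multilabel models are typically trained with a sigmoid-based loss (e.g. BCEWithLogitsLoss), which expects float targets.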

Notes

For now it is limited to sequence classification (or at least that is the only task that has been tested).

Ultimately, I want to provide it as a dockerized webapp.
