Commit 8706a22
Fix load_from_checkpoint docs (Lightning-AI#978)
We don't (yet) support storing hparams as a dict. It *must* be an
`argparse.Namespace` for checkpoint saving and loading to work.
neggert authored and tullie committed Apr 3, 2020
1 parent 9e3d7ab commit 8706a22
Showing 1 changed file with 0 additions and 9 deletions.
9 changes: 0 additions & 9 deletions pytorch_lightning/core/lightning.py
@@ -1168,15 +1168,6 @@ class MyModel(LightningModule):
                 def __init__(self, hparams):
                     self.learning_rate = hparams.learning_rate
 
-            # --------------
-            # Case 2
-            # when using a dict
-            model = MyModel({'learning_rate': 0.1})
-
-            class MyModel(LightningModule):
-                def __init__(self, hparams):
-                    self.learning_rate = hparams['learning_rate']
-
         Args:
             checkpoint_path (str): Path to checkpoint.
             map_location (dict | str | torch.device | function):
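For reference, a minimal sketch of the pattern the commit message says is supported: hparams passed as an `argparse.Namespace` so that checkpoint saving and loading round-trip. The model body, `learning_rate` value, and checkpoint path are illustrative and reflect the Lightning 0.7-era API, not code from this commit.

from argparse import Namespace

import torch
from pytorch_lightning import LightningModule


class MyModel(LightningModule):
    def __init__(self, hparams):
        super().__init__()
        # hparams must be an argparse.Namespace (not a dict) for
        # checkpoint saving/loading to work, per this commit.
        self.hparams = hparams
        self.learning_rate = hparams.learning_rate
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)


# Supported usage: a Namespace, not a plain dict.
hparams = Namespace(learning_rate=0.1)
model = MyModel(hparams)

# Restoring later; the checkpoint path is a placeholder.
# model = MyModel.load_from_checkpoint('path/to/checkpoint.ckpt')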
