Tensorboard Docu about Hyperparams saving (#5158)
* Add documentation to tensorboard

* Remove unnecessary whitespaces

* Update pytorch_lightning/loggers/tensorboard.py

Co-authored-by: Jirka Borovec <[email protected]>

* Add metrics to tensorboard logger

* Whitespace removed

Co-authored-by: Jirka Borovec <[email protected]>
Skyy93 and Borda authored Jan 15, 2021
1 parent d15f7a0 commit d62ca82
Showing 1 changed file with 15 additions and 2 deletions: pytorch_lightning/loggers/tensorboard.py
@@ -144,8 +144,21 @@ def experiment(self) -> SummaryWriter:
         return self._experiment
 
     @rank_zero_only
-    def log_hyperparams(self, params: Union[Dict[str, Any], Namespace],
-                        metrics: Optional[Dict[str, Any]] = None) -> None:
+    def log_hyperparams(
+        self,
+        params: Union[Dict[str, Any], Namespace],
+        metrics: Optional[Dict[str, Any]] = None,
+    ) -> None:
+        """
+        Record hyperparameters. TensorBoard logs with and without saved hyperparameters
+        are incompatible; the hyperparameters are then not displayed in TensorBoard.
+        Please delete or move the previously saved logs to display the new ones with hyperparameters.
+
+        Args:
+            params: a dictionary-like container with the hyperparameters
+            metrics: dictionary with metric names as keys and measured quantities as values
+        """
+
         params = self._convert_params(params)
 
         # store params to output
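For context, a minimal usage sketch of the documented signature (not part of this commit); the logger arguments, hyperparameter values, and metric names below are illustrative placeholders:

# Minimal sketch, assuming a standard PyTorch Lightning install; all values are placeholders.
from pytorch_lightning.loggers import TensorBoardLogger

logger = TensorBoardLogger(save_dir="lightning_logs", name="hparams_demo")

# Passing a metrics dict alongside the hyperparameters lets TensorBoard's
# HPARAMS tab associate this run's hyperparameters with measured quantities.
logger.log_hyperparams(
    params={"learning_rate": 1e-3, "batch_size": 32},
    metrics={"hp/val_loss": 0.25},
)

As the new docstring notes, TensorBoard logs written with and without saved hyperparameters are incompatible, so such a run should point at a fresh log directory.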
