
[tune](deps): Bump pytorch-lightning from 1.0.3 to 1.3.7.post0 in /python/requirements #34

Conversation

dependabot[bot]

@dependabot dependabot bot commented on behalf of github Jun 26, 2021

Bumps pytorch-lightning from 1.0.3 to 1.3.7.post0.
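Dependabot applies this bump by editing the pinned version in the requirements file. A minimal manual equivalent, assuming a plain `requirements.txt`-style pin (the file name here is illustrative; the PR touches `/python/requirements`):

```shell
# Create a file with the old pin (stand-in for the real requirements file)
printf 'pytorch-lightning==1.0.3\n' > requirements.txt

# Rewrite the pin to the new version, as this PR does
sed -i 's/pytorch-lightning==1\.0\.3/pytorch-lightning==1.3.7.post0/' requirements.txt

cat requirements.txt
# pytorch-lightning==1.3.7.post0
```

After updating the pin, reinstalling with `pip install -r requirements.txt` would pull in the new version.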

Release notes

Sourced from pytorch-lightning's releases.

Hotfix Patch Release

[1.3.7post0] - 2021-06-23

Fixed

  • Fixed backward compatibility of moved functions rank_zero_warn and rank_zero_deprecation (#8085)

Contributors

@kaushikb11 @carmocca

Standard weekly patch release

[1.3.7] - 2021-06-22

Fixed

  • Fixed a bug where skipping an optimizer while using AMP caused AMP to trigger an assertion error (#7975)
  • Fixed deprecation messages not showing due to incorrect stacklevel (#8002, #8005)
  • Fixed setting a DistributedSampler when using a distributed plugin in a custom accelerator (#7814)
  • Improved PyTorchProfiler chrome traces names (#8009)
  • Fixed moving the best score to device in EarlyStopping callback for TPU devices (#7959)

Contributors

@yifuwang @kaushikb11 @ajtritt @carmocca @tchaton

Standard weekly patch release

[1.3.6] - 2021-06-15

Fixed

  • Fixed logs overwriting issue for remote filesystems (#7889)
  • Fixed an issue where DataModule.prepare_data could only be called on the global rank 0 process (#7945)
  • Fixed setting worker_init_fn to seed dataloaders correctly when using DDP (#7942)
  • Fixed BaseFinetuning callback to properly handle parent modules with parameters (#7931)

Contributors

@awaelchli @Borda @kaushikb11 @Queuecumber @SeanNaren @senarvi @speediedan

Standard weekly patch release

[1.3.5] - 2021-06-08

Added

  • Added warning to Training Step output (#7779)

Fixed

... (truncated)

Changelog

Sourced from pytorch-lightning's changelog.

Changelog

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog.

[1.4.0] - 2021-MM-DD

Added

  • Added support for named parameter groups in LearningRateMonitor (#7987)

  • Added dataclass support for pytorch_lightning.utilities.apply_to_collection (#7935)

  • Added support to LightningModule.to_torchscript for saving to custom filesystems with fsspec (#7617)

  • Added KubeflowEnvironment for use with the PyTorchJob operator in Kubeflow

  • Added LightningCLI support for config files on object stores (#7521)

  • Added ModelPruning(prune_on_train_epoch_end=True|False) to choose when to apply pruning (#7704)

  • Added support for checkpointing based on a provided time interval during training (#7515)

  • Added dataclasses for progress tracking (#6603, #7574)

  • Added support for passing a LightningDataModule positionally as the second argument to trainer.{validate,test,predict} (#7431)

  • Added argument trainer.predict(ckpt_path) (#7430)

  • Added clip_grad_by_value support for TPUs (#7025)

  • Added support for passing any class to is_overridden (#7918)

  • Added sub_dir parameter to TensorBoardLogger (#6195)

... (truncated)
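One of the 1.4.0 items above extends pytorch_lightning.utilities.apply_to_collection to dataclasses. To make the entry concrete, here is a simplified, self-contained sketch of what that utility does — recursively apply a function to every element of a nested collection matching a given type. This is an illustrative re-implementation, not the library's actual code:

```python
from dataclasses import dataclass, fields, is_dataclass, replace


def apply_to_collection(data, dtype, func):
    """Simplified sketch of pytorch_lightning.utilities.apply_to_collection:
    recursively apply `func` to every element of `data` that is an
    instance of `dtype`, preserving the container structure."""
    if isinstance(data, dtype):
        return func(data)
    # Dataclass support (the 1.4.0 addition): rebuild the instance with
    # each field transformed recursively.
    if is_dataclass(data) and not isinstance(data, type):
        return replace(data, **{
            f.name: apply_to_collection(getattr(data, f.name), dtype, func)
            for f in fields(data)
        })
    if isinstance(data, dict):
        return {k: apply_to_collection(v, dtype, func) for k, v in data.items()}
    if isinstance(data, (list, tuple)):
        return type(data)(apply_to_collection(v, dtype, func) for v in data)
    return data


@dataclass
class Batch:  # hypothetical dataclass, for illustration only
    x: int
    meta: str


print(apply_to_collection([1, (2, 3)], int, lambda v: v + 1))
# [2, (3, 4)]
print(apply_to_collection({"b": Batch(x=2, meta="m")}, int, lambda v: v * 10))
# {'b': Batch(x=20, meta='m')}
```

The real utility accepts extra arguments (e.g. `*args` forwarded to the function), but the recursion pattern is the core idea.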

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
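Besides the per-PR comment commands above, the same behavior can be configured repo-wide in a Dependabot config file. A hypothetical sketch (this is not the repository's actual configuration; the `directory` value mirrors the `/python/requirements` path from this PR):

```yaml
# .github/dependabot.yml — illustrative only
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/python/requirements"
    schedule:
      interval: "weekly"
    # Config-file equivalent of commenting "@dependabot ignore this minor version"
    ignore:
      - dependency-name: "pytorch-lightning"
        update-types: ["version-update:semver-minor"]
```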

Bumps [pytorch-lightning](https://github.com/PyTorchLightning/pytorch-lightning) from 1.0.3 to 1.3.7.post0.
- [Release notes](https://github.com/PyTorchLightning/pytorch-lightning/releases)
- [Changelog](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)
- [Commits](Lightning-AI/pytorch-lightning@1.0.3...1.3.7post0)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Jun 26, 2021

dependabot bot commented on behalf of github Jul 3, 2021

Superseded by #35.

@dependabot dependabot bot closed this Jul 3, 2021
@dependabot dependabot bot deleted the dependabot/pip/python/requirements/pytorch-lightning-1.3.7.post0 branch July 3, 2021 07:03