
[EZ] Pass seed to data sampler. #2266

Open · wants to merge 2 commits into main
Conversation

EugenHotaj (Contributor)

We currently hardcode the data sampler seed to zero. I assume this is not intentional.
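For concreteness, a minimal self-contained sketch of the change (the toy dataset and cfg_seed value are hypothetical; num_replicas and rank are passed explicitly so the snippet runs without torch.distributed being initialized):

    import torch
    from torch.utils.data import DistributedSampler

    class ToyDataset(torch.utils.data.Dataset):
        def __len__(self):
            return 8
        def __getitem__(self, i):
            return i

    cfg_seed = 1234  # hypothetical value read from the recipe config
    # Before this PR the sampler seed was effectively hardcoded:
    #   DistributedSampler(..., seed=0)
    # After: thread the configured seed through.
    sampler = DistributedSampler(
        ToyDataset(), num_replicas=1, rank=0, shuffle=True, seed=cfg_seed
    )
    print(list(sampler))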

Context

What is the purpose of this PR? Is it to

  • add a new feature
  • fix a bug
  • update tests and/or documentation
  • other (please add here)

Please link to any issues this PR addresses.

Changelog

What are the changes made in this PR?

  • As titled

Test plan

Please make sure to do each of the following if applicable to your PR. If you're unsure about any of these, just ask and we will happily help. We also have a contributing page for some guidance on contributing.

  • run pre-commit hooks and linters (make sure you've first installed via pre-commit install)
  • add unit tests for any new functionality
  • update docstrings for any new or updated methods or classes
  • run unit tests via pytest tests
  • run recipe tests via pytest tests -m integration_test
  • manually run any new or modified recipes with sufficient proof of correctness
  • include relevant commands and any other artifacts in this summary (pastes of loss curves, eval results, etc.)

UX

If your function changed a public API, please add a dummy example of what the user experience will look like when calling it.
Here is a docstring example
and a tutorial example

  • I did not change any public API
  • I have added an example to docs or docstrings


pytorch-bot commented Jan 14, 2025

🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/2266

@facebook-github-bot added the "CLA Signed" label (authors need to sign the CLA before a PR can be reviewed) · Jan 14, 2025
@RdoubleA (Contributor)

Getting the seed from self.seed is risky, since we use self.seed = training.set_seed() in our recipes, which will actually return a different seed on every rank. I am not sure whether the sampler should have the same seed across ranks, but I imagine that to keep all ranks from shuffling the data differently, we would want the same seed. It would be better to set it from cfg.seed directly so the data seed stays the same across all ranks.

Tagging @andrewkho for thoughts on setting the seed in DistributedSampler.
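To make the concern concrete, here is a self-contained sketch (toy dataset and seed values are hypothetical) of why the sampler seed should match across ranks: DistributedSampler draws one global permutation from its seed and then slices it per rank, so mismatched seeds break the disjoint partition of the data.

    import torch
    from torch.utils.data import DistributedSampler

    class ToyDataset(torch.utils.data.Dataset):
        def __len__(self):
            return 8
        def __getitem__(self, i):
            return i

    def rank_indices(seed_for_rank):
        out = []
        for rank in range(2):
            s = DistributedSampler(ToyDataset(), num_replicas=2, rank=rank,
                                   shuffle=True, seed=seed_for_rank(rank))
            s.set_epoch(0)
            out.append(list(s))
        return out

    # Same seed on every rank: disjoint slices of one shared permutation.
    print(rank_indices(lambda rank: 42))
    # Per-rank seeds: ranks disagree, so indices can repeat or go missing.
    print(rank_indices(lambda rank: 42 + rank))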

@EugenHotaj (Contributor, Author)

@RdoubleA I might be misunderstanding the code, but I thought set_seed returned the seed you passed in from the config (or, if you don't pass anything, a global seed). My intention here is just to pass in the seed from the config. Let me know if there is a better way to do it (maybe pass in cfg.seed directly?).

@RdoubleA (Contributor)

Ah, my bad, you're right. It returns the global seed, not the local seed. In that case this should be OK.
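For readers following along, an illustrative sketch of the distinction being discussed. This is not torchtune's actual set_seed implementation, and the per-rank offset is an assumption for illustration only:

    import random
    from typing import Optional

    import torch

    def set_seed(seed: Optional[int] = None, rank: int = 0) -> int:
        # Illustrative sketch only, not torchtune's code.
        if seed is None:
            seed = random.randint(1, 2**31 - 1)  # global seed (assumed shared across ranks)
        local_seed = seed + rank                 # assumed per-rank offset for distinct RNG streams
        random.seed(local_seed)
        torch.manual_seed(local_seed)
        return seed                              # the global seed, not the local one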

@RdoubleA (Contributor)

@EugenHotaj actually, we might need to do something like this to get the seed directly from the config while maintaining BC with existing configs:
seed=cfg.get("seed", None) or 0
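The "or 0" fallback is what preserves BC here: existing configs may omit seed or set it to null, in which case cfg.get returns None, which the sampler cannot take. A quick sketch with plain dicts standing in for the config object:

    # Plain dicts standing in for the config object used in the recipes.
    old_cfg = {"shuffle": True}                  # existing config with no seed key
    new_cfg = {"shuffle": True, "seed": 1234}

    for cfg in (old_cfg, new_cfg):
        seed = cfg.get("seed", None) or 0        # None (missing or null) falls back to 0
        print(seed)                              # prints 0, then 1234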

@EugenHotaj (Contributor, Author)

@RdoubleA done
