
Fix bug when subset of model parameters is passed into optimizer with FSDP #3502

Merged
16 commits merged into dev from sasha/fix-subset-opt-group on Aug 1, 2024

Conversation

@sashaDoubov (Contributor) commented Jul 30, 2024

What does this PR do?

This PR addresses #3493, where we were adding all model parameters to the optimizer state after FSDP wrapping. This change restores the previous flow for the case of more than one param group, where we:

  1. Build the mapping between the params in the optimizer's param groups and their names, using the tensors' pointers (obtained with id()).
  2. Store the extra param-group info, such as learning rate, weight decay, etc., in a separate dict.
  3. Iterate over the FSDP-wrapped parameters and repopulate the param_groups based on the names of the FSDP-wrapped tensors and the mapping from step 1.

This change relies on use_orig_params=True, so an error is thrown if a subset of parameters is passed into the param_groups without it.

NOTE: at the moment, tensor parallelism breaks with multiple param_groups (since the pointers of the TP'd model.parameters() do not match the pointers of the elements in the optimizer's param_groups), so a special case was added to handle this.
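For illustration, here is a minimal sketch of the flow described above. It is not the actual Composer implementation: the helper names `build_name_mapping` and `repopulate_param_groups`, and the direct reassignment of `optimizer.param_groups`, are hypothetical simplifications, and it assumes the model is wrapped with `FSDP(use_orig_params=True)` so that the wrapped module still exposes the original parameter names via `named_parameters()`.

```python
def build_name_mapping(model, optimizer):
    """Steps 1-2 (before FSDP wrapping): map each optimizer param to its name via
    the tensor's pointer (id), and save each group's extra info (lr, weight decay, ...)."""
    name_by_ptr = {id(p): name for name, p in model.named_parameters()}
    saved_groups = []
    for group in optimizer.param_groups:
        hparams = {k: v for k, v in group.items() if k != 'params'}
        names = {name_by_ptr[id(p)] for p in group['params']}
        saved_groups.append((hparams, names))
    return saved_groups


def repopulate_param_groups(fsdp_model, optimizer, saved_groups):
    """Step 3 (after wrapping with FSDP(use_orig_params=True)): rebuild the
    param_groups by matching the FSDP-wrapped parameters to the saved names."""
    wrapped = dict(fsdp_model.named_parameters())
    optimizer.param_groups = [
        {'params': [p for name, p in wrapped.items() if name in names], **hparams}
        for hparams, names in saved_groups
    ]


# Usage sketch (hypothetical):
#   saved = build_name_mapping(model, optimizer)
#   fsdp_model = FSDP(model, use_orig_params=True)
#   repopulate_param_groups(fsdp_model, optimizer, saved)
```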

What issue(s) does this change relate to?

Fixes #3493

Before submitting

  • Have you read the contributor guidelines?
  • Is this change a documentation change or typo fix? If so, skip the rest of this checklist.
  • Was this change discussed/approved in a GitHub issue first? It is much more likely to be merged if so.
  • Did you update any related docs and document your change?
  • Did you update any related tests and add any new tests related to your change? (see testing)
  • Did you run the tests locally to make sure they pass?
  • Did you run pre-commit on your change? (see the pre-commit section of prerequisites)

@sashaDoubov sashaDoubov marked this pull request as ready for review July 31, 2024 19:24
@sashaDoubov sashaDoubov requested a review from dakinggg July 31, 2024 19:27
composer/distributed/dist_strategy.py — 3 review threads (outdated, resolved)
@mvpatel2000 (Contributor) left a comment
LGTM!

Why are the gc.collect() calls necessary? Will approve after that.

tests/trainer/test_tp.py — 2 review threads (outdated, resolved)
@sashaDoubov (Contributor, Author)

@mvpatel2000 I've now removed the gc.collect() calls, including the old one in test_fsdp_param_groups.

@mvpatel2000 mvpatel2000 merged commit 08b5731 into dev Aug 1, 2024
14 checks passed
@mvpatel2000 mvpatel2000 deleted the sasha/fix-subset-opt-group branch August 1, 2024 21:33

Successfully merging this pull request may close these issues.

FSDP Wrapping Alters Optimizer's Parameter Tracking Behavior