【paddle.fleet】fleet add _get_applied_meta_list and _get_applied_graph_list #27952

Merged 3 commits into PaddlePaddle:develop from wangxicoding:add_applied_meta_list on Oct 16, 2020

Conversation

Contributor

@wangxicoding commented Oct 14, 2020

PR types

Others

PR changes

APIs

Describe

Add the _get_applied_meta_list and _get_applied_graph_list interfaces to fleet, which return the list of applied meta optimizers and the list of applied graph optimizers, respectively. Also fix the auto amp_configs decr_ratio.
e.g.

import paddle
import paddle.distributed.fleet as fleet

paddle.enable_static()

fleet.init(is_collective=True)

input_x = paddle.fluid.layers.data(name="x", shape=[32], dtype='float32')
input_y = paddle.fluid.layers.data(name="y", shape=[1], dtype='int64')

fc_1 = paddle.fluid.layers.fc(input=input_x, size=64, act='tanh')
fc_2 = paddle.fluid.layers.fc(input=fc_1, size=256, act='tanh')
prediction = paddle.fluid.layers.fc(input=[fc_2], size=2, act='softmax')
cost = paddle.fluid.layers.cross_entropy(input=prediction, label=input_y)
avg_cost = paddle.fluid.layers.mean(x=cost)

# Enable the amp, recompute, and lars meta optimizers via DistributedStrategy.
strategy = paddle.distributed.fleet.DistributedStrategy()
strategy.amp = True
strategy.recompute = True
strategy.recompute_configs = {"checkpoints": ["fc_0.tmp_2", "fc_1.tmp_2"]}
strategy.lars = True

optimizer = paddle.fluid.optimizer.Momentum(learning_rate=0.01, momentum=0.9)
optimizer = fleet.distributed_optimizer(optimizer, strategy=strategy)
optimizer.minimize(avg_cost)

# Query which meta optimizers were actually applied during minimize().
applied_meta_list = fleet._get_applied_meta_list()
print("applied_meta_list: {}".format(applied_meta_list))

This prints applied_meta_list: ['AMPOptimizer', 'RecomputeOptimizer', 'LarsOptimizer'], which means the meta optimizer path is amp --> recompute --> lars.
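
The PR also adds _get_applied_graph_list for the graph optimizer side. As a minimal sketch, assuming its call signature mirrors _get_applied_meta_list (no arguments), it can be queried the same way after minimize(); which graph optimizers appear depends on what the strategy actually enabled, so the output is illustrative only.

# Assumed usage, continuing the example above: list the applied graph optimizers.
applied_graph_list = fleet._get_applied_graph_list()
print("applied_graph_list: {}".format(applied_graph_list))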

@paddle-bot-old

Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

Member

@guru4elephant left a comment

LGTM

@wangxicoding merged commit fb641c9 into PaddlePaddle:develop on Oct 16, 2020
wangxicoding added a commit to wangxicoding/Paddle that referenced this pull request Oct 19, 2020
guru4elephant pushed a commit that referenced this pull request Oct 21, 2020
@wangxicoding deleted the add_applied_meta_list branch on November 26, 2020