Support imperative learning rate scheduler #15584
Conversation
velconia commented on Jan 29, 2019 (edited):
- move the imperative MNIST unit test into test_imperative_mnist
- add optimizer lr scheduler unit tests into test_imperative_optimizer.py
- implement the lr scheduler in imperative mode
__all__ = ['PiecewiseDecay']


class LearningRateDecay(object):
Does this support the static graph? Why not use the existing lr_scheduler?
The static and dynamic learning rates are both used via the interface in learning_rate_scheduler.py under the fluid directory.
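
For context, here is a minimal, framework-free sketch of the piecewise-decay logic being discussed. It is an illustration only, not the PR's actual code; in the real implementation step() would return a learning-rate Variable rather than a plain float:

class LearningRateDecay(object):
    """Base class: tracks a global step and returns the current lr."""

    def __init__(self, begin=0, step=1):
        self.step_num = begin   # current global step
        self.step_size = step   # increment applied on every call

    def __call__(self):
        lr = self.step()        # subclass computes the current lr
        self.step_num += self.step_size
        return lr

    def step(self):
        raise NotImplementedError()


class PiecewiseDecay(LearningRateDecay):
    """Yields values[i] while step_num < boundaries[i], else the last value."""

    def __init__(self, boundaries, values, begin=0, step=1):
        super(PiecewiseDecay, self).__init__(begin, step)
        assert len(values) == len(boundaries) + 1
        self.boundaries = boundaries
        self.values = values

    def step(self):
        for i, boundary in enumerate(self.boundaries):
            if self.step_num < boundary:
                return self.values[i]
        return self.values[-1]

For example, with boundaries=[100, 200] and values=[1.0, 0.5, 0.1], successive calls return 1.0 for steps 0-99, 0.5 for steps 100-199, and 0.1 afterwards.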
python/paddle/fluid/optimizer.py (Outdated)
# create learning rate Variable
if isinstance(self._learning_rate, float):
    self._learning_rate_map[framework.default_main_program(
    )] = layers.create_global_var(
Can this be created many times?
No, this should only be called once, in Optimizer.__init__.
Done
        persistable=True)
# get learning rate Variable from LearningRateDecay
elif isinstance(self._learning_rate, LearningRateDecay):
    self._learning_rate_map[framework.default_main_program(
Why use the main program here?
To keep it consistent with the static-mode code.
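
Putting the two diff fragments above together, the full branch might read as follows. The middle arguments to create_global_var are elided in the diff, so the name, shape, value, and dtype below are assumptions, as is invoking the scheduler object to obtain its Variable:

# create learning rate Variable (keyed by the default main program,
# mirroring the static-mode code; created once in Optimizer.__init__)
if isinstance(self._learning_rate, float):
    self._learning_rate_map[framework.default_main_program(
    )] = layers.create_global_var(
        name=unique_name.generate("learning_rate"),  # assumed argument
        shape=[1],                                   # assumed argument
        value=float(self._learning_rate),            # assumed argument
        dtype='float32',                             # assumed argument
        persistable=True)
# get learning rate Variable from LearningRateDecay
elif isinstance(self._learning_rate, LearningRateDecay):
    self._learning_rate_map[framework.default_main_program(
    )] = self._learning_rate()  # assumed: calling the scheduler yields the Variable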
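
Assuming the PR exposes PiecewiseDecay through the imperative package, end-to-end usage might look like the sketch below. The module path fluid.imperative.PiecewiseDecay and the constructor arguments are assumptions based on the discussion above, not confirmed API:

import paddle.fluid as fluid

with fluid.imperative.guard():
    # hypothetical path/arguments: decay 0.1 -> 0.01 -> 0.001 at the boundaries
    scheduler = fluid.imperative.PiecewiseDecay(
        boundaries=[1000, 2000], values=[0.1, 0.01, 0.001], begin=0)
    optimizer = fluid.optimizer.SGD(learning_rate=scheduler)
    # each optimizer.minimize(loss) then reads the lr for the current step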