
fix g_param shape mismatch in WeightNormParamAttr #18940

Merged
SunGaofeng merged 2 commits into PaddlePaddle:develop on Aug 5, 2019

Conversation

SunGaofeng
Contributor

1. Fix the `g_param` shape mismatch between `startup_program` and `main_program`.
2. Fix the `'dim'` attribute from `int` to `list(int)` when appending the `reduce_sum` op (see the sketch below).
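
A minimal sketch of fix (2), assuming the fluid-style `block.append_op` API (the `x`, `out`, and `block` names here are hypothetical, not the PR's code): `reduce_sum` declares its `dim` attribute as a list of ints, so passing a bare int mismatches the attribute type.

```python
# Hypothetical sketch: appending reduce_sum with 'dim' as list(int).
block.append_op(
    type='reduce_sum',
    inputs={'X': x},
    outputs={'Out': out},
    attrs={
        'dim': [1],          # was: 'dim': 1 (bare int), which mismatches the attr type
        'keep_dim': False,
        'reduce_all': False,
    })
```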

# Move `dim` to the front, flatten the remaining axes, take the row-wise norm,
# then restore the original layout:
reshape = __reshape_op(transpose, shape=[transpose.shape[0], -1], block=block)
norm = __norm_op(reshape, dim=[1], block=block)  # 'dim' is now a list(int) for reduce_sum
reshape2 = __reshape_op(norm, shape=out_shape, block=block)
__transpose_op(reshape2, perm, out=out, block=block)
Contributor

Since the shape of the output is a vector, maybe we can remove the line `__transpose_op(reshape2, perm, out=out, block=block)` by redefining `out_shape` in `reshape2` as `[1, ..., 1, transpose.shape[0], 1, ..., 1]`.
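
A minimal sketch of that suggestion, reusing the hunk's helper names (`__reshape_op`, `norm`, `out`, `block`; `v` and `dim` are assumed from the enclosing function): since `transpose.shape[0] == v.shape[dim]`, the norm can be reshaped straight into a broadcastable shape and the trailing transpose dropped.

```python
# Sketch only: reshape the norm into a shape that is 1 on every axis except
# `dim`, so the final __transpose_op becomes unnecessary.
out_shape = [1] * len(v.shape)
out_shape[dim] = v.shape[dim]   # i.e. [1, ..., 1, transpose.shape[0], 1, ..., 1]
__reshape_op(norm, shape=out_shape, out=out, block=block)
```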

their direction. Weight Norm has been implemented as discussed in this
paper: `Weight Normalization: A Simple Reparameterization to Accelerate
Training of Deep Neural Networks
<https://arxiv.org/pdf/1602.07868.pdf>`_.

Args:
-    dim(list): The parameter's name. Default None.
+    dim(int): Dimension over which to compute the norm. Default None.
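
For reference, the paper reparameterizes each weight as w = g * v / ||v||, where g is a learned magnitude and v the direction, with the norm taken over every axis except `dim` (one norm per slice along `dim`, matching the reshape-and-norm hunk above). A small NumPy illustration, not Paddle code:

```python
import numpy as np

def weight_norm(v, g, dim=0):
    # Norm over every axis except `dim`; keepdims gives a broadcastable shape
    # that is 1 everywhere except at `dim`.
    axes = tuple(i for i in range(v.ndim) if i != dim)
    norm = np.sqrt(np.sum(v ** 2, axis=axes, keepdims=True))
    return g.reshape(norm.shape) * v / norm

v = np.random.randn(4, 3, 3)                   # direction parameter
g = np.linalg.norm(v.reshape(4, -1), axis=1)   # magnitude, one per slice of dim 0
w = weight_norm(v, g, dim=0)                   # equals v here, since g = ||v|| per slice
```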
Contributor

Might it be the dimension *except* which the norm is computed?

Contributor Author

This wording follows PyTorch, whose docs say 'dim (int, optional): dimension over which to compute the norm'; it is kept just to stay consistent with the other framework.
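
The disagreement is only over wording, and it can be seen numerically: the hunk above flattens everything except `dim` and norms along axis 1, i.e. it reduces over every axis *except* `dim`. A hypothetical NumPy comparison for a `[4, 6]` weight with `dim=0`:

```python
import numpy as np

v = np.random.randn(4, 6)

# Literal "over which": reduce along axis 0 -> one norm per column, shape (6,).
norm_over = np.linalg.norm(v, axis=0)

# "Except which", as the code computes: reduce along every axis but 0
# -> one norm per row, shape (4,).
norm_except = np.linalg.norm(v, axis=1)
```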

@SunGaofeng SunGaofeng merged commit 4da1c4f into PaddlePaddle:develop Aug 5, 2019