
[Feature] Support Ajbrock SN and Update BigGAN with best FID and IS #159

Merged
merged 33 commits into from
Dec 27, 2021

Conversation

plyfager
Collaborator

  1. Support spectral norm (SN) as implemented by ajbrock.
  2. Update BigGAN trained on ImageNet 128x128, which reached the best FID (8.69) and IS (129.07).
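As context for the change, choosing between the two SN implementations would typically be exposed as a config flag on the BigGAN modules. A minimal sketch, assuming a key named `sn_style` with values `'ajbrock'` and `'torch'` (the exact key name and module type string are assumptions here, not confirmed by the PR text):

```python
# Hypothetical config sketch: selecting the spectral-norm implementation
# for a BigGAN generator. The key name `sn_style` and the type string
# are assumptions for illustration.
generator_cfg = dict(
    type='BigGANGenerator',
    output_scale=128,
    # 'ajbrock' -> SN following BigGAN-PyTorch; 'torch' -> torch.nn.utils.spectral_norm
    sn_style='ajbrock',
)
```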

@plyfager
Collaborator Author

plyfager commented Nov 11, 2021

TODOs:

  • Docs and a citation statement for biggan_snmodule.
  • Complete unit tests.

@plyfager plyfager changed the title [WIP] Support Ajbrock SN and Update BigGAN with best FID and IS [Feature] Support Ajbrock SN and Update BigGAN with best FID and IS Nov 15, 2021
@plyfager plyfager requested a review from nbei November 15, 2021 03:51
mmgen/models/architectures/biggan/biggan_snmodule.py
ajbrock(https://github.com/ajbrock/BigGAN-PyTorch/blob/master/layers.py)
will be followed.
If set to `torch`, implementation by `PyTorch` will be followed.
Defaults to `torch`.
Collaborator

Should we change the default value to biggan? The default value should always lead to the best performance. Since this may introduce BC-breaking changes, we should handle it carefully.

Collaborator Author

I set the default value to ajbrock.


Args:
W (torch.Tensor): Module weight.
u_ (list[torch.Tensor]): list of left singular vector.
Collaborator

You may want to revise the code copied from the other repo; these names are not very descriptive.

Collaborator Author

W --> weight, u_ --> u_list.
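The renamed helper estimates singular values by power iteration, maintaining left (`u_list`) and right singular vectors. A minimal pure-Python sketch of the idea for a single 2x2 weight matrix (not the PR's actual torch implementation, which operates on tensors and tracks multiple singular vectors):

```python
import math

def estimate_top_singular_value(weight, num_iters=50, eps=1e-12):
    """Estimate the largest singular value of a 2x2 matrix `weight`
    via power iteration. A pure-Python sketch of the idea behind the
    PR's power-iteration helper (names are illustrative)."""
    v = [1.0, 0.0]
    for _ in range(num_iters):
        # u = W v, then normalize (left singular vector estimate)
        u = [weight[0][0] * v[0] + weight[0][1] * v[1],
             weight[1][0] * v[0] + weight[1][1] * v[1]]
        norm_u = math.sqrt(u[0] ** 2 + u[1] ** 2) + eps
        u = [x / norm_u for x in u]
        # v = W^T u, then normalize (right singular vector estimate)
        v = [weight[0][0] * u[0] + weight[1][0] * u[1],
             weight[0][1] * u[0] + weight[1][1] * u[1]]
        norm_v = math.sqrt(v[0] ** 2 + v[1] ** 2) + eps
        v = [x / norm_v for x in v]
    # sigma = u^T W v
    wv = [weight[0][0] * v[0] + weight[0][1] * v[1],
          weight[1][0] * v[0] + weight[1][1] * v[1]]
    return u[0] * wv[0] + u[1] * wv[1]

sigma = estimate_top_singular_value([[3.0, 0.0], [0.0, 1.0]])
```

For this diagonal matrix the largest singular value is 3, and power iteration converges to it immediately.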

return svs, us, vs


class SN(object):
Collaborator

I suggest using nn.Module here. Have you unit-tested this module independently?

Collaborator Author

SN(object) --> SpectralNorm(nn.Module). The unit test has been completed.

Collaborator Author

I kept SpectralNorm inheriting from object, since otherwise the initialization would be rather ugly.

Collaborator Author
@plyfager plyfager Nov 25, 2021

Take SNLinear as an example: the super() call in nn.Linear's __init__ would invoke SpectralNorm's __init__, which would cause an error.
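The constraint can be illustrated without torch. If the mixin shared the layer's base class, the cooperative `super().__init__()` chain would dispatch into the mixin's initializer without its required arguments; keeping it a plain object avoids that. A minimal sketch, with `Layer` standing in for `nn.Linear` (all names here are illustrative):

```python
class Layer:
    """Stand-in for nn.Linear."""
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features

class SpectralNorm:
    # Kept as a plain object, mirroring the reasoning above: because it
    # does not share Layer's base class, Layer's __init__ never
    # dispatches into it through the cooperative super() chain, and we
    # can call it explicitly with its own arguments.
    def __init__(self, num_svs=1, num_iters=1):
        self.num_svs = num_svs
        self.num_iters = num_iters

class SNLinear(Layer, SpectralNorm):
    def __init__(self, in_features, out_features, num_svs=1):
        # Explicit, separate initialization of each parent.
        Layer.__init__(self, in_features, out_features)
        SpectralNorm.__init__(self, num_svs=num_svs)

layer = SNLinear(4, 8)
```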

"""Get singular values."""
return [getattr(self, 'sv%d' % i) for i in range(self.num_svs)]

def W_(self):
Collaborator

In OpenMMLab, we prefer lowercase letters for function names.

Collaborator Author

W_ --> sn_weight.

eps (float, optional): Vector Normalization epsilon for
avoiding divide by zero. Defaults to 1e-12.
"""

Collaborator

If we refer to other code, we should polish it rather than copy it directly.

@nbei
Collaborator

nbei commented Nov 29, 2021

You may need to resolve the conflicts with the master branch. Then I can merge this PR.

mmgen/models/architectures/biggan/biggan_snmodule.py
with torch.no_grad():
for i, sv in enumerate(svs):
self.sv[i][:] = sv
return self.weight / svs[0]
Collaborator

Please leave a comment here to specify why we just use svs[0] rather than another element in svs.

Collaborator Author

I changed svs[0] to svs[-1] instead.
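For reference, spectral normalization divides the weight by its largest estimated singular value; whether that value lives at index 0 or -1 of the returned list depends on how the helper orders its estimates, which is exactly why the reviewer asked for a comment. A pure-Python sketch that sidesteps the ordering question by taking the maximum explicitly (function and variable names are illustrative, not the PR's actual code):

```python
def sn_weight(weight, svs):
    """Divide every weight entry by the largest estimated singular
    value. The ordering of `svs` is implementation-defined, so this
    sketch takes max() explicitly instead of hard-coding an index."""
    sigma = max(svs)
    return [[w / sigma for w in row] for row in weight]

normed = sn_weight([[2.0, 0.0], [0.0, 4.0]], svs=[4.0, 2.0])
```

After normalization the largest entry of this diagonal weight is 1.0, so its spectral norm is bounded by 1.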

@codecov

codecov bot commented Dec 15, 2021

Codecov Report

Merging #159 (aa0dc33) into master (3cd123b) will increase coverage by 0.06%.
The diff coverage is 82.11%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #159      +/-   ##
==========================================
+ Coverage   75.52%   75.59%   +0.06%     
==========================================
  Files         127      128       +1     
  Lines        8814     8927     +113     
  Branches     1748     1781      +33     
==========================================
+ Hits         6657     6748      +91     
- Misses       1719     1729      +10     
- Partials      438      450      +12     
Flag Coverage Δ
unittests 75.59% <82.11%> (+0.06%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
...chitectures/biggan/generator_discriminator_deep.py 77.87% <64.70%> (-1.37%) ⬇️
...ls/architectures/biggan/generator_discriminator.py 77.72% <66.66%> (-1.30%) ⬇️
mmgen/models/architectures/biggan/modules.py 95.98% <86.95%> (-1.10%) ⬇️
...gen/models/architectures/biggan/biggan_snmodule.py 89.23% <89.23%> (ø)
...chitectures/stylegan/generator_discriminator_v2.py 86.66% <0.00%> (-1.12%) ⬇️
mmgen/models/architectures/stylegan/mspie.py 91.58% <0.00%> (+0.99%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f21c398...aa0dc33. Read the comment docs.

@plyfager plyfager merged commit 4e0cfae into open-mmlab:master Dec 27, 2021
LeoXing1996 pushed a commit that referenced this pull request Jul 16, 2022
[Feature] Support Ajbrock SN and Update BigGAN with best FID and IS