
Which tool/method you used to count flops? #99

Open
yushuinanrong opened this issue Jul 17, 2021 · 5 comments

Comments

@yushuinanrong

No description provided.


tb2-sy commented Oct 24, 2021

just google yourself

yushuinanrong (Author) commented Oct 24, 2021

@tb2-sy
What matters in my research (and probably for many other researchers) is making sure I can reproduce the reported numbers. I'm aware of many FLOP-counting tools, and they often produce different results. That's why I raised this issue. I hope you can understand my point, and please mind your attitude the next time you post anything online.

zeliu98 (Contributor) commented Dec 20, 2021

The code can be found here:

```python
def flops(self):
    flops = 0
    flops += self.patch_embed.flops()
    for i, layer in enumerate(self.layers):
        flops += layer.flops()
    flops += self.num_features * self.patches_resolution[0] * self.patches_resolution[1] // (2 ** self.num_layers)
    flops += self.num_features * self.num_classes
    return flops
```
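For readers wondering which tool was used: the snippet above counts FLOPs analytically, with each module reporting its own cost and the parent summing them. A minimal, self-contained sketch of the same idea, for a linear projection and a plain self-attention block, is below. The helper names are illustrative assumptions, not part of the Swin Transformer codebase, and "FLOPs" here counts multiply-accumulates.

```python
# Analytic FLOP counting in the same per-module style as the snippet
# above. Helper names are illustrative, not from the Swin repo.
# One "FLOP" here is one multiply-accumulate (MAC).

def linear_flops(in_features, out_features, tokens=1):
    # One multiply-accumulate per weight, per input token.
    return tokens * in_features * out_features

def attention_flops(dim, num_tokens):
    # QKV projections, Q @ K^T, attn @ V, and the output projection.
    qkv = 3 * linear_flops(dim, dim, tokens=num_tokens)
    scores = num_tokens * num_tokens * dim      # Q @ K^T
    weighted = num_tokens * num_tokens * dim    # attn @ V
    proj = linear_flops(dim, dim, tokens=num_tokens)
    return qkv + scores + weighted + proj

if __name__ == "__main__":
    # e.g. one attention block at dim=96 over a 7x7 window (49 tokens)
    print(attention_flops(96, 49))
```

Profilers such as fvcore's `FlopCountAnalysis` or `ptflops` can cross-check analytic counts like this, though, as noted above, different counters often disagree (for instance on whether a MAC counts as one FLOP or two).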

bobopit commented Jan 11, 2022

> (quotes the `flops()` snippet from zeliu98's comment above)

Hello, I have a question about FLOPs. On ADE20K, the reported FLOPs of Swin-B are 1841G, but the Swin-B backbone alone is only tens of GFLOPs. Is the difference due to the large computational cost of the decoder?

@auniquesun

> (quotes the `flops()` snippet from zeliu98's comment above)

It seems this code snippet is for the #Parameters calculation, not for FLOPs. Can you give more explanation? Thanks.
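One possible source of this confusion: for a bias-free linear layer applied to a single token, the FLOP count (one multiply-accumulate per weight) is numerically equal to the parameter count, so the snippet's final line, `self.num_features * self.num_classes` for the classification head, reads like either formula. A minimal sketch of the distinction, with illustrative helper names not taken from the repo:

```python
# Parameter count vs FLOPs for one linear layer, to illustrate the
# distinction raised above. Helper names are illustrative only.

def linear_params(in_features, out_features, bias=True):
    # Parameters are fixed regardless of how much input is processed.
    return in_features * out_features + (out_features if bias else 0)

def linear_flops(in_features, out_features, tokens=1):
    # Multiply-accumulates scale with the number of input tokens.
    return tokens * in_features * out_features

# With a single token and no bias, the two numbers coincide, which is
# why a FLOP formula can look exactly like a parameter formula:
assert linear_params(768, 1000, bias=False) == linear_flops(768, 1000, tokens=1)
```

For the backbone layers, which process many tokens, the two quantities diverge: FLOPs grow with input resolution while parameters stay fixed.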
