
Iteration setting in the paper and the code #56

Open
dalindalin opened this issue May 2, 2018 · 4 comments

Comments

@dalindalin

Hey, I have some questions about the number of training iterations.
The paper says that when training on ICDAR2015, you first pre-train on SynthText for 60k iterations, then train on ICDAR2015 for 8k iterations in stage 1 and 4k iterations in stage 2. So:

  1. What is the batch_size for each iteration in the paper? Does iteration == epoch in your paper?
  2. The number of iterations differs from the iteration parameters in modelConfig.py, which are 40k/100k-40k/120k-100k with a batch_size of 32. Which experiment is that default iteration setting for? Can I reproduce the ICDAR2015 result from the paper with the same setting?
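For reference, the two schedules being compared can be summarized in a small Python sketch. The key names below are hypothetical (they are not the actual variable names in modelConfig.py); the numbers are simply the ones quoted in the paper and in this question.

```python
# Hypothetical side-by-side of the two iteration schedules discussed here.
# Key names are illustrative only; the numbers come from the paper and from
# the question above about modelConfig.py.

paper_schedule = {
    "pretrain_synthtext_iters": 60_000,  # pre-training on SynthText
    "icdar2015_stage1_iters": 8_000,     # fine-tuning, stage 1
    "icdar2015_stage2_iters": 4_000,     # fine-tuning, stage 2
    "batch_size": 32,                    # 1 iteration == 1 batch
}

repo_default_schedule = {
    # "40k/100k-40k/120k-100k" as quoted from modelConfig.py; whether each
    # number is a learning-rate step value or a max-iteration count is an
    # assumption and should be checked against the file itself.
    "iteration_values": [40_000, 100_000, 40_000, 120_000, 100_000],
    "batch_size": 32,
}
```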
@MhLiao (Owner) commented May 19, 2018

  1. The batch size is set to 32. 1 iteration means 1 batch, not an epoch.
  2. You can refer to the paper to adjust your settings.
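To make the iteration/epoch distinction concrete, here is a minimal back-of-the-envelope sketch. It assumes the usual ICDAR2015 split of 1000 training images and roughly 800k SynthText images; those dataset sizes are approximations and are not taken from this repository.

```python
# Rough conversion from iterations to epochs, given that 1 iteration == 1 batch.
# Dataset sizes below are approximate assumptions, not values from this repo.

def iters_to_epochs(iterations, batch_size, dataset_size):
    """Number of full passes over the dataset covered by `iterations` batches."""
    return iterations * batch_size / dataset_size

batch_size = 32

# SynthText pre-training: ~800k synthetic images, 60k iterations.
print(iters_to_epochs(60_000, batch_size, 800_000))  # ~2.4 epochs

# ICDAR2015 fine-tuning: ~1000 training images, 8k + 4k iterations.
print(iters_to_epochs(8_000, batch_size, 1_000))     # 256 epochs (stage 1)
print(iters_to_epochs(4_000, batch_size, 1_000))     # 128 epochs (stage 2)
```

So the ICDAR2015 fine-tuning stages correspond to a few hundred passes over the dataset, while an "iteration" here is always a single batch of 32 images.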

@dalindalin (Author)

Got it, thank you~ @MhLiao

@basaltzhang

So the batch sizes for the 384x384 and 768x768 training stages are both 32, is that right?

@ZDDEAN commented Sep 20, 2019

Hey, I have the same questions about the number of training iterations as raised above: what the per-iteration batch_size in the paper is, whether iteration == epoch, and how the paper's 60k/8k/4k schedule relates to the 40k/100k-40k/120k-100k settings in modelConfig.py.

Also, could you please share a script for converting the SynthText annotations into the ICDAR format? Thanks!
