
epoch #159

Closed
fido20160817 opened this issue Jun 10, 2022 · 4 comments

@fido20160817

Where is the max epoch set? The code is too implicit for me to follow...

@joanrod

joanrod commented Jun 26, 2022

It is set to 1000 by default by PyTorch Lightning. You need to override it in the .yaml file, with something like:

lightning:
    trainer:
        (...)
        max_epochs: 2000
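For intuition, here is a minimal sketch of how a `lightning.trainer` block like the one above ends up overriding the Lightning default. Plain dicts stand in for the parsed yaml and the Trainer keyword arguments; the actual merging logic in the repo's main.py is more involved.

```python
# Config as it would look after parsing the .yaml file (hypothetical, for illustration).
config = {"lightning": {"trainer": {"max_epochs": 2000}}}

# Default trainer settings, mirroring PyTorch Lightning's default of 1000 epochs.
trainer_kwargs = {"max_epochs": 1000}

# Keys under lightning.trainer override the defaults before being passed
# to pytorch_lightning.Trainer(**trainer_kwargs).
trainer_kwargs.update(config["lightning"]["trainer"])

print(trainer_kwargs["max_epochs"])  # → 2000
```

The point is simply that anything under `lightning.trainer` is forwarded as a Trainer keyword argument, so any Trainer option (not just `max_epochs`) can be set this way.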

@fido20160817 (Author)

many thanks!

@sanghamitrajohri

I cannot see any such thing in the yaml file.
model:
  base_learning_rate: 4.5e-6
  target: taming.models.vqgan.VQModel
  params:
    embed_dim: 256
    n_embed: 1024
    ddconfig:
      double_z: False
      z_channels: 256
      resolution: 256
      in_channels: 3
      out_ch: 3
      ch: 128
      ch_mult: [1,1,2,2,4]  # num_down = len(ch_mult)-1
      num_res_blocks: 2
      attn_resolutions: [16]
      dropout: 0.0
    lossconfig:
      target: taming.modules.losses.vqperceptual.VQLPIPSWithDiscriminator
      params:
        disc_conditional: False
        disc_in_channels: 3
        disc_start: 30001
        disc_weight: 0.8
        codebook_weight: 1.0

data:
  target: main.DataModuleFromConfig
  params:
    batch_size: 3
    num_workers: 8
    train:
      target: taming.data.faceshq.FacesHQTrain
      params:
        size: 256
        crop_size: 256
    validation:
      target: taming.data.faceshq.FacesHQValidation
      params:
        size: 256
        crop_size: 256
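When the .yaml file has no lightning section at all, as above, the block can simply be appended at the top level alongside model and data. A sketch of what the addition might look like (the max_epochs value here is just an example):

```yaml
lightning:
  trainer:
    max_epochs: 2000
```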

@matthew-wave

Just add the new lines directly to the .yaml file you are using. For example, I train on my own dataset, so I added it in the custom_vqgan.yaml file.
(screenshot of the edited custom_vqgan.yaml)
