Improving BiGAN training with marginal likelihood equalization - PyTorch implementation

Getting started

Clone the package probvis, which is needed to visualize the results. Then create a conda environment as follows:

conda env create -f ./environments/env_macos.yml

and you should be ready to go.
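For orientation, a typical end-to-end setup could look like the lines below. This is only a sketch: the probvis URL and the environment name are placeholders (the actual name is defined in env_macos.yml), and it is assumed that probvis can be installed with pip from a local clone.

git clone <probvis-repository-url> probvis   # placeholder URL: clone the probvis visualization package
conda env create -f ./environments/env_macos.yml
conda activate <env-name>                    # environment name as defined in env_macos.yml
pip install -e ./probvis                     # assumption: probvis is pip-installable from its clone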

Usage

Parameters

  • ld: Parameter that controls the shape of the non-uniform distribution, namely $\lambda_{dist}$
  • lp: Parameter that controls the percentage of non-uniform samples per mini-batch, namely $\lambda_{perc}$
  • lr: Parameter that controls reconstruction regularization, namely $\lambda_{cyc}$
  • gan_type: Selects the model to train: mdgan, pmdgan, or epmdgan.

Examples

python3 main.py --n_epochs 800 --z_dim 256 --dataset c10 --data_path ../Data --gan_type epmdgan --gpu 0 --ckpt_dir experiments/c10/3  --lr 3 --lp 0.8 --ld 4
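As an illustration only (these flag values are not taken from the authors' examples), the same settings could be reused to train the pmdgan variant on the same dataset:

python3 main.py --n_epochs 800 --z_dim 256 --dataset c10 --data_path ../Data --gan_type pmdgan --gpu 0 --ckpt_dir experiments/c10/pmdgan --lr 3 --lp 0.8 --ld 4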

BibTeX citation

Acknowledgements

This code uses the following repositories:

  • Downloading/Loading LSUN data set: code paper

  • Computing the FID score: code

  • Regularize GAN with Spectral Normalization: code paper

  • Computing Precision & Recall: code paper

Contact

Pablo Sanchez - For any questions, comments, or help getting it to run, please don't hesitate to email me: pablo.sanchez-martin@tuebingen.mpg.de
