
IDD-Lite-Unet

Requirements

  • PyTorch 0.4
  • Visdom
  • OpenCV
  • matplotlib
  • pydensecrf

To create the conda virtual environment directly, use requirements.yml.
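Assuming requirements.yml is a standard conda environment specification, the environment can be created with:

conda env create -f requirements.yml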

Training procedure

  1. Download the dataset into a folder named "dataset" and arrange it in the structure shown below, combining the training and validation sets and keeping images and annotations in separate folders (a layout-check sketch follows this list).
├── dataset
│   ├── images
│   │   ├── train
│   │   ├── val
│   │   ├── test
│   ├── annotation
│   │   ├── train
│   │   ├── val
  2. Start the Visdom server with the command
python -m visdom.server
  3. Execute the command
bash run.sh
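Before launching training, it can help to verify that the folders match the tree above. The following is an illustrative check script, not part of the repository; the root folder name "dataset" and the split names are taken from the structure shown in step 1.

import os

# Expected layout, taken from the directory tree in step 1.
EXPECTED = {
    "images": ["train", "val", "test"],
    "annotation": ["train", "val"],
}

def check_dataset(root="dataset"):
    """Print the file count for every expected split folder."""
    for parent, splits in EXPECTED.items():
        for split in splits:
            path = os.path.join(root, parent, split)
            count = len(os.listdir(path)) if os.path.isdir(path) else 0
            status = "ok" if count > 0 else "missing or empty"
            print("{}: {} files ({})".format(path, count, status))

if __name__ == "__main__":
    check_dataset()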

Weights

Pretrained weights: https://drive.google.com/file/d/1yxCbStft75gTOriWQLGtqS_5l_OAvYnR/view?usp=sharing
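The file can be downloaded manually from the link above, or from the command line with gdown (not listed in the requirements, so install it separately), for example:

gdown https://drive.google.com/uc?id=1yxCbStft75gTOriWQLGtqS_5l_OAvYnR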

Testing procedure

  1. For single-scale testing:
python test.py
  2. For multi-scale testing:
python multiscale_testing.py
  3. For single-scale testing with CRF post-processing:
python test_with_postprocessing.py
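For reference, dense CRF post-processing with pydensecrf typically follows the pattern sketched below. This is an illustrative example under the assumption that the network produces per-pixel softmax probabilities; it is not the code in test_with_postprocessing.py, and the input arrays are placeholders.

import numpy as np
import pydensecrf.densecrf as dcrf
from pydensecrf.utils import unary_from_softmax

def crf_refine(image, softmax_probs, n_iters=5):
    """Refine per-pixel class probabilities with a fully connected CRF.

    image:         H x W x 3 uint8 RGB image
    softmax_probs: C x H x W float array of class probabilities
    Returns an H x W array of refined class labels.
    """
    n_classes, h, w = softmax_probs.shape
    d = dcrf.DenseCRF2D(w, h, n_classes)

    # Unary term from the network's softmax output.
    d.setUnaryEnergy(unary_from_softmax(softmax_probs))

    # Pairwise terms: spatial smoothness and color/appearance consistency.
    d.addPairwiseGaussian(sxy=3, compat=3)
    d.addPairwiseBilateral(sxy=80, srgb=13,
                           rgbim=np.ascontiguousarray(image), compat=10)

    q = d.inference(n_iters)
    return np.argmax(q, axis=0).reshape(h, w)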

Visualization

Left: predicted classes. Right: ground truth.

Visualization during training

Acknowledgement

The One Hundred Layers Tiramisu (PyTorch implementation): https://github.com/bfortuner/pytorch_tiramisu