
[NeurIPS 2023] Contextually Affinitive Neighborhood Refinery for Deep Clustering

This is the official implementation of the NeurIPS 2023 paper:

Contextually Affinitive Neighborhood Refinery for Deep Clustering, authored by Chunlin Yu, Ye Shi, and Jingya Wang†

🍎 [ArXiv Paper] 🍇 [Slideslive Video]

🚀 Getting Started

Compilation

To compile the extension required for contextually affinitive neighborhood retrieval:

cd extension
sh make.sh
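For reference, `make.sh` builds the C++/CUDA extension in place. A minimal sketch of what such a build script typically contains is shown below; the use of a setuptools `setup.py` and the `build_ext --inplace` invocation are assumptions about a standard extension layout, not taken from this repository — check the actual `make.sh` for the exact commands:

```shell
#!/bin/sh
# Hypothetical sketch of an extension build script.
# Assumes a standard setuptools-based C++/CUDA extension;
# the real make.sh in extension/ may differ.
cd "$(dirname "$0")"            # run from the extension directory
python setup.py build_ext --inplace  # compile the extension next to its sources
```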

Run

  • To begin clustering, simply run:
sh run.sh

where you can modify the config file (e.g. cifar10_r18_connr) or the visible devices (e.g. CUDA_VISIBLE_DEVICES=0,1,2,3) in run.sh.

  • For more customized uses, you can directly modify the config file in configs/.

  • To skip the warm-up training and resume ConNR clustering directly from warm-up checkpoints, we provide the warm-up checkpoints in [Google Drive]. To resume training:

  1. save the warm-up checkpoints into the folder ckpt/your_run_name/save_models/

  2. modify the corresponding variables resume_name and resume_epoch in the config file;

  3. resume ConNR clustering by running sh run.sh. The final checkpoints of ConNR clustering are also provided in [Google Drive].
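The launch workflow above can be sketched as a minimal run.sh. This is an illustrative sketch only: the entry-point name (`main.py`) and the `--config_name` flag are assumptions, not taken from the repository — consult the actual run.sh for the real invocation:

```shell
#!/bin/sh
# Hypothetical sketch of run.sh -- entry point and flag names are assumptions.

# Select which GPUs the training run may use.
export CUDA_VISIBLE_DEVICES=0,1,2,3

# Launch ConNR clustering with the chosen config (e.g. cifar10_r18_connr).
# To resume from a warm-up checkpoint instead of training from scratch,
# first place the checkpoint under ckpt/your_run_name/save_models/ and set
# resume_name and resume_epoch in the corresponding file under configs/.
python main.py --config_name cifar10_r18_connr
```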

♥️ Acknowledgement

Our framework is based on ProPos, and our ConNR implementation is inspired by GNN Reranking.

Many thanks for their brilliant works and valuable contributions!

✒️ Citation

@article{yu2024contextually,
  title={Contextually Affinitive Neighborhood Refinery for Deep Clustering},
  author={Yu, Chunlin and Shi, Ye and Wang, Jingya},
  journal={Advances in Neural Information Processing Systems},
  volume={36},
  year={2024}
}
