
W-HMR: Human Mesh Recovery in World Space with Weak-supervised Camera Calibration and Orientation Correction


Features

W-HMR is a human body pose and shape estimation method in world space. It predicts the SMPL body model in both camera and world coordinates from a monocular image. [Camera Calibration] predicts the focal length for better mesh-image alignment. [Orientation Correction] makes the recovered body orientation reasonable in world space.
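To see why a predicted focal length matters for mesh-image alignment, consider a standard pinhole projection. The sketch below is generic (principal point assumed at the image center), not W-HMR's actual camera model:

```python
import numpy as np

def project(points_cam, focal, img_w, img_h):
    """Project 3D camera-space points (N, 3) to pixel coordinates (N, 2).

    Assumes the principal point sits at the image center, a common
    simplification; W-HMR's real camera model may differ.
    """
    cx, cy = img_w / 2.0, img_h / 2.0
    u = focal * points_cam[:, 0] / points_cam[:, 2] + cx
    v = focal * points_cam[:, 1] / points_cam[:, 2] + cy
    return np.stack([u, v], axis=1)

pts = np.array([[0.0, 0.0, 2.0],   # point on the optical axis, 2 m away
                [0.3, -0.2, 2.5]])
uv = project(pts, focal=1500.0, img_w=1920, img_h=1080)
# The same 3D point lands at different pixels under different focal lengths,
# which is why a wrong focal length degrades 2D alignment of the mesh.
```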

This implementation:

  • provides the demo code for W-HMR, implemented in PyTorch.

News 🚩

[March 21, 2024] Released demo code and pretrained weights.

[March 26, 2024] Pre-processed labels are now available.

[April 24, 2024] Fixed some import bugs and a loading error.

[April 24, 2024] Released more necessary files and pre-processed labels.

[September 5, 2024] Released training and evaluation code.

TODOs

  • Release demo code.

  • Release pre-processed labels.

  • Release evaluation code.

  • Release training code.

Getting Started

Requirements

W-HMR has been implemented and tested on Ubuntu 18.04 with Python 3.8.

Install the requirements following environment.yml.

💃 If you meet any difficulty configuring your environment or any bug, please feel free to contact me. I would be glad to help!

Running the Demo

W-HMR

First, you need to download the required data (i.e., our trained model and SMPL model parameters) from Google Drive or Baidu Netdisk. It is approximately 700MB. Unzip it and put it in the repository root. Then, running the demo is as simple as:

python demo/whmr_demo.py --image_folder data/sample_images --output_folder output/sample_images

Sample demo output:

On the right are the outputs in the camera and world coordinate systems. We place them in world space and render ground planes to show how accurate the recovered orientation is.
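The difference between the two outputs comes down to a rigid rotation between camera and world frames. As a minimal illustration (a single pitch angle; W-HMR predicts a full orientation correction, so this is a simplification):

```python
import numpy as np

def cam_to_world(points_cam, pitch_rad):
    """Rotate camera-frame points into a world frame by undoing camera pitch.

    Illustrative only: a real orientation correction involves the full
    camera rotation, not just one angle about the x-axis.
    """
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    R_x = np.array([[1.0, 0.0, 0.0],
                    [0.0,   c,  -s],
                    [0.0,   s,   c]])  # rotation about the camera x-axis
    return points_cam @ R_x.T

# A point along the camera's "up" axis maps onto the world's up axis
# after undoing a 90-degree pitch:
tilted = np.array([[0.0, 1.0, 0.0]])
upright = cam_to_world(tilted, pitch_rad=np.pi / 2)
```

Placing the corrected mesh on a ground plane, as in the demo output, makes orientation errors visible: a mis-rotated body intersects or floats above the plane.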

Training

Note: The training code has not been organized, and errors may occur when running it directly. If you encounter a bug, please contact me.

python train.py --regressor pymaf_net --misc TRAIN.BATCH_SIZE 64

When you want to train a model, run the above command. We provide a variety of options to allow you to customize the training. For more information about training settings, please refer to train_options.py.
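The --misc flag passes KEY VALUE pairs such as TRAIN.BATCH_SIZE 64 to override nested config entries. The repo's actual options live in train_options.py (typically a yacs-style config); the stand-in below only illustrates the dotted-key override mechanism:

```python
def apply_overrides(cfg, pairs):
    """Apply flat KEY VALUE override pairs to a nested config dict.

    A hypothetical stand-in for yacs-style merge_from_list behavior;
    not the project's actual implementation.
    """
    for key, raw in zip(pairs[0::2], pairs[1::2]):
        node = cfg
        *parents, leaf = key.split(".")
        for p in parents:
            node = node[p]
        node[leaf] = type(node[leaf])(raw)  # cast to the existing value's type
    return cfg

cfg = {"TRAIN": {"BATCH_SIZE": 32, "LR": 5e-5}, "SEED": 42}
apply_overrides(cfg, ["TRAIN.BATCH_SIZE", "64"])
```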

Evaluation

Evaluation on AGORA

python evaluate/val_results.py

Run the above command to get the evaluation results on AGORA. Then zip them and upload results.zip to the AGORA evaluation platform. Remember to modify the paths in val_results.py.
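Packaging the predictions for upload can be sketched as below. The .pkl extension and flat archive layout are assumptions here; follow the AGORA platform's submission format for the exact file naming:

```python
import zipfile
from pathlib import Path

def zip_results(pred_dir, out_zip):
    """Zip per-image prediction files into a flat archive for upload.

    Hypothetical helper: adjust the glob pattern to match the files that
    val_results.py actually writes.
    """
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(Path(pred_dir).glob("*.pkl")):
            zf.write(f, arcname=f.name)  # flat archive, no parent directories
```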

Evaluation on Other Datasets

python evaluate/eval.py --checkpoint=[checkpoint_path] --dataset=[dataset_name]

Run the above command and the results will be printed to the terminal.
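The headline metric reported in this line of work is MPJPE (mean per-joint position error). A generic version, not the repo's exact evaluation code, looks like this:

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error for (N, J, 3) joint arrays.

    Units follow the inputs; results are usually reported in millimeters.
    """
    return np.linalg.norm(pred - gt, axis=-1).mean()

# Tiny example: one sample, two joints, first joint off by a 3-4-5 triangle.
gt = np.zeros((1, 2, 3))
pred = np.array([[[3.0, 4.0, 0.0], [0.0, 0.0, 0.0]]])
err = mpjpe(pred, gt)  # (5 + 0) / 2 = 2.5
```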

Pre-processed Dataset Labels

All the data used in our paper is publicly available. You can download it from the official websites, following the dataset introduction in our paper.

But for your convenience, I provide some download links for pre-processed labels here.

The most important source is PyMAF. You can download the pre-processed labels of 3DPW, COCO, LSP, MPII, and MPI-INF-3DHP, which include pseudo 3D joint labels fitted by EFT.

We also use some augmented data from CLIFF and HMR 2.0.

I have also processed some dataset labels (e.g., AGORA and HuMMan); you can download them from Google Drive or Baidu Pan. Unzip dataset_extras.zip and put the files in the ./data/dataset_extras/ folder.

For training W-HMR for global mesh recovery, I added pseudo-labels of global pose to some datasets. You can download them from Google Drive or Baidu Netdisk. Then unzip the archive and put the files in the ./data/dataset_extras/ folder.
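A quick way to sanity-check a downloaded label file is to list its arrays and shapes. The key names below ("imgname", "pose", "shape") are common in SMPL-style label formats but are only assumptions; the repo's dataset_extras files may use different keys:

```python
import numpy as np
import tempfile
from pathlib import Path

def summarize_labels(path):
    """Return {key: shape} for every array stored in an .npz label file."""
    data = np.load(path, allow_pickle=True)
    return {k: np.asarray(data[k]).shape for k in data.files}

# Self-contained demo on a dummy file with assumed key names:
tmp = Path(tempfile.mkdtemp()) / "dummy.npz"
np.savez(tmp,
         imgname=np.array(["a.jpg", "b.jpg"]),
         pose=np.zeros((2, 72)),    # SMPL pose: 24 joints x 3 axis-angle params
         shape=np.zeros((2, 10)))   # SMPL shape: 10 betas
shapes = summarize_labels(tmp)
```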

Acknowledgments

Part of the code is borrowed from the following projects: PyMAF, AGORA, PyMAF-X, PARE, SPEC, MeshGraphormer, 4D-Humans, ViTPose. Many thanks for their contributions.

Citation

If you find this repository useful, please consider citing our paper and giving the repository a star:

@article{yao2023w,
  title={W-HMR: Monocular Human Mesh Recovery in World Space with Weak-Supervised Calibration},
  author={Yao, Wei and Zhang, Hongwen and Sun, Yunlian and Liu, Yebin and Tang, Jinhui},
  journal={arXiv preprint arXiv:2311.17460},
  year={2023}
}
