# Frequently Asked Questions

### 1. How to reproduce your results in the [PIRM18-SR Challenge](https://www.pirm2018.org/PIRM-SR.html) (with a low perceptual index)?

First, the released ESRGAN model on GitHub (`RRDB_ESRGAN_x4.pth`) is **different** from the model we submitted to the competition.
We found that a lower perceptual index does not always guarantee better visual quality.
The aims of the competition and of our ESRGAN work are slightly different: the competition targets a lower perceptual index, while our ESRGAN work targets better visual quality.

Therefore, in the PIRM18-SR Challenge we used several tricks to achieve the best perceptual index (see Section 4.5 in the [paper](https://arxiv.org/abs/1809.00219)).

Here we provide the models and code used in the competition, which can reproduce the results on the `PIRM test dataset` (we use MATLAB 2016b/2017a):

| Group | Perceptual index | RMSE |
| ------------- |:-------------:| -----:|
| SuperSR | 1.978 | 15.30 |

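For context, the perceptual index used in PIRM-SR combines two no-reference quality measures, Ma et al.'s score and NIQE, which are computed by the official MATLAB evaluation code. A minimal sketch of the combination (the two input scores here are hypothetical values, not the competition results):

```python
def perceptual_index(ma_score: float, niqe_score: float) -> float:
    """PIRM-SR perceptual index: lower is better.

    Combines Ma et al.'s no-reference score (higher is better,
    roughly on a 0-10 scale) with NIQE (lower is better).
    """
    return 0.5 * ((10.0 - ma_score) + niqe_score)

# Example with made-up scores, only to illustrate the combination:
print(perceptual_index(ma_score=8.0, niqe_score=3.0))  # 2.5
```

This is why "tricks" that trade a little RMSE for lower NIQE or higher Ma score move a submission down the perceptual-index axis without necessarily looking better to a human.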
> 1. Download the model and code from [GoogleDrive](https://drive.google.com/file/d/1l0gBRMqhVLpL_-7R7aN-q-3hnv5ADFSM/view?usp=sharing)
> 2. Put the LR input images in the `LR` folder
> 3. Run `python test.py`
> 4. Run `main_reverse_filter.m` in MATLAB as post-processing
> 5. The results on my computer are: Perceptual index: **1.9777** and RMSE: **15.304**

### 2. How do you get the perceptual index in your ESRGAN paper?

In our paper, we provide the perceptual index in two places.

1) In Fig. 2, the perceptual index on the PIRM self-validation dataset is obtained with the **model we submitted to the competition**, since the purpose of this figure is to show the perception-distortion plane. We also apply the same post-processing here as in the competition.

2) In Fig. 7, the perceptual indices are provided as references; they are measured on the data generated by the released ESRGAN model `RRDB_ESRGAN_x4.pth` on GitHub.
Also, there is **no** post-processing when testing the ESRGAN model, for better visual quality.