
Confusion Matrix and Results Should Be Updated Or Clarified #1

Closed
josephius opened this issue Mar 24, 2020 · 9 comments
Labels
bug Something isn't working
@josephius
Collaborator

It looks like additional training and test examples were added, but the Confusion Matrix and Results have not been updated to reflect this. I recommend either updating the results or, if they are not available yet (possibly still training the new model?), adding a quick note so there isn't confusion about the Confusion Matrix, which still shows only 8 ground-truth COVID-19 samples. Because the Confusion Matrix contains two false positives, a reader could assume the results were miscalculated with false negatives counted as false positives, which would swap the precision and recall.
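To make the concern concrete, here is a small worked example (the counts are hypothetical, not the repo's actual numbers): with true positives fixed, swapping the false-positive and false-negative counts exactly trades precision for recall.

```python
# Hypothetical counts for the COVID-19 class; not the repo's actual results.
tp, fp, fn = 8, 2, 1

precision = tp / (tp + fp)  # 8 / 10 = 0.8
recall = tp / (tp + fn)     # 8 / 9  ~= 0.889

# If false negatives were miscounted as false positives (and vice versa),
# the two metrics would simply trade places:
swapped_precision = tp / (tp + fn)
swapped_recall = tp / (tp + fp)
assert swapped_precision == recall and swapped_recall == precision
```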

@lindawangg
Owner

Hi Joseph, I just pushed a clarification to the results section of the README.md. Still working on the new dataset.

@josephius
Collaborator Author

Thanks! I appreciate the quick response. Sorry if it's a bit of a nitpick, but given that at least a couple of news articles now link directly to this repo, I figure it makes sense to avoid possible misunderstandings.

@josephius
Collaborator Author

Reopening this so that it will be displayed more prominently to avoid duplicate issues.

@josephius reopened this Mar 27, 2020
@lindawangg
Owner

We just released a new dataset and model, along with the results. Please confirm whether the results are consistent with what you get from the model. Thanks!

@PaulMcInnis added the bug label Mar 30, 2020
@josephius
Collaborator Author

So I'm getting the following bug running the small train, small eval, and large train checkpoints:

ValueError: At least two variables have the same name: conv1_conv/bias

The large eval checkpoint works though and gets:

[[94. 6. 0.]
[ 9. 90. 1.]
[ 1. 0. 9.]]

which is consistent with the results in the readme.

@jamesrenhoulee
Collaborator

Hi josephius,

I believe that error just means you have loaded the same model twice; try restarting the notebook and loading only one model at a time. Alternatively, if you're using TensorFlow and loading graphs, you can call tf.reset_default_graph() and declare a new tf.Graph() within each Session.
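A minimal sketch of that pattern, assuming TF 1.x-style checkpoint loading (the `run_model` helper and the checkpoint paths are illustrative, not from the repo's scripts):

```python
# Sketch: load each checkpoint in its own fresh graph so variable names
# like conv1_conv/bias don't collide across models.
import tensorflow.compat.v1 as tf


def run_model(meta_path, ckpt_dir):
    """Load and query one checkpoint, isolated in a new graph."""
    tf.reset_default_graph()  # drop any variables left by an earlier model
    graph = tf.Graph()
    with graph.as_default(), tf.Session(graph=graph) as sess:
        saver = tf.train.import_meta_graph(meta_path)
        saver.restore(sess, tf.train.latest_checkpoint(ckpt_dir))
        # ... run inference on the restored graph here ...
```

Calling `run_model` once per checkpoint (rather than importing several meta graphs into one default graph) avoids the "At least two variables have the same name" error.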

@AasiaRehman

Cannot feed value of shape (1,) for Tensor 'input_1:0', which has shape '(?, 224, 224, 3)'
What is this error about?

@PaulMcInnis
Collaborator

@AasiaRehman you are not feeding the correct data item to the image input tensor. Perhaps you are feeding the label by accident.
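A small sketch of the shape mismatch (the arrays below are illustrative stand-ins, not the repo's data pipeline): the input tensor 'input_1:0' expects a batch of RGB images shaped (?, 224, 224, 3), while a label array has shape (1,).

```python
import numpy as np

# Stand-in for one preprocessed 224x224 RGB chest X-ray.
image = np.zeros((224, 224, 3), dtype=np.float32)

# Adding a batch dimension gives the shape the input tensor expects.
batch = np.expand_dims(image, axis=0)  # (1, 224, 224, 3) -- feed this
label = np.array([0])                  # (1,) -- feeding this raises the error

print(batch.shape)  # (1, 224, 224, 3)
print(label.shape)  # (1,)
```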

@josephius
Collaborator Author

@jamesrenhoulee
Okay, so it works if I only run the query script on one model. I was running it on each model in sequence before. Makes sense.

The results appear to be consistent with the confusion matrices in the readme.

COVID-Net

Small (Train)
[[95. 5. 0.]
[ 8. 91. 1.]
[ 1. 1. 8.]]

Small (Eval)
[[95. 5. 0.]
[ 8. 91. 1.]
[ 1. 1. 8.]]

Large (Train)
[[94. 6. 0.]
[ 9. 90. 1.]
[ 1. 0. 9.]]

Large (Eval)
[[94. 6. 0.]
[ 9. 90. 1.]
[ 1. 0. 9.]]
