This repository has been archived by the owner on Jan 13, 2022. It is now read-only.
Hi:
I've trained a model. To evaluate its performance, I used held-out experimental samples and control samples as test sets.
I find that the model tends to assign an inflated modified probability to every candidate base.
Worse, the control sample (which contains no modified bases) was also incorrectly called as containing a large number of modified bases.
If this is a hyperparameter problem with the model architecture, I would appreciate some advice.
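One way to make the problem above concrete is to measure the false-positive rate on the unmodified control sample at different probability cutoffs. This is a minimal sketch, not the tool's actual evaluation code; it assumes the trained model outputs a per-base array of modified probabilities (the array name and threshold values are hypothetical):

```python
import numpy as np

def control_false_positive_rate(probs, threshold=0.5):
    """Fraction of candidate bases in an unmodified control that are
    called as modified at the given probability threshold.
    `probs` is a 1-D array of per-base modified probabilities
    (hypothetical output of the trained model on the control sample)."""
    probs = np.asarray(probs, dtype=float)
    return float(np.mean(probs >= threshold))

# Toy probabilities standing in for real model output on the control;
# sweeping the cutoff shows how far it must rise before the control
# stops producing modification calls.
control_probs = np.array([0.9, 0.8, 0.6, 0.3, 0.7])
for t in (0.5, 0.7, 0.9, 0.95):
    print(f"threshold={t}: FPR={control_false_positive_rate(control_probs, t):.2f}")
```

If the false-positive rate on the control stays high even at strict cutoffs, the issue is likely overconfident (poorly calibrated) outputs rather than just a bad threshold.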
weir12 changed the title on Dec 2, 2019: "Trained models tend to give too high a modified probability value for each candidate base" → "Trained model tend to give too high a modified probability for each candidate base"