Detecting annotation discrepancies #52

Closed

fruce-ki opened this issue Mar 8, 2018 · 1 comment

fruce-ki commented Mar 8, 2018

In response to #49 .

Although users are explicitly instructed not to mix up annotations, it seems prudent to have RATs at least produce some warnings.

This can be done either

  • at the end of the run, by leveraging NA values in certain fields, or
  • at the beginning of the run, as a thorough pre-check of the ID sets across the inputs that aborts the run on any mismatch, potentially with a force option to override the abort.
@fruce-ki fruce-ki added the enhancement Suggestions for better performance or better presentation. label Mar 8, 2018
@fruce-ki fruce-ki self-assigned this Mar 8, 2018

fruce-ki commented Mar 8, 2018

I've decided it is best to encourage good practices rather than try to clean up after bad ones.
Therefore, any mismatch of transcript IDs between the annotation and the quantifications will now abort the run.

I am still unsure whether to allow an explicit override. It would certainly simplify updating the unit tests, as the testing dataset deliberately contains cases with inconsistent annotation, which trip the new abort condition and cause all the tests to fail.
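
A minimal sketch of what such a check might look like (hypothetical function and argument names, not the actual RATs implementation): compare the transcript IDs from the annotation against those in each quantification table, abort on mismatch by default, and only warn when an explicit override is requested.

```r
## Hypothetical sketch of the ID consistency check (not RATs code).
## annot_ids:       character vector of transcript IDs from the annotation
## quant_ids_list:  list of character vectors, one per quantification table
## force:           if TRUE, warn instead of aborting on mismatch
check_transcript_ids <- function(annot_ids, quant_ids_list, force = FALSE) {
  mismatched <- vapply(quant_ids_list,
                       function(ids) !setequal(ids, annot_ids),
                       logical(1))
  if (any(mismatched)) {
    msg <- paste("Transcript IDs differ between the annotation and",
                 sum(mismatched), "quantification table(s).")
    if (force) {
      warning(msg)   # explicit override: warn and carry on
    } else {
      stop(msg)      # default behaviour: abort the run
    }
  }
  invisible(TRUE)
}
```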

@fruce-ki fruce-ki added this to the 0.6.3 milestone Mar 8, 2018
fruce-ki added a commit that referenced this issue Mar 9, 2018
- Explicitly check the transcript IDs in the annotation and the quantifications for inconsistencies, and abort if any are found.
- Add an abort override option for special use cases.
- Update docs and tests.
- Tidy up the input check tests, breaking them into smaller tests.
- Get rid of obsolete and almost certainly broken functions for data simulation.
@fruce-ki fruce-ki closed this as completed Mar 9, 2018