The dumps pipeline produces a file with many triples that ROBOT cannot parse. #23
I note that the text in the ROBOT docs says "this is often because", suggesting there are other possible causes.
If @hkir-dev can grep a tiny example from the dump that triggers the ROBOT warning, we can easily find the reason. A very typical problem I have encountered is a complex "source" or "target" in reification, which must be atomic. But one look at a failing minimal example and we can tell.
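As a rough sketch of extracting such a minimal example, one could scan the N-Triples dump for every line that mentions a failing term (the filename and subject IRI in the usage comment are hypothetical):

```python
def triples_mentioning(path, term):
    """Yield N-Triples lines from `path` that mention `term`
    (an IRI fragment or blank-node label) anywhere in the triple."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            if term in line:
                yield line.rstrip("\n")

# Hypothetical usage against the VFB dump:
# for t in triples_mentioning("vfb-dump.nt", "VFB_00029522"):
#     print(t)
```

Piping the matching lines into a small `.nt` file and loading just that file with ROBOT should reproduce (or rule out) the warning in isolation.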
Minimal example attached.
The output seems to be noise; see ontodev/robot#965. The triples parse and convert just fine.
Sorry, didn't mean to close it (reflex). Up to you :)
Possible cause: http://robot.obolibrary.org/errors
Could the pipeline somehow be producing axioms following RDF reification?
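For reference, classic RDF reification describes a statement with a blank node typed as rdf:Statement plus rdf:subject/rdf:predicate/rdf:object triples; an OWL parser expecting OWL axiom annotations would not recognise that pattern. A minimal sketch of spotting it in a dump (the example fragment and IRIs below are made up):

```python
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

# Hypothetical dump fragment using classic RDF reification.
dump = f"""\
_:s1 <{RDF}type> <{RDF}Statement> .
_:s1 <{RDF}subject> <http://example.org/a> .
_:s1 <{RDF}predicate> <{RDF}type> .
_:s1 <{RDF}object> <http://example.org/B> .
"""

def reified_statement_nodes(ntriples):
    """Return blank-node labels typed as rdf:Statement."""
    marker = f"<{RDF}type> <{RDF}Statement>"
    return [line.split()[0] for line in ntriples.splitlines() if marker in line]

print(reified_statement_nodes(dump))  # -> ['_:s1']
```

If a scan like this over the real dump comes back empty, reification can be ruled out as the cause.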
Investigating an example:
Here's one of the unparsed triples reported:
<http://virtualflybrain.org/reports/VFB_00029522> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> _:genid-nodeid-genid-4841781bfabd4a7b98ac303327e83a13-node1fpl1cvm3x508939 .
I expected this to be related to blank nodes used for reification in axiom annotation; however, looking at that blank node in the triplestore, it appears to be a simple, unannotated type axiom:
Looking at PDB, this type restriction appears to be present
Any idea what's going on? Half a million unparsed triples is at least a bad smell, even if we're not sure what the consequences might be.
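One cheap check worth running (a sketch under the assumption that the dump is N-Triples): find blank nodes that appear as the object of a triple but are never themselves a subject. Such dangling blank nodes are a common reason an OWL parser discards triples, and would match the reported line:

```python
def dangling_bnode_objects(ntriples):
    """Blank nodes used as an object but never defined as a subject."""
    subjects, objects = set(), set()
    for line in ntriples.splitlines():
        parts = line.split()
        if len(parts) < 4:
            continue
        if parts[0].startswith("_:"):
            subjects.add(parts[0])
        if parts[2].startswith("_:"):
            objects.add(parts[2])
    return objects - subjects

# Fragment mirroring the reported triple (blank-node label shortened):
frag = "<http://virtualflybrain.org/reports/VFB_00029522> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> _:genid1 ."
print(dangling_bnode_objects(frag))  # -> {'_:genid1'}
```

If the half-million unparsed triples largely point at blank nodes that are fully defined elsewhere in the same file, that would support the "noise" reading from ontodev/robot#965.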