I'm working to use https://github.com/json-schema-org/json-schema-spec to validate SigMF metadata.
I'm finding that the schema-dictated maximum value for several properties is very problematic, and I wonder if there's a good rationale for the current value:
18446744073709552000 = 2^64 + 384
I would like to suggest that the value 2**63-1 be used instead, as this makes it much easier for libraries like json-schema-spec to implement such a constraint.
This would, admittedly, break any existing files out there with values > 9223372036854775807. But given the number of elementary particles in the universe, I don't think that's going to be a problem in practice.
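To make the proposal concrete, here's a sketch of what an affected property's bounds could look like with the suggested maximum. This is an illustrative fragment only — the surrounding structure is a placeholder, not the actual SigMF schema:

```python
import json

# Illustrative fragment only: the real SigMF schema has many such properties.
# The proposed change is just the "maximum" value.
fragment = {
    "type": "integer",
    "minimum": 0,
    "maximum": 2**63 - 1,  # 9223372036854775807 == INT64_MAX; fits a signed 64-bit integer
}
print(json.dumps(fragment, indent=2))
```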
Like, it's one thing to enforce such a limit in Python; it's something else to do so in generic C++. The implementation of json-schema-spec uses a long to hold such a constraint, on a JSON element of type unsigned int (that's a nlohmann/json thing).
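A quick stdlib-Python illustration of the range problem, using `struct` as a stand-in for the C++ signed 64-bit type that the validator stores the bound in:

```python
import struct

INT64_MAX = 2**63 - 1                # 9223372036854775807, the proposed maximum
schema_max = 18446744073709552000    # the current schema maximum

# The proposed bound fits a signed 64-bit integer (C/C++ int64_t, LP64 long):
struct.pack('<q', INT64_MAX)

# The current bound does not -- it exceeds the signed 64-bit range entirely:
try:
    struct.pack('<q', schema_max)
except struct.error:
    print("current schema maximum does not fit in a signed 64-bit integer")
```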
What is the origin of 18446744073709552000 anyway?
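Speculation on my part, but the arithmetic strongly suggests an answer: 18446744073709552000 is the shortest decimal string that parses back to the IEEE-754 double 2**64, i.e. what a JavaScript-based tool prints when it serializes 2**64 as a Number. A quick Python check:

```python
# The schema's odd maximum appears to be the shortest round-trip decimal
# rendering of the double 2**64 (speculation; not confirmed by the spec authors).
exact = 2**64                        # 18446744073709551616
schema_max = 18446744073709552000    # the value in the schema

# The offset noted above:
assert schema_max - exact == 384

# Both decimal spellings denote the very same double: 2**64 is a power of
# two, hence exactly representable, doubles just above it are 4096 apart,
# and 384 < 2048 (half a spacing), so schema_max rounds to 2**64.
assert float(schema_max) == float(exact) == 2.0**64

# Python's shortest round-trip rendering of that double is the schema value:
assert repr(float(exact)) == '1.8446744073709552e+19'
```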
gmabey added a commit to gmabey/SigMF that referenced this issue on Jan 31, 2025
In the interest of two witnesses, I used this online tool to generate some fictitious .sigmf-meta data: https://www.liquid-technologies.com/online-schema-to-json-converter
When I pasted the contents of sigmf-schema.json into the text box, I got an error: "Specified cast is not valid."
BUT, when I pasted the contents of #335, it was able to produce something.