Is `type` a required field for Schema Objects? #1657
The latter: `type` is optional, and if it is present, its value must be a string.
Additionally, from the spec:
So I can see the potential for confusion. If you think a wording clarification would help, that might be worth proposing.
And what would be the default fallback type if no `type` is specified? `string`? This should also be mentioned.
@silkentrance, the `type` keyword is an optional assertion. If you don't include it, values of any type are allowed. This isn't mentioned explicitly in the OpenAPI spec, because in this respect OpenAPI follows the same semantics as the underlying JSON Schema spec.
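A minimal sketch of those semantics, using a hand-rolled checker rather than a real validator (the `conforms` helper and its type map are illustrative only; a conformant implementation such as the `jsonschema` library behaves the same way for these two keywords):

```python
def conforms(value, schema):
    """Minimal sketch of two JSON Schema keywords: `type` and `properties`.

    `type` is an optional assertion: when absent, values of any type pass.
    `properties` only constrains values that are objects (dicts); it does
    not itself require the value to be an object.
    """
    type_map = {
        "string": str, "integer": int, "number": (int, float),
        "boolean": bool, "array": list, "object": dict, "null": type(None),
    }
    if "type" in schema and not isinstance(value, type_map[schema["type"]]):
        return False
    if isinstance(value, dict):
        for name, subschema in schema.get("properties", {}).items():
            if name in value and not conforms(value[name], subschema):
                return False
    return True

schema = {"properties": {"name": {"type": "string"}}}  # no "type" keyword

print(conforms({"name": "Ada"}, schema))  # True: object with a valid property
print(conforms(42, schema))               # True: `type` absent, any type passes
print(conforms({"name": 42}, schema))     # False: present properties must conform
```

Note the middle case: a plain integer passes a schema that only declares `properties`, which is exactly the behavior that surprises people.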
@tedepstein this might work well with dynamically typed languages, but with statically typed languages it represents a problem, as `Object`, at least in Java, will not be able to carry the payload. As such, code generator implementers would need hints advising them on how to represent such an unspecified type, e.g. `Any` as in XML Schema. And since no assumption can be made about the actual type, `Any` would be the best fit code-generation-wise, requiring the value to be cast/unmarshalled to its actual data type using a best-guess, trial-and-error strategy. Which of course is a no-no, since the consumer, that is, the implementer of the so-defined API, now has to disambiguate the ambiguous type.

Why the need to stick that closely to the JSON Schema specification? OAS is not abstract; it basically boils down to code generation and providing people with useful APIs and frameworks, using the idioms of the target frameworks, languages, and platforms for which the code is generated. Or do you really want to replicate CORBA? (just kidding)
@silkentrance, agreed, this is one of those areas where JSON Schema's flexibility as a validation language makes it an imperfect fit for type description and code generation. I know that JSON Schema is introducing "vocabularies" as a language feature to add extended metadata, but I haven't studied that yet. In the meantime, our editing tools encourage users to specify a `type`.
@tedepstein and maybe, just maybe, since the OAS is defined using JSON Schema, there is a slight misunderstanding between what the meta-schema of OAS is, which is basically JSON Schema, and what the concrete schema of the so-defined APIs and types is, which is basically an applied customisation of JSON Schema that is in turn either more restrictive or more extensible, as in vendor extensions. I sometimes also get confused by all the meta-meta-meta levels when defining a language or system. Therefore, when not specifying a `type`…

See also #1649
Ultimately this question arose while supporting the edge case (which appears often due to typos) of code generation for Swift, Kotlin and TypeScript. When a user forgets to enter a `type`…
I should point out that OAS itself is not defined using a JSON schema. An OpenAPI document (i.e. a description of an API, conforming to the OAS) takes the form of an object graph that can be partially described and validated using JSON Schema. But the written specification is the authoritative source. Any "official" or unofficial JSON Schema describing the OAS itself is supplementary, not considered definitive.
Getting lost in the meta-levels is an occupational hazard. ;-)
That would be a further break from JSON Schema. The OpenAPI TSC wants to explore convergence with a future draft of JSON Schema, such that Schema Object fully supports the syntax and semantics of JSON Schema. The overall consensus seems to be that this would be valuable, though I don't think they've committed to this yet.
We currently treat it as a warning in our editors, though again this is a bit of a divergence from the OAS standard, so we plan to make that warning configurable.

A particularly common form of this is a schema that omits `type`, but specifies `properties`. Strictly speaking, this does not mean that the value must be an object. It means that if the value is an object, and it includes any of those properties, the property values must conform to the corresponding property subschemas.

In reality, this construct almost always means that the user intends `type: object`, and I think it would be reasonable for a code generator to assume this, maybe with a `validation: strict|lax` config option to control that behavior.

There are lots of other cases where annotations like…
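That assumption can be sketched as a small inference helper; the function name and the `strict`/`lax` knob below are hypothetical, mirroring the suggested `validation: strict|lax` config option:

```python
def infer_type(schema, validation="lax"):
    """Guess the intended `type` of a schema that omits it.

    In lax mode, fall back to keyword-based inference; in strict mode,
    refuse to guess. Both the name and the knob are illustrative.
    """
    if "type" in schema:
        return schema["type"]
    if validation == "strict":
        raise ValueError("schema omits 'type' and strict validation is on")
    # Keyword-based heuristics: these assertions strongly imply a type.
    if "properties" in schema or "required" in schema:
        return "object"
    if "items" in schema:
        return "array"
    if any(k in schema for k in ("pattern", "minLength", "maxLength")):
        return "string"
    return None  # genuinely untyped: anything goes

print(infer_type({"properties": {"id": {"type": "integer"}}}))  # object
```

The `None` fallback is the honest answer for a bare `{}` schema: there is nothing to infer, and a generator has to emit its `Any`-equivalent.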
@tedepstein I think that allowing users to configure type assumptions is a good practical compromise. While it encourages people to write schemas that might behave surprisingly in other environments, for folks who are just using schemas within OAS it's probably better to work with what people are going to do anyway. I would default to strict conformance, if for no other reason than to make users think about it for half a second and consider interoperability, but as you note there are quite a few ways to deterministically infer a type.
@tedepstein In my understanding, the schema of OAS should be, and in the case of Swagger is, JSON Schema.
and code-generation-wise, which generates code based on the meta-3 level, whilst the underlying client/service framework utilises the JSON Schema at the meta-2 level for validating requests/responses against the internally generated JSON Schema, before they ever get de/serialised into real objects of the target language or for transmission over the wire.

As I see it, neither the existing framework nor the existing code generators add the extra meta-2 level, or when they do, they encode it in the generated classes. The latter is the problem here, as the framework will need to make best guesses about the actual type, which may or may not be defined by meta-3. For XML it is rather simple: either it is a simple type, and when there is no `xsi:type` information it is `xs:string`, or it is an element with arbitrary content.

In effect, the user will get a generic `JsonNode` type (Jackson, Java), in no way being able to derive any type information from it, e.g. whether it is a `MyPreviousObject` or a simple string or whatever, basically having to second-guess the type themselves. Or, considering for example OpenProject and other such solutions, they will introduce a type discriminator that allows them to discern the actual type, using yet another framework for doing so.

As for the missing type information, this will pose extra work on the framework which, while realising the meta-2 level, now has to undergo a trial-and-error phase where it tries to validate whether a given parameter or payload matches any of the available meta-3-declared parameter/content/object/type schemas.

In conclusion, I strongly believe that at the OAS meta-3 level there must always be type information, even when targeting JavaScript clients or NodeJS.
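The trial-and-error phase described above might look like the following hypothetical dispatcher (the `match_schema` helper and the candidate schemas are invented for illustration; real frameworks typically avoid this by requiring a discriminator):

```python
def match_schema(payload, candidates):
    """Return the names of candidate schemas the payload conforms to.

    Without a declared `type` (or a discriminator), a framework can only
    try every known schema and collect the ones that fit, which is both
    costly and potentially ambiguous.
    """
    def fits(value, schema):
        # Deliberately shallow: just enough checking to show the idea.
        if schema.get("type") == "object":
            if not isinstance(value, dict):
                return False
            return all(k in value for k in schema.get("required", []))
        if schema.get("type") == "string":
            return isinstance(value, str)
        return True  # untyped schema: everything fits

    return [name for name, s in candidates.items() if fits(payload, s)]

candidates = {
    "Book":  {"type": "object", "required": ["title", "author"]},
    "User":  {"type": "object", "required": ["login"]},
    "Label": {"type": "string"},
}
print(match_schema({"title": "Dune", "author": "Herbert"}, candidates))
# One match here, but overlapping shapes would yield several
```

Note that an untyped candidate schema matches every payload, which is precisely the ambiguity being complained about.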
In addition, the missing type information could also be replaced by either `oneOf` or `allOf`. But none of `type`, `oneOf` or `allOf` must ever be missing, or at least…
@handrews leaving out the type information or including it has nothing to do with adhering to strict conformance. Whether it is optional, which I strongly believe it must not be, as that is very specific to untyped dynamic languages, or whether it is mandatory in the form of either `type`, `oneOf` or `allOf` (including `type` as an extension), is not the question. It is about OAS targeting concrete consumers of the specification, which are basically user-implemented clients and services, using either strongly typed or laxly typed languages. JSON Schema, on the other hand, is an altogether different beast, if you don't mind me saying so.
What I'm saying about OAS and JSON Schema is something I've seen stated here, and repeated in several other contexts. There is no official JSON Schema for OAS3 as of yet, and to quote @webron:
All this is to say that OpenAPI is not defined in terms of a JSON Schema. It's defined by the written, human-readable specification. Anyway, I think this is incidental to the discussion of whether `type` is a required field.
This is one possible way of defining layers in a solution stack, but it's not the only way. There are many cases where OAS is not involved at all in the implementation of clients or servers; it's used separately to produce API documentation used by a client developer.
Sorry, which framework, and which code generators? There are many. If you're talking about Swagger-Codegen, Swagger-Core, etc., those are separate implementation projects, and the implementers there made their own decisions about validation.
If those implementations don't include message validation, it could just be that they never got around to it, or didn't think it was an essential feature. I would not assume that they were put off by the fact that `type` is optional.
I think this is all a way of saying that OpenAPI allows varying degrees of precision and completeness, and not every OpenAPI document is suitable as input for a code generator or an interpretive runtime framework. Those generators or frameworks can try to infer missing type information based on other properties. But clearly it's extra work to implement this inference, and it's not guaranteed to produce the right behavior.
I don't think I would support a proposal to make `type` required. Aside from being a breaking change, it would limit the use of OpenAPI to describe some APIs in the wild that actually do allow more than one data type for certain parameters or properties. I don't think this is good API design, but OpenAPI should be able to describe bad APIs as well as good ones.

That said, optional typing is a legitimate challenge for code generators and implementation frameworks. So we can consider a range of possible ways to deal with this…
This should never be the case. The behaviour of the system must always be predictable.
@tedepstein there are two aspects of JSON Schema vocabularies (which I'm still in the middle of writing up, as it's a series of non-trivial PRs to the draft and I've been in summer vacation mode for a while) which can help here.

One is the obvious part: adding a set of new keywords as a vocabulary that you can declare you are using. This is not unlike declaring either the validation or hyper-schema "vocabularies" that informally exist today, which is done with `$schema`. This part will let us define a code generation vocabulary with new keywords, e.g. …

The less obvious part is being able to easily customize meta-schemas, as declaring your vocabulary and declaring your meta-schema can be done separately (currently, all you get is `$schema`). Custom meta-schemas will probably allow defining additional keywords as a kind of light-weight anonymous vocabulary, but the main point of them will be to allow for more restrictive meta-schemas. Such a meta-schema could make `type` required.

A code generation tool could easily apply such a strict meta-schema to a schema intended for use by the tool, and reject it if it does not pass. This is the appropriate place for such a tighter check, not the OAS specification itself.
@handrews, any thought of allowing a vocabulary to add constraints, such as making `type` required?
@tedepstein you can add constraints with a custom meta-schema, although the distinction there is a bit fuzzy as I haven't finished nailing it all down. For the most part, vocabularies should add keywords (including keywords in objects like the LDO when appropriate), and not restrict other vocabularies. But there is already a way to restrict existing vocabularies with custom meta-schemas, and we will make that easier.

Currently, you only get one URI to indicate your meta-schema, which is effectively your vocabulary. If that meta-schema is using… Instead, meta-schemas will declare what vocabularies they follow, and then will be able to add further validation rules which can constrain those vocabularies.

It might look something like this (I don't even know if this is the same syntax as in the last example I came up with, so take it with a grain of salt). Bear with me here; it's a bit messy as we're still nailing down some tricky details. Note that it's relatively rare for people to write meta-schemas compared to schemas, so a little complexity is more acceptable here than in, say, object validation.

A schema using the custom meta-schema:

```json
{
    "$schema": "https://example.com/meta-schema#",
    ...
}
```

The meta-schema itself:

```json
{
    "$schema": "https://json-schema.org/draft-08/schema#",
    "$id": "https://example.com/meta-schema",
    "$vocabularies": [
        "https://json-schema.org/draft-08/vocabularies/schema",
        "https://json-schema.org/draft-08/vocabularies/code-gen"
    ],
    "$recursiveRoot": true,
    "required": ["type"],
    "allOf": [
        {"$ref": "https://json-schema.org/draft-08/schema"},
        {"$ref": "https://json-schema.org/draft-08/code-gen"}
    ]
}
```

What this says is that anything using this meta-schema:

* declares that it uses the draft-08 core vocabulary plus a code generation vocabulary (`$vocabularies`);
* must validate against both of the referenced standard meta-schemas (the `allOf`);
* must additionally declare a `type`, and this applies recursively to subschemas (`$recursiveRoot`).
None of this is in the spec yet, although it's all just about ready for PRs as soon as I have a chance. My view is that by making it easy to implement restrictions in custom meta-schemas, which are how you declare your use of vocabulary combinations anyway, we will discourage vocabulary authors from interfering with each other's keywords. Notably, the spec will declare that the behavior of vocabularies that define incompatible semantics for the same keyword is undefined. Don't do it.
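Without full meta-schema machinery, a code generation tool could approximate such a strict check by walking the schema itself. This is a sketch of the intent only, not an implementation of draft-08 `$vocabularies`; the function name and skip list are invented:

```python
def check_types_declared(schema, path="#"):
    """Collect subschema locations that omit `type`, imitating the effect
    of a strict meta-schema that adds `"required": ["type"]`.

    Subschemas that delegate typing elsewhere ($ref or the *Of combinators)
    are given a pass in this sketch.
    """
    missing = []
    if isinstance(schema, dict):
        delegates = ("$ref", "oneOf", "allOf", "anyOf")
        if "type" not in schema and not any(k in schema for k in delegates):
            missing.append(path)
        for name, sub in schema.get("properties", {}).items():
            missing += check_types_declared(sub, f"{path}/properties/{name}")
        if "items" in schema:
            missing += check_types_declared(schema["items"], f"{path}/items")
    return missing

schema = {
    "type": "object",
    "properties": {
        "id": {"type": "integer"},
        "tags": {"items": {"type": "string"}},  # omits its own `type`
    },
}
print(check_types_declared(schema))  # ['#/properties/tags']
```

A generator would run this up front and reject the document (or warn) before emitting any code, which is exactly the "apply a strict meta-schema and reject on failure" workflow described above.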
Thanks, @handrews. I will probably need to come back to this later, as I don't have time to fully process it now.
There's some interesting prior art you might want to consider here, with UML profiles. (This may or may not be your idea of a good reference point, but profiles have been widely implemented by tool providers, and used in some demanding applications.) UML profiles can extend other profiles, and IIUC those extensions can include additional constraints. As long as there are no cycles in the extension graph, the constraints are cumulative, and there's no conflict. If it's OK with you, I'd like to leave this as a placeholder for a future discussion.
@handrews @tedepstein Regarding JSON Schema vocabularies and UML profiles, I think this would best be discussed in the JSON Schema project. I am with @tedepstein here, however: the current approach targeted by JSON Schema is more or less an extension to the schema itself, whereas UML profiles, for our purpose that is, simply add configurations as in vendor extensions, say…
This is an interesting discussion, and the information is valuable. We have a constant challenge when it comes to defining the spec when we consider the implications. There are both consumers and producers to keep in mind, and various tools that work around the API life cycle, development languages and so on. I doubt we're ever going to make `type` mandatory…
As we end up saying... we're never going to make everyone happy.
`Any` can be represented in most strongly typed languages. Less strongly typed languages tend to let you represent anything you want as an `Any` or `Object` type, and it can be interpreted at the consumer's leisure. More strongly typed languages tend to have an enum or tagged union type that can represent this without issue. If one assumes that "any" can only mean the JSON spec's small group of options, a…
Much the same as above: I haven't had trouble representing `oneOf` or `anyOf` in any strongly typed language I've targeted. Discriminators are ultimately an interface with a single field, and you can match on a closed set of types to discriminate between the possible objects. In some languages you have a loosely connected pile of classes linked with a helper function, whereas in stronger ones you can use a proper tagged union.
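The tagged-union-with-discriminator approach reads roughly like this (the `petType` discriminator and the `Cat`/`Dog` classes are hypothetical, in the spirit of the OAS `discriminator` object):

```python
from dataclasses import dataclass

# A oneOf plus a discriminator maps naturally onto a closed set of classes
# and a dispatch table keyed on the discriminator property.

@dataclass
class Cat:
    name: str
    hunts: bool

@dataclass
class Dog:
    name: str
    bark_volume: int

DISCRIMINATOR = "petType"           # hypothetical discriminator property
MAPPING = {"cat": Cat, "dog": Dog}  # the closed set of possible types

def parse_pet(payload: dict):
    """Pick the concrete class from the discriminator, then unmarshal.

    Note: pops the discriminator out of the payload, so pass a fresh dict.
    """
    kind = payload.pop(DISCRIMINATOR)
    return MAPPING[kind](**payload)

pet = parse_pet({"petType": "cat", "name": "Misha", "hunts": True})
print(type(pet).__name__)  # Cat
```

In a language with real sum types the `MAPPING` table becomes a `match` on the union's variants; the dispatch idea is identical.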
We already have the practical reality of XML appearing in some APIs, sometimes alongside JSON APIs within the same document. Implementing this in strongly typed languages is usually not much of a problem either, as you can add enough hints that the parsing tools you glue to it can work out what is meant to happen without any real trouble. Adding the 'complexity' of things like protobuf doesn't make things more challenging beyond having to recognise more MIME types and adding handlers.
@bbqsrc absolutely. I'm not saying there are no solutions, and the statements I made were generally in regard to making `type` required.
@webron Is there somewhere that we can track and follow along on the feature to allow users to use any payload schema format? I'm particularly interested in using protobuf as the payload format. |
@charlie430 , I think you're looking for the Alternative Schemas feature proposed in #1532. You can subscribe to notifications on that feature. There's also an open pull request #1736 currently under review. |
OK, having slogged through this again, I'm reaching the following conclusions:
Given all of that, and that discussion stalled a year ago, I'm going to go ahead and close this. If anyone wants to discuss custom meta-schemas for this in OAS 3.1, that's worth a new issue :-)
If a schema has no "type" key, and ONLY has a "properties" key, then we can reasonably assume that the type is "object". See OAI/OpenAPI-Specification#1657. Excerpt:

> A particularly common form of this is a schema that omits type, but specifies properties. Strictly speaking, this does not mean that the value must be an object. It means that if the value is an object, and it includes any of those properties, the property values must conform to the corresponding property subschemas. In reality, this construct almost always means that the user intends type: object, and I think it would be reasonable for a code generator to assume this, maybe with a validation: strict|lax config option to control that behavior.
* Handle schemas with no type. If a schema has no "type" key, and ONLY has a "properties" key, then we can reasonably assume that the type is "object". See OAI/OpenAPI-Specification#1657.
* Improve code as per @oliyh's suggestion.
…fix Infragistics AppBuilder tooling (#2283)

## Why make this change?

- Closes #2212, which describes how the missing `"type":"object"` key/value pair on the response child object schema breaks certain client tooling, in this case Infragistics AppBuilder.

### Background

I found a relevant thread that discusses whether `type` is a required property. Consensus is that `type` isn't required: OAI/OpenAPI-Specification#1657 (OpenAPI-Specification discussion). PaloAltoNetworks/docusaurus-openapi-docs#430 is an example of how different tooling handles `type`, or a missing `type`, differently. Ultimately, different tooling handles the presence of the `type` property differently; some may try to guess the type when it is not present. The fact that `type` isn't required means that https://github.com/microsoft/OpenAPI.NET didn't complain about the missing `type`. An error, if `type` were required, would have helped prevent this becoming an issue in the first place.

## What is this change?

- Adds `"type": "object"` to the OpenAPI document for describing the response schema:

```json
"responses": {
    "200": {
        "description": "OK",
        "content": {
            "application/json": {
                "schema": {
                    "type": "object", // <--- This property/value
                    "properties": {
                        "value": {
                            "type": "array",
                            "items": { "$ref": "#/components/schemas/Book" }
                        },
                        "nextLink": { "type": "string" }
                    }
                }
            }
        }
    }
}
```

## How was this tested?

- [x] Integration Tests
- [ ] Unit Tests

## Sample Request(s)

View the generated schema at:

```https
GET localhost:5001/api/openapi
```

Co-authored-by: Abhishek Kumar <[email protected]>
I am wondering how to interpret the following line of OpenAPI 3 (as of v3.0.1):

> `type` - Value MUST be a string. Multiple types via an array are not supported.

Does this mean that the `type` field is mandatory and must be a string, or that it is optional and, if it exists, it must be a string?