
Is type a required field for Schema Objects? #1657

Closed
bbqsrc opened this issue Aug 7, 2018 · 28 comments

Comments

@bbqsrc

bbqsrc commented Aug 7, 2018

I am wondering how to interpret the following line of OpenAPI 3 (as of v3.0.1):

type - Value MUST be a string. Multiple types via an array are not supported.

Does this mean that the type field is mandatory and must be a string, or that it is optional and if it exists it must be a string?

@darrelmiller
Member

The latter. type is not a required property in JSON Schema, nor in OAS Schema Object.
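To make those semantics concrete, here is a minimal hand-rolled sketch (not a real JSON Schema validator) of `type` as an optional assertion: when absent it constrains nothing, so values of any type pass.

```python
# Minimal sketch of the `type` keyword as an *optional* assertion:
# an absent `type` makes no assertion, so any value is allowed.

JSON_TYPES = {
    "null": type(None), "boolean": bool, "object": dict,
    "array": list, "number": (int, float), "string": str,
    "integer": int,
}

def type_keyword_allows(schema: dict, value) -> bool:
    """True if `value` satisfies the schema's `type` keyword, if any."""
    if "type" not in schema:
        return True  # no assertion made: any type is allowed
    # bool subclasses int in Python, but JSON booleans are not numbers
    if isinstance(value, bool) and schema["type"] != "boolean":
        return False
    return isinstance(value, JSON_TYPES[schema["type"]])

assert type_keyword_allows({}, 42)                    # no type: passes
assert type_keyword_allows({"type": "string"}, "ok")
assert not type_keyword_allows({"type": "string"}, 42)
```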

@MikeRalphson
Member

Additionally, from the spec:

In the following description, if a field is not explicitly REQUIRED or described with a MUST or SHALL, it can be considered OPTIONAL.

So I can see the potential for confusion. If you think a wording clarification (such as If present, value MUST ...) would be beneficial, then a PR would be gratefully received.

@silkentrance

silkentrance commented Aug 16, 2018

And what would be the default fallback type, if no type was specified? string? This should also be mentioned.

@tedepstein
Contributor

@silkentrance , the type keyword is an optional assertion. If you don't include it, values of any type are allowed.

This isn't mentioned explicitly in the OpenAPI spec, because in this respect OpenAPI follows the same semantics as the underlying JSON Schema spec.

@silkentrance

silkentrance commented Aug 16, 2018

@tedepstein this might work well with dynamically typed languages, but for statically typed languages it represents a problem, as Object, at least in Java, will not be able to carry the payload. As such, code generator implementers would need hints advising them on how to represent such an unspecified type, e.g. Any as in XML Schema. And since no assumption can be made about the actual type, Any would be the best fit code-generation-wise, requiring the value to be cast/unmarshalled to its actual data type using a best-guess/trial-and-error strategy.

Which of course is a no-no, since the consumer (that is, the implementer of the so-defined API) now has to disambiguate the ambiguous type.

Why do you feel the need to stick that closely to the JSON Schema specification? OAS is not something abstract; it basically boils down to code generation and providing people with useful APIs and frameworks, using the idioms of the target frameworks, languages, and platforms for which the code is being generated. Or do you really want to replicate CORBA? (just kidding)

@tedepstein
Contributor

@silkentrance , agreed, this is one of those areas where JSON Schema's flexibility as a validation language makes it an imperfect fit for type description and code generation.

I know that JSON Schema is introducing "vocabularies" as a language feature to add extended metadata, but I haven't studied that yet. In the meantime, our editing tools encourage users to specify a type. By default, we treat missing type as a warning, though we're now making this configurable as it puts us out of strict conformance to the OpenAPI spec.

@silkentrance

silkentrance commented Aug 16, 2018

@tedepstein and maybe, just maybe, since the OAS is defined using JSON Schema, there is a slight misunderstanding about what the meta schema of OAS is, which is basically JSON Schema, and what the concrete schema of the so-defined APIs and types is, which is basically an applied customisation of JSON Schema, which in turn is either more restrictive or more extensible, as in vendor extensions. I sometimes also get confused by all the meta meta meta levels when defining a language or system.

Therefore, when no type is specified explicitly, OAS should consider the type to be string. The user can then decide how to handle the value of either parameter or property.

See also #1649

@bbqsrc
Author

bbqsrc commented Aug 18, 2018

Ultimately this question arose while supporting the edge case (which appears often due to typos) of code generation for Swift, Kotlin and TypeScript. When a user forgets to enter type, we now simply treat this as an error for the purpose of code generation as there is no meaningful step the generator can take without causing potentially worse effects by taking a guess based on the other fields.

@tedepstein
Contributor

tedepstein commented Aug 18, 2018

@silkentrance,

and maybe, just maybe, since the OAS is defined using JSON Schema, there is a slight misunderstanding about what the meta schema of OAS is, which is basically JSON Schema...

I should point out that OAS itself is not defined using a JSON schema. An OpenAPI document (i.e. a description of an API, conforming to the OAS) takes the form of an object graph that can be partially described and validated using JSON Schema.

But the written specification is the authoritative source. Any "official" or unofficial JSON Schema describing the OAS itself is supplementary, not considered definitive.

...and what the concrete schema of the so-defined APIs and types is, which is basically an applied customisation of JSON Schema, which in turn is either more restrictive or more extensible, as in vendor extensions. I sometimes also get confused by all the meta meta meta levels when defining a language or system.

Getting lost in the meta-levels is an occupational hazard. ;-)

Therefore, when no type is specified explicitly, OAS should consider the type to be string. The user can then decide how to handle the value of either parameter or property.

That would be a further break from JSON Schema. The OpenAPI TSC wants to explore convergence with a future draft of JSON Schema, such that Schema Object fully supports the syntax and semantics of JSON Schema. The overall consensus seems to be that this would be valuable, though I don't think they've committed to this yet.

@bbqsrc,

When a user forgets to enter type, we now simply treat this as an error for the purpose of code generation as there is no meaningful step the generator can take without causing potentially worse effects by taking a guess based on the other fields.

We currently treat it as a warning in our editors, though again this is a bit of a divergence from the OAS standard, so we plan to make that warning configurable.

A particularly common form of this is a schema that omits type, but specifies properties. Strictly speaking, this does not mean that the value must be an object. It means that if the value is an object, and it includes any of those properties, the property values must conform to the corresponding property subschemas.

In reality, this construct almost always means that the user intends type: object, and I think it would be reasonable for a code generator to assume this, maybe with a validation: strict|lax config option to control that behavior.

There are lots of other cases where annotations like format, or assertions like maxLength or multipleOf can give a clue as to the intended type. And assuming type: string as the default might even be reasonable in some cases. Just be aware that any behaviors like this are out of strict conformance with the spec, and are prone to misinterpretation.
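A hypothetical sketch of the inference described above, with a `strict|lax` switch; the keyword-to-type hint table is an illustrative assumption, not anything the OAS or JSON Schema specs define.

```python
# Hypothetical type-inference sketch for a code generator, with a
# strict|lax switch. The hint table below is an assumption for
# illustration, not mandated by OAS or JSON Schema.

TYPE_HINTS = {
    "properties": "object", "required": "object",
    "items": "array", "maxItems": "array", "minItems": "array",
    "maxLength": "string", "minLength": "string", "pattern": "string",
    "multipleOf": "number", "maximum": "number", "minimum": "number",
}

def resolve_type(schema: dict, validation: str = "strict") -> str:
    """Return the schema's type, inferring it in lax mode."""
    if "type" in schema:
        return schema["type"]
    if validation == "strict":
        raise ValueError("schema omits 'type'; refusing to guess")
    for keyword, inferred in TYPE_HINTS.items():  # first hint wins
        if keyword in schema:
            return inferred
    return "object"  # arbitrary fallback for an unconstrained schema

assert resolve_type({"properties": {"id": {}}}, "lax") == "object"
```

In strict mode a missing `type` is rejected outright (the behaviour bbqsrc describes above); lax mode guesses and falls back arbitrarily, which is exactly why such behaviour should be opt-in.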

@handrews
Member

@tedepstein I think that allowing users to configure type assumptions is a good practical compromise. While it encourages people to write schemas that might behave surprisingly in other environments, for folks who are just using schemas within OAS it's probably better to work with what people are probably going to do anyway.

I would default to strict conformance, if for no other reason than to make users think about it for a half-second and consider interoperability, but as you note there are quite a few ways to deterministically infer type.

@silkentrance

silkentrance commented Aug 18, 2018

@tedepstein In my understanding, the OAS' schema should be, and considering Swagger, is, JSON schema.

meta-4 (JSON Schema): OAS schema defined using JSON Schema
meta-3 (OAS Schema): Custom API specifications
meta-2 (JSON Schemas): Custom API derived JSON schemas
meta-1 (Generated Code): Types, Interfaces...
meta-0 (Instance): instances of types, requests, responses and what not

Code-generation-wise, code is generated based on the meta-3 level, whilst the underlying client/service framework utilises the JSON Schema at the meta-2 level for validating requests/responses before they ever get de/serialised into real objects of the target language or for transmission over the wire.

As I see it, neither the existing framework nor the existing code generators add the extra meta-2 level, or when they do, they will encode it in the generated classes.
But, at least for Java, I do not see such behaviour, except for maybe JavaSpring, where the beans need to be validated; but again, this is not validation at the JSON Schema level. As such, the JSON Schema that should hold governance over everything is never applied.
And I do not see it in the existing Swagger-provided framework, at least for Java, either.

The latter is the problem here, as the framework will need to make best guesses on the actual type, which may or may not be defined by meta-3. For XML it is rather simple, either it is a simple type, and when there is no xsi:type information, it is xs:string, or it is an element with arbitrary content.

In effect, the user will get a generic JsonNode type (Jackson, Java), in no way being able to derive any type information from it, e.g. whether it is a MyPreviousObject or a simple string or whatever, basically having to second-guess what type it is by himself or herself; or, considering for example OpenProject and other such solutions, they will introduce a type discriminator that will allow them to discern the actual type, using yet another framework for doing so.

As for the missing type information, this will impose extra work on the framework, which, while realising the meta-2 level, will now have to undergo a trial-and-error phase where it tries to validate whether a given parameter or payload matches any of the available meta-3 declared parameter / content / object / type schemas.

In conclusion, I strongly believe that at the OAS meta-3 level there must always be type information, even when targeting Javascript clients or NodeJS.

@silkentrance

silkentrance commented Aug 18, 2018

In addition, the missing type information could also be compensated for by either oneOf or allOf. But at least one of type, oneOf, or allOf should always be present; or, failing that, type should default to string or object, depending on the information in the specification, e.g. properties available? Make it an object.

@silkentrance

silkentrance commented Aug 18, 2018

@handrews leaving out the type information or including it has nothing to do with adhering to strict conformance. Whether it is optional (which I strongly believe it must not be, as that is very specific to untyped dynamic languages) or mandatory in the form of either type, oneOf, or allOf (including type as an extension) is not the question.

It is about OAS targeting concrete consumers of that specification, which are basically user implemented clients and services, using either strongly typed or laxly typed languages.

JSON Schema, on the other hand is an altogether different beast, if you don't mind me saying so.
It resides on the meta-5 level, as it is the schema of the OAS. It also interferes at the meta-2 level, where we are talking about schemas derived from applied OAS-based specifications that define APIs, requests, responses, and the objects and types involved at runtime, handling both requests and responses; but not so much at the meta-1 level, where we are talking about generated code, i.e. request objects, response objects, delegates, and types and primitives thereof. Here, the user is solely responsible for processing incoming data, once it has been validated by the framework, and for returning responses that will then also be validated by the framework.

@tedepstein
Contributor

@silkentrance ,

In my understanding, the OAS' schema should be, and considering Swagger, is, JSON schema.

What I'm saying about OAS and JSON Schema is something I've seen stated here, and repeated in several other contexts. There is no official JSON Schema for OAS3 as of yet, and to quote @webron:

Relying solely on the schema for validation is not good enough. We will publish a schema, but it is not the source of truth. JSON Schema simply cannot fully validate the spec.

@darrelmiller also wrote:

As JSON Schema cannot fully validate an OpenAPI document we are still not sure how best to include and describe the role of a JSON Schema. Calling it an "official" JSON Schema has the risk of making false promises.

All this is to say that OpenAPI is not defined in terms of a JSON Schema. It's defined by the written, human-readable specification.

Anyway, I think this is incidental to the discussion of whether type should be required or optional.

@tedepstein
Contributor

@silkentrance ,

meta-4 (OAS Schema Schema): OAS schema defined using JSON Schema
meta-3 (OAS schema): Custom API specifications
meta-2 (JSON Schemas): Custom API derived JSON schemas
meta-1 (Generated Code): Types, Interfaces...
meta-0 (instance): instances of types, requests, responses and what not

This is one possible way of defining layers in a solution stack, but it's not the only way. There are many cases where OAS is not involved at all in the implementation of clients or servers; it's used separately to produce API documentation used by a client developer.

As I see it, neither the existing framework nor the existing code generators add the extra meta-2 level, or when they do, they will encode it in the generated classes.

Sorry, which framework, and which code generators? There are many. If you're talking about Swagger-Codegen, Swagger-Core, etc., those are separate implementation projects, and the implementers there made their own decisions about validation.

But, at least for Java, I do not see such behaviour, except for maybe JavaSpring, where the beans need to be validated; but again, this is not validation at the JSON Schema level. As such, the JSON Schema that should hold governance over everything is never applied.
And I do not see it in the existing Swagger-provided framework, at least for Java, either.

If those implementations don't include message validation, it could just be that they never got around to it, or didn't think it was an essential feature. I would not assume that they were put off by the fact that type is optional.

The latter is the problem here, as the framework will need to make best guesses on the actual type, which may or may not be defined by meta-3. For XML it is rather simple, either it is a simple type, and when there is no xsi:type information, it is xs:string, or it is an element with arbitrary content.

In effect, the user will get a generic JSONObject type, in no way being able to derive any type information from it, e.g. on whether it is a MyPreviousObject or a simple string or whatever, basically having to second guess what type it is by himself or herself, or considering for example OpenProject and other such solutions, they will introduce a type discriminator that will allow them to discern the actual type using yet another framework for doing so.

I think this is all a way of saying that OpenAPI allows varying degrees of precision and completeness, and not every OpenAPI document is suitable as an input for a code generator or an interpretive runtime framework.

Those generators or frameworks can try to infer missing type information, based on other properties. But clearly it's extra work to implement this inference, and it's not guaranteed to produce the right behavior.

@tedepstein
Contributor

tedepstein commented Aug 19, 2018

I don't think I would support a proposal to make type required, even though it would make my life easier as a tool provider.

Aside from being a breaking change, it would limit the use of OpenAPI to describe some APIs in the wild that actually do allow more than one data type for certain parameters or properties. I don't think this is good API design, but OpenAPI should be able to describe bad APIs as well as good ones.

That said, optional typing is a legitimate challenge for code generators and implementation frameworks. So we can consider a range of possible ways to deal with this:

  1. Leave it entirely up to the generators and frameworks. Let them do their own validation, and either reject OAS documents with missing type, or use type inference and default behaviors, along with appropriate configuration options and log information. This is what we're doing today.
  2. Lean on tool providers to facilitate creation of implementation-ready OpenAPI specs using extra validations (or other feedback), and to make these features highly discoverable so API designers are encouraged to use them. Further to this:
    • Explore JSON Schema Vocabularies. This is a planned set of features that I have only vague knowledge of, as I haven't read the draft proposal. I think of vocabularies as something similar to UML profiles, but I'm not sure how close they are. Maybe it will be possible to create an "API Implementation Vocabulary" that makes type required, among other things.
    • Standardize the Implementation Profile using JSON Schema vocabularies, or some other profiling mechanism. Converge on an agreed set of constraints (and maybe extensions) to be defined under OpenAPI itself, or as a separate, related project.
  3. Use SHOULD or RECOMMENDED in the OAS3 spec to indicate that type should be specified, wherever practical, to confer a minimum level of precision required by a broad set of downstream consumers, including code generators, frameworks, etc.
    • This would be a non-breaking change, but would allow our editors (and other commercial and open source editors) to flag missing type as a warning, while still remaining in full conformance to the OpenAPI specification.
    • I think a guideline like this could also be considered benign with respect to the planned alignment of OAS with JSON Schema. But that might need further discussion.

@silkentrance

Those generators or frameworks can try to infer missing type information, based on other properties. But clearly it's extra work to implement this inference, and it's not guaranteed to produce the right behavior.

This should never be the case. The behaviour of the system must always be predictable.

@handrews
Member

@tedepstein there are two aspects of JSON Schema vocabularies (which I'm still in the middle of writing up as it's a series of non-trivial PRs to the draft and I've been in summer vacation mode for a while) which can help here.

One is the obvious part- adding a set of new keywords as a vocabulary that you can declare that you are using. This is not unlike declaring either the validation or hyper-schema "vocabularies" that informally exist today, which is done with $schema.

This part will let us define a code generation vocabulary with new keywords, e.g. baseClass and childOf for flagging an allOf as implementing inheritance rather than simply ANDing the schemas without regard to order.

The less obvious part is being able to easily customize meta-schemas, as declaring your vocabulary and declaring your meta-schema can be done separately (currently, all you get is $schema, so as soon as you customize your meta-schema, tools can no longer tell that you're using standard validation or hyper-schema).

Custom meta-schemas will probably allow defining additional keywords as a kind of light-weight anonymous vocabulary, but the main point of them will be to allow for more restrictive meta-schemas. Such a meta-schema could make type required.

A code generation tool could easily apply such a strict meta-schema to a schema intended for use by the tool, and reject it if it does not pass. This is the appropriate place for such a tighter check, not the OAS specification itself.
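As a rough illustration of the kind of pre-check such a tool could run (a simplified stand-in for a real strict meta-schema validation; only a few applicator keywords are traversed here):

```python
# Simplified stand-in for enforcing a stricter meta-schema in a code
# generation tool: recursively flag subschemas that omit `type`.
# A real meta-schema check would cover many more keywords.

def check_types_present(schema, path="#"):
    """Yield JSON-pointer-ish paths of subschemas missing `type`."""
    if not isinstance(schema, dict):
        return
    if "type" not in schema and "$ref" not in schema:
        yield path
    for name, sub in schema.get("properties", {}).items():
        yield from check_types_present(sub, f"{path}/properties/{name}")
    if "items" in schema:
        yield from check_types_present(schema["items"], f"{path}/items")
    for key in ("allOf", "oneOf", "anyOf"):
        for i, sub in enumerate(schema.get(key, [])):
            yield from check_types_present(sub, f"{path}/{key}/{i}")

schema = {"type": "object",
          "properties": {"name": {}, "age": {"type": "integer"}}}
assert list(check_types_present(schema)) == ["#/properties/name"]
```

A tool would reject the schema (or warn) whenever this yields any paths, which is the "tighter check in the tool, not the OAS spec" approach described above.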

@tedepstein
Contributor

@handrews, any thought of allowing a vocabulary to add constraints, such as making type required?

@handrews
Member

@tedepstein you can add constraints with a custom meta-schema, although the distinction there is a bit fuzzy as I haven't finished nailing it all down.

For the most part, vocabularies should add keywords (including keywords in objects like the LDO when appropriate), and not restrict other vocabularies.

But there is already a way to restrict existing vocabularies with custom meta-schemas, and we will make that easier. Currently, you only get one URI to indicate your meta-schema, which is effectively your vocabulary. If that meta-schema is using allOf to add constraints to a standard meta-schema, there's no reliable way to recognize that right now.

Instead, meta-schemas will declare what vocabularies they follow, and then will be able to add further validation rules which can constrain those vocabularies. It might look something like this (I don't even know if this is the same syntax as in the last example I came up with, so take it with a grain of salt).

Bear with me here- it's a bit messy as we're still nailing down some tricky details. Note that it's relatively rare for people to write meta-schemas compared to schemas, so a little complexity is more acceptable here than in, say, object validation.

```json
{
    "$schema": "https://example.com/meta-schema#",
    ...
}
{
    "$schema": "https://json-schema.org/draft-08/schema#",
    "$id": "https://example.com/meta-schema",
    "$vocabularies": [
        "https://json-schema.org/draft-08/vocabularies/schema",
        "https://json-schema.org/draft-08/vocabularies/code-gen"
    ],
    "$recursiveRoot": true,
    "required": ["type"],
    "allOf": [
        {"$ref": "https://json-schema.org/draft-08/schema"},
        {"$ref": "https://json-schema.org/draft-08/code-gen"}
    ]
}
```

What this says is that anything using this meta-schema is:

  • Using keyword semantics defined in the "schema" (core + validation) and "code-gen" (we're working on it, at least theoretically) vocabularies. This is what $vocabularies indicates- just keyword semantics, not validation of those keywords

  • Additionally, it must be valid against both of the individual vocabulary meta-schemas, which are included with allOf. It's conceivable that someone might want to declare conformance with multiple vocabulary semantics, but only allow each object to use one set of keywords or the other. Although that would be trickier than it sounds; we're still figuring out how much flexibility this part really needs, and we might end up saying that $vocabularies just ANDs in all of the relevant meta-schemas.

  • The required should be fairly obvious. The $recursiveRoot is what allows you to just set the required in the meta-schema's root schema object. Due to another keyword ($recursiveRef) in the draft-08 meta-schemas, that required will get applied anywhere a schema object is referenced.

None of this is in the spec yet, although it's all just about ready for PRs as soon as I have a chance.

My view is that by making it easy to implement restrictions in custom meta-schemas, which are how you declare your use of vocabulary combinations anyway, we will discourage vocabulary authors interfering with each other's keywords. Notably, the spec will declare that the behavior of vocabularies that define incompatible semantics for the same keyword is undefined. Don't do it.

@tedepstein
Contributor

Thanks, @handrews . I will probably need to come back to this later, as I don't have time to fully process it now.

For the most part, vocabularies should add keywords (including keywords in objects like the LDO when appropriate), and not restrict other vocabularies.
...
My view is that by making it easy to implement restrictions in custom meta-schemas, which are how you declare your use of vocabulary combinations anyway, we will discourage vocabulary authors interfering with each other's keywords. Notably, the spec will declare that the behavior of vocabularies that define incompatible semantics for the same keyword is undefined. Don't do it.

There's some interesting prior art you might want to consider here, with UML profiles. (This may or may not be your idea of a good reference point, but profiles have been widely implemented by tool providers, and used in some demanding applications.)

UML profiles can extend other profiles, and IIUC those extensions can include additional constraints. As long as there are no cycles in the extension graph, the constraints are cumulative, and there's no conflict.

If it's OK with you, I'd like to leave this as a placeholder for a future discussion.

@silkentrance

@handrews @tedepstein Regarding JSON Schema vocabularies and UML profiles, I think this would best be discussed in the JSON Schema project. I am with @tedepstein here; however, the current approach targeted by JSON Schema is more or less an extension to the schema itself, whereas UML profiles, for our purpose that is, simply add configurations as in vendor extensions, say x-codegen-single-inheritance-only: <boolean>, and so on.

@webron
Member

webron commented Aug 24, 2018

This is an interesting discussion, and the information is valuable.

We have a constant challenge when it comes to defining the spec when we consider the implications. There are both consumers and producers to keep in mind, and various tools that work around the API life cycle, development languages and so on.

I doubt we're ever going to make type mandatory, for a few reasons:

  • Some APIs actually support 'any' input, and we want to allow the producers to describe such APIs. The fact that this imposes challenges on producers in certain development languages is actually the producer's concern. They may simply not care about strongly typed languages, and it's not up to us to force them to care.
  • In OAS3 we also introduced support for oneOf and anyOf (okay, and not) - and those also add challenges to strongly typed languages. However, there's a reasonable demand for it.
  • We have a draft feature that would allow users to use any schema format they choose (any version of JSON Schema, XSD, protobuf and so on). Once that feature is fully integrated with the spec, we won't have the option to restrict and change how these formats are used - so it's a lost battle.

As we end up saying... we're never going to make everyone happy.

@bbqsrc
Author

bbqsrc commented Aug 24, 2018

Some APIs actually support 'any' input, and we want to allow the producers to describe such APIs. The fact that this imposes challenges on producers in certain development languages is actually the producer's concern. They may simply not care about strongly typed languages, and it's not up to us to force them to care.

Any can be represented in most strongly typed languages. Less strongly typed languages tend to let you represent anything you want as an Any or Object type and it can be interpreted at the consumer's leisure. More strongly typed languages tend to have an enum or tagged union type that can represent this without issue. If one assumes that "any" can only mean the JSON spec's small group of options, a JSON type can be implemented very quickly and solve that problem in a very typesafe manner. This is what I have done in Swift.
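A rough Python rendering of that idea, for illustration only (bbqsrc's implementation was in Swift): since JSON values form a closed set, "any" can be modelled as a tagged union of known alternatives rather than an opaque Object.

```python
# Illustrative sketch: "any JSON value" as a closed set of alternatives.
# JSONValue is a recursive type alias for annotations; the runtime
# check below mirrors the same closed set.
from typing import Union

JSONValue = Union[None, bool, int, float, str,
                  list["JSONValue"], dict[str, "JSONValue"]]

def is_json_value(v) -> bool:
    """Runtime membership check for the closed set of JSON types."""
    if v is None or isinstance(v, (bool, int, float, str)):
        return True
    if isinstance(v, list):
        return all(is_json_value(x) for x in v)
    if isinstance(v, dict):
        return all(isinstance(k, str) and is_json_value(x)
                   for k, x in v.items())
    return False

assert is_json_value({"a": [1, "x", None, True]})
assert not is_json_value({"a": {1, 2, 3}})  # a set is not a JSON value
```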

In OAS3 we also introduced support for oneOf and anyOf (okay, and not) - and those also add challenges to strongly typed languages. However, there's a reasonable demand for it.

Much the same as above: I haven't had trouble representing oneOf or anyOf in any strongly typed language I've targeted. Discriminators are ultimately an interface with a single field, and you can match on a closed set of types to discriminate between the possible objects. In some languages you get a loosely connected pile of classes linked with a helper function, whereas in stronger ones you can use a proper tagged union.
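A sketch of that discriminator-dispatch pattern; the `petType` field and the Cat/Dog classes are invented for illustration.

```python
# Invented Cat/Dog example of discriminator dispatch: one discriminator
# field selects from a closed set of payload types.
from dataclasses import dataclass

@dataclass
class Cat:
    name: str

@dataclass
class Dog:
    name: str

DISCRIMINATOR_MAP = {"cat": Cat, "dog": Dog}  # the closed set of variants

def parse_pet(payload: dict):
    """Dispatch on the `petType` discriminator field."""
    try:
        cls = DISCRIMINATOR_MAP[payload["petType"]]
    except KeyError:
        raise ValueError(f"unknown or missing petType: {payload!r}")
    return cls(name=payload["name"])

assert parse_pet({"petType": "cat", "name": "Misty"}) == Cat(name="Misty")
```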

We have a draft feature that would allow users to use any schema format they choose (any version of JSON Schema, XSD, protobuf and so on). Once that feature is fully integrated with the spec, we won't have the option to restrict and change how these formats are used - so it's a lost battle.

We already have the practical reality of XML appearing in some APIs, sometimes alongside JSON APIs within the same document. Implementing this in strongly typed languages is usually not much of a problem either, as you can add enough hints that the parsing tools you glue to it can work out what is meant to happen without any real trouble. Adding the 'complexity' of things like protobuf doesn't make things more challenging beyond having to recognise more MIME types and adding handlers.

@webron
Member

webron commented Aug 24, 2018

@bbqsrc absolutely. I'm not saying there are no solutions, and the statements I made were generally in regard to making type mandatory, or rather not (which is not necessarily related to code generation).

@charlie430

charlie430 commented Jan 18, 2019

  • We have a draft feature that would allow users to use any schema format they choose

@webron Is there somewhere that we can track and follow along on the feature to allow users to use any payload schema format? I'm particularly interested in using protobuf as the payload format.

@tedepstein
Contributor

@charlie430 , I think you're looking for the Alternative Schemas feature proposed in #1532. You can subscribe to notifications on that feature.

There's also an open pull request #1736 currently under review.

@handrews
Member

OK, having slogged through this again, I'm reaching the following conclusions:

  • The answer to the original question is "No, it is not required."
  • For OAS 3.1 we just affirmed compatibility with the latest JSON Schema draft, which is a decisive statement that it is not going to be required, HOWEVER...
  • Moving to the latest JSON Schema draft makes it easy for you to enforce additional schema rules with a custom meta-schema, which is how JSON Schema recommends enforcing such things.
  • If enforcing via a meta-schema is unappealing, the Alternative Schemas proposal (#1532, x-oas-draft-alternativeSchemas) is the right place to look for a solution.

Given all of that, and that discussion stalled a year ago, I'm going to go ahead and close this. If anyone wants to discuss custom meta-schemas for this in OAS 3.1, that's worth a new issue :-)

bombaywalla added a commit to bombaywalla/martian that referenced this issue Feb 13, 2022
If a schema has no "type" key, and ONLY has a "properties" key, then
we can reasonably assume that the type is "object".

See OAI/OpenAPI-Specification#1657

Excerpt:
A particularly common form of this is a schema that omits type, but specifies properties.
Strictly speaking, this does not mean that the value must be an object.
It means that if the value is an object, and it includes any of those properties,
the property values must conform to the corresponding property subschemas.

In reality, this construct almost always means that the user intends type: object,
and I think it would be reasonable for a code generator to assume this,
maybe with a validation: strict|lax config option to control that behavior.
oliyh pushed a commit to oliyh/martian that referenced this issue Feb 15, 2022
* Handle schemas with no type.


* Improve code as per @oliyh's suggestion.
Aniruddh25 pushed a commit to Azure/data-api-builder that referenced this issue Jul 3, 2024
…fix Infragistics AppBuilder tooling (#2283)

## Why make this change?

- Closes #2212, which describes how the missing `"type":"object"`
key/value pair on the response child object schema breaks certain client
tooling, in this case Infragistics AppBuilder.

### Background

I found a relevant thread that discusses whether type is a required
property. Consensus is that type isn't required:

- OAI/OpenAPI-Specification#1657 (OpenAPI-Specification discussion)
- PaloAltoNetworks/docusaurus-openapi-docs#430 (example of how different tooling handles type or missing type differently)

Ultimately, different tooling handles the presence of the type property
differently; some may try to guess the type when not present.

The fact that the type isn't required means that
https://github.com/microsoft/OpenAPI.NET didn't complain about the
missing type. An error, if type were required, would have helped prevent
this becoming an issue in the first place.

## What is this change?

- Adds `"type": "object"` to the openapi document for describing the
response schema:
```json
                "responses": {
                    "200": {
                        "description": "OK",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "type": "object", // <--- This property/value
                                    "properties": {
                                        "value": {
                                            "type": "array",
                                            "items": {
                                                "$ref": "#/components/schemas/Book"
                                            }
                                        },
                                        "nextLink": {
                                            "type": "string"
                                        }
                                    }
                                }
                            }
                        }
                    }
```

## How was this tested?

- [x] Integration Tests
- [ ] Unit Tests

## Sample Request(s)

View generated schema at
```https
GET localhost:5001/api/openapi
```

Co-authored-by: Abhishek  Kumar <[email protected]>
8 participants