# Covariance and contravariance #3803
```crystal
class Foo
end

class Bar < Foo
end

class Baz
  def zzz_array(something : Array(Foo))
    puts something
  end

  def insert_sth(something : Array(Foo))
    something.push(Foo.new)
  end
end

puts Array(Bar).new.is_a?(Array(Foo)) # returns "false"

array = [Bar.new, Bar.new] of Bar
puts Baz.new.zzz_array(array)  # works perfectly fine, as expected
puts Baz.new.insert_sth(array) # including this line causes the program to fail compilation
```

Crystal always intended to mimic the development experience of duck typing while ensuring safety, and the current behavior honors that philosophy. Maybe we already have the best tradeoff for Crystal and just need to be careful in how we teach users why it behaves this way.
---

@mverzilli You are totally right with the above :-) It's like Crystal says "OK, I'll let you pass that". Another scenario where this "I'll only yell at you when you make a real mistake" behavior shows up is this one:

```crystal
def foo(x : Enumerable(Int32))
  x << 4
end

array = [1, 2, 3]
foo(array)
p array # => [1, 2, 3, 4]
```

Sure, an `Enumerable(Int32)` doesn't necessarily respond to `<<`, but the call compiles because the argument's actual type does. As a side note, I wouldn't write such a program (I would probably simply not use a type restriction).
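To make the "real mistake" case concrete: the same call is rejected when the argument's actual type genuinely lacks `<<`. A minimal sketch, assuming a custom `Three` type (illustrative, not from this thread) that includes `Enumerable(Int32)`:

```crystal
class Three
  include Enumerable(Int32)

  def each(&)
    yield 1
    yield 2
    yield 3
  end
end

def foo(x : Enumerable(Int32))
  x << 4
end

foo([1, 2, 3]) # compiles: the actual type Array(Int32) responds to `<<`
foo(Three.new) # including this line fails compilation: Three has no `<<`
```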
---

I think we are missing a point regarding type restrictions: choosing a method overload. The current situation assumes that all type parameters in generics are covariant:

```crystal
class Base; end
class Derived < Base; end

def foo(arr : Array(Base))
  "Array(Base) overload called"
end

def foo(x)
  "No restriction overload called"
end

def bar(arr : Array(Derived))
  "Array(Derived) overload called"
end

def bar(x)
  "No restriction overload called"
end

derived_arr = [Derived.new]
base_arr = [Base.new]

foo(derived_arr) # => Array(Base) overload called
foo(base_arr)    # => Array(Base) overload called
bar(derived_arr) # => Array(Derived) overload called
bar(base_arr)    # => No restriction overload called
```

This might not make sense for some classes (C# has a nice list). I guess that besides the fact that Crystal will "only yell at you when you make a real mistake" (which might be due for another discussion: though it's really useful when developing apps, it could be a problem when distributing libs), covariance and contravariance do affect dispatch resolution.
---

Those are fair points. I like C#'s approach of adding explicit variance annotations (`in`/`out`). If we go that way, one important question is what the default variance will be: if I don't provide an explicit annotation, is it covariant, contravariant or invariant? Or maybe there's a 4th variance flavor particular to Crystal (let's call it …).
---

The deferred yell should be a topic for another discussion, and be consistent across generic and non-generic types. Covariance and contravariance should be required for deciding which overload to pick, not just for throwing errors (or throwing them later). As for the default variance, I'd go with invariant. My guess is that most generics come from the stdlib, where we can ensure that sensible variances are defined.
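For what it's worth, invariance by default still leaves callers an easy escape hatch: copy into an array of the base type. A minimal sketch (the `takes_foos` method is illustrative, not from this thread):

```crystal
class Foo; end
class Bar < Foo; end

def takes_foos(arr : Array(Foo))
  arr.size
end

bars = [Bar.new, Bar.new]

# An explicit copy produces a genuine Array(Foo); mutating the copy
# can never corrupt the original Array(Bar), so this stays safe even
# under strict invariance.
foos = bars.map &.as(Foo)
foos << Foo.new

p takes_foos(foos) # => 3
```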
---

Implementing …
---

It also, for me at least, feels like going away from the fun part of Crystal where you don't have to explicitly define types most of the time. Starting to do …
---

I have an idea from Java that combines a type var into the type restriction (modified from @mverzilli's example):

```crystal
class Foo; end
class Bar < Foo; end

class Baz
  def zzz_array(something : Array(T < Foo)) # notice: this declares a type T which inherits from Foo
    puts something
  end

  def insert_sth(something : Array(Foo))
    something.push(Foo.new)
  end
end

puts Array(Bar).new.is_a?(Array(Foo))     # false
puts Array(Bar).new.is_a?(Array(T < Foo)) # true: Array(Bar) matches Array(T < Foo)

array = [Bar.new, Bar.new] of Bar
puts Baz.new.zzz_array(array)  # passes: Array(Bar) matches Array(T < Foo)
puts Baz.new.insert_sth(array) # fails: Array(Bar) is not an Array(Foo)
```

I don't know if this is good or bad, and maybe add … Or maybe just use … BTW, this cannot avoid the problem discussed above, but I think this should be controlled by the developer, not the language.
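For comparison, the closest thing in today's Crystal is an unbounded free variable, which matches any `Array` whatsoever — the `T < Foo` bound is exactly what's missing. A minimal sketch:

```crystal
class Foo; end
class Bar < Foo; end

# An unbounded free variable matches any Array.
def zzz_array(something : Array(T)) forall T
  puts something.size
end

zzz_array([Bar.new]) # matches, with T = Bar
zzz_array([1, 2, 3]) # also matches, with T = Int32 — nothing enforces T < Foo
```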
---

@bararchy this only applies when you choose to define types: the question is how to decide which method overload to choose when parameters are generics. If you never specify types, then you don't have overloads, and this doesn't apply. @david50407 personally I like the idea. I didn't know it existed in Java; I knew those annotations from C# (via the …).
---

Sorry for the duplicate. I didn't find this issue and did not think of covariance as the reason, nor did Gitter point to this. While the current behavior is really nice and saves unnecessary conversions, it is also really counterintuitive that type restrictions don't mean the same thing for variables and for method signatures, and that type differentiation through overloading works differently from `is_a?`/`case`. My 2 cents to move this further: I think some sort of notation to mark expected types as co- or contravariant would be great.
---

@straight-shoota sorry for the confusion. I meant that I'd use that restriction when defining a type or function, not when using it. Using the Comparer example from C#: the Comparer itself is defined as …
---

Maybe just go with "always invariant"?

EDIT to say this is somewhat a duplicate of what is written in the Kotlin doc already.
---

IMO we should first proceed with supporting either bounded restrictions (#6997 (comment)) or bounded free vars (#6997 (comment)), so that a transition path exists from non-strict overload resolution to the strict mode. Then later we could issue a warning whenever an argument matches a generic type arg restriction in the non-strict manner. Something like the following:

```crystal
def foo(x : Array(Int32 | String))
end

# suppose this is added in 1.1
def bar(x : Array(T)) forall T <= Int32 | String
end

# 1.0: okay
# 1.1: okay
# 1.2: warning: non-strict generic type argument matched
foo([1])

# 1.1: okay
# 1.2: okay
bar([1])
```

But how should we proceed afterwards in accordance with 1.x's backward compatibility guarantees? When do we drop the non-strict semantics completely? My main concern is that as long as non-strict matching stays, other semantic aspects like overload ordering must be consistent with non-strict matching, so we can't do anything to them unless there is some kind of compile-time option to completely disable non-strict matching and/or turn the above warning into an error.
---

Yeah, figuring out a decent migration path is going to be trouble. But I think we first need to figure out where we want to go, rather than how. It's definitely good to keep the how in mind to ensure it's practical, but IMO we're currently still at an early design stage where we shouldn't worry too much about the final integration process.
---

Personally I want type restrictions to be enforced by the compiler independently of the method code, and in a type-safe way.

**Type restrictions**

```crystal
def foo(x : Enumerable(Int32))
  x << 4
end
```

This code is just plain wrong. The type restriction here should disable any duck typing. I want an … The compiler should complain the same way it complains with an exhaustive case on abstract types. The same argument that "someone else might add a subclass" applies here: someone might actually give this method an …
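The exhaustive-case analogy, shown here over a union type for simplicity (a sketch; `describe` is an illustrative name): the compiler rejects the code based on the declared type, regardless of what callers currently pass.

```crystal
def describe(x : Int32 | String)
  case x
  in Int32  then "an Int32"
  in String then "a String"
    # removing either `in` branch fails compilation with
    # "case is not exhaustive", even if no caller ever passes that type
  end
end

p describe(1)     # => "an Int32"
p describe("one") # => "a String"
```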
**Generics**

I don't like that, for the same reason as above. I want the compiler to tell me whether my code is OK with the given type restriction, not whether it is OK with the actual implementation. Same argument as the exhaustive case again. Generic types should be invariant. If a method expects an … Maybe we could add this …

**Inheritance**

When using inheritance and overloading a method, parameters must be contravariant and the return type must be covariant. There too it may be a surprise, but it is not so hard to understand.

```crystal
class Foo
  def something
    puts "foo"
  end
end

class Bar < Foo
  def something
    puts "bar"
  end
end

class Parent
  def something(a : Bar) : Foo
    puts "parent"
    Foo.new
  end
end

class Child < Parent
  def something(a : Foo) : Bar
    puts "child"
    Bar.new
  end
end

def f(x : Parent)
  x.something(Bar.new)
end

f(Parent.new) # => parent, ok
f(Child.new)  # => child, ok
```
---

@erdnaxeli I don't follow the significance of the inheritance example for variance.
---

Now that I think about it, due to #8973 the following two defs will not be equivalent:

```crystal
def foo(x : Array(Array))
end

def bar(x : Array(T)) forall T <= Array
end

foo([[1]]) # okay
bar([[1]]) # okay, T = Array(Int32)

foo([[1 || "a"]]) # okay
bar([[1 || "a"]]) # okay, T = Array(Int32 | String)

foo([[1] || ["a"]]) # okay
bar([[1] || ["a"]]) # okay, T = Array(Int32) | Array(String)

foo([[1]] || [["a"]]) # okay
bar([[1]] || [["a"]]) # error, `Array(T)` is never a union
```
---

It seems a bit disproportionate that … Maybe this (already currently observed) behaviour needs some refinement?

```crystal
def bar(x : Array(T)) forall T
end

bar([[1]] || [["a"]])
# Error: no overload matches 'bar' with type (Array(Array(Int32)) | Array(Array(String)))
# Overloads are:
#  - bar(x : Array(T))
# Couldn't find overloads for these types:
#  - bar(x : Array(Array(String)))
```

The error message claims there's no overload …
---

Argh. Obviously in my example …
---

Given a class `Foo` and a subclass `Bar`, and an `array` of `Bar`, we can ask the question: is `array` an `Array(Foo)`?

Two answers are possible:

- Yes, because every time we take a value out of `array` we get a `Bar`, which is a `Foo`.
- No, because we can't put a `Foo` into `array`, only a `Bar` and its subclasses.

Right now the compiler will answer "Yes" for type restrictions, but it will answer "No" for `is_a?`. This is not intuitive and should be fixed somehow.

This also affects union types, aliases and recursive aliases.
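A runnable sketch of the divergence described above (the method name is illustrative):

```crystal
class Foo; end
class Bar < Foo; end

def restriction_says(x : Array(Foo))
  "yes"
end

array = [Bar.new, Bar.new]

# Type restrictions answer "Yes": Array(Bar) is accepted where Array(Foo) is required.
p restriction_says(array) # => "yes"

# is_a? answers "No": generic type arguments are invariant here.
p array.is_a?(Array(Foo)) # => false
```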