String References on Schema produce Attr Error #436

Closed
dfee opened this issue Mar 18, 2017 · 3 comments

dfee (Member) commented Mar 18, 2017

In this example code, I try to relate objects to each other.

import graphene

class A(graphene.ObjectType):
    name = graphene.String()
    to_b = graphene.Field('B')

class B(graphene.ObjectType):
    name = graphene.String()
    to_a = graphene.Field('A')

class Query(graphene.ObjectType):
    get_a = graphene.Field('A')
    get_b = graphene.Field('B')

schema = graphene.Schema(query=Query)

However, I get an AttributeError:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-7-a46ddc63b837> in <module>()
----> 1 schema = graphene.Schema(query=Query)

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/schema.py in __init__(self, query, mutation, subscription, directives, types, auto_camelcase)
     42         )
     43         self._directives = directives
---> 44         self.build_typemap()
     45
     46     def get_query_type(self):

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/schema.py in build_typemap(self)
    103         if self.types:
    104             initial_types += self.types
--> 105         self._type_map = TypeMap(initial_types, auto_camelcase=self.auto_camelcase, schema=self)

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/typemap.py in __init__(self, types, auto_camelcase, schema)
     57         self.auto_camelcase = auto_camelcase
     58         self.schema = schema
---> 59         super(TypeMap, self).__init__(types)
     60
     61     def reducer(self, map, type):

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphql/type/typemap.py in __init__(self, types)
     14     def __init__(self, types):
     15         super(GraphQLTypeMap, self).__init__()
---> 16         self.update(reduce(self.reducer, types, OrderedDict()))
     17         self._possible_type_map = defaultdict(set)
     18

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/typemap.py in reducer(self, map, type)
     65             type = type()
     66         if is_graphene_type(type):
---> 67             return self.graphene_reducer(map, type)
     68         return GraphQLTypeMap.reducer(map, type)
     69

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/typemap.py in graphene_reducer(self, map, type)
     92             internal_type = self.construct_union(map, type)
     93
---> 94         return GraphQLTypeMap.reducer(map, internal_type)
     95
     96     def construct_scalar(self, map, type):

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphql/type/typemap.py in reducer(cls, map, type)
     77
     78         if isinstance(type, (GraphQLObjectType, GraphQLInterfaceType, GraphQLInputObjectType)):
---> 79             field_map = type.fields
     80             type_is_input = isinstance(type, GraphQLInputObjectType)
     81             for field_name, field in field_map.items():

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphql/pyutils/cached_property.py in __get__(self, obj, cls)
     14         if obj is None:
     15             return self
---> 16         value = obj.__dict__[self.func.__name__] = self.func(obj)
     17         return value

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphql/type/definition.py in fields(self)
    179     @cached_property
    180     def fields(self):
--> 181         return define_field_map(self, self._fields)
    182
    183     @cached_property

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphql/type/definition.py in define_field_map(type, field_map)
    188 def define_field_map(type, field_map):
    189     if callable(field_map):
--> 190         field_map = field_map()
    191
    192     assert isinstance(field_map, collections.Mapping) and len(field_map) > 0, (

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/typemap.py in construct_fields_for_type(self, map, type, is_input_type)
    216                 if not field:
    217                     continue
--> 218             map = self.reducer(map, field.type)
    219             field_type = self.get_field_type(map, field.type)
    220             if is_input_type:

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphene/types/typemap.py in reducer(self, map, type)
     66         if is_graphene_type(type):
     67             return self.graphene_reducer(map, type)
---> 68         return GraphQLTypeMap.reducer(map, type)
     69
     70     def graphene_reducer(self, map, type):

/Users/dfee/code/spark/env/lib/python3.6/site-packages/graphql/type/typemap.py in reducer(cls, map, type)
     57             return cls.reducer(map, type.of_type)
     58
---> 59         if type.name in map:
     60             assert map[type.name] == type, (
     61                 'Schema must contain unique named types but contains multiple types named "{}".'

AttributeError: 'str' object has no attribute 'name'

syrusakbary (Member) commented:

String references are no longer available.

You can achieve the same result with:

class A(graphene.ObjectType):
    name = graphene.String()
    to_b = graphene.Field(lambda: B)

class B(graphene.ObjectType):
    name = graphene.String()
    to_a = graphene.Field(lambda: A)

class Query(graphene.ObjectType):
    get_a = graphene.Field(A)
    get_b = graphene.Field(B)

schema = graphene.Schema(query=Query)
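
With the lambda references in place, the schema builds and queries validate against it. As a quick check (a minimal sketch; no resolvers are defined, so the fields simply resolve to null, and with auto_camelcase on by default get_a is exposed as getA):

result = schema.execute('{ getA { name toB { name } } }')
assert not result.errors  # no AttributeError at schema build, and the query is valid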

dfee (Member, Author) commented Mar 18, 2017

So to confirm, the recommended approach to circular references is to import the referenced (or related) ObjectType after it's needed:

# a.py
import graphene

class A(graphene.ObjectType):
    name = graphene.String()
    to_b = graphene.Field(lambda: B)

from .b import B

# b.py
import graphene

class B(graphene.ObjectType):
    name = graphene.String()
    to_a = graphene.Field(lambda: A)

from .a import A

# schema.py
import graphene

from .a import A
from .b import B

class Query(graphene.ObjectType):
    get_a = graphene.Field(A)
    get_b = graphene.Field(B)

schema = graphene.Schema(query=Query)

The other alternative I'm aware of is the method you mentioned here using graphene.LazyType.

I can't seem to find it, but why was support for string references dropped?

Thanks for your help.

syrusakbary (Member) commented Mar 18, 2017

There is also the lazy_import utility function that makes this task easier:

# mymodule/a.py
import graphene

class A(graphene.ObjectType):
    name = graphene.String()
    to_b = graphene.Field(graphene.lazy_import('mymodule.b.B'))

# mymodule/b.py
import graphene

class B(graphene.ObjectType):
    name = graphene.String()
    to_a = graphene.Field(graphene.lazy_import('mymodule.a.A'))
    to_b = graphene.Field(lambda: B)

The main reason string references were dropped is that they were non-deterministic.
Referring to a type A by name only makes sense within the context of a given schema, but the type definitions live outside the schema (for reusability).
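
As a rough illustration (a minimal sketch; the type names here are illustrative): two schemas can each expose a different type under the same GraphQL name, so a bare string like 'Account' identifies nothing by itself, while a lambda pins the reference to one concrete Python class:

import graphene

class Account(graphene.ObjectType):  # "Account" as a billing schema sees it
    balance = graphene.Float()

class BillingQuery(graphene.ObjectType):
    account = graphene.Field(lambda: Account)

billing_schema = graphene.Schema(query=BillingQuery)

class UserAccount(graphene.ObjectType):  # a different type, same GraphQL name
    class Meta:
        name = 'Account'
    email = graphene.String()

class UsersQuery(graphene.ObjectType):
    account = graphene.Field(lambda: UserAccount)

users_schema = graphene.Schema(query=UsersQuery)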

Hope this makes sense!

paxswill added a commit to paxswill/evesrp that referenced this issue May 25, 2017
Undocumented except in some issues (graphql-python/graphene#110 and graphql-python/graphene#436). The old mechanism worked in Python 3 but not in Python 2.