Think we're gonna have to serialize each of the items independently and stitch them together.
Pydantic doesn't like models that reference each other, even when they have JSON serializations that handle that - the mere presence of a repeated object ID makes the JSON serialization choke. Sometimes that can be rescued with a `model_serializer(mode="wrap")`, but other times doing so causes the next dump attempt to fail.
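For concreteness, here's a minimal sketch of the wrap-serializer rescue. The `Group`/`Dataset` models and field names are hypothetical stand-ins, not the real generated classes: the back-reference is excluded from the default dump and re-added as a bare object ID, so the repeated object never re-enters the serializer.

```python
from typing import Optional

from pydantic import BaseModel, Field, SerializerFunctionWrapHandler, model_serializer


class Dataset(BaseModel):
    # hypothetical stand-in model, not the real generated class
    object_id: str
    # excluded so handler(self) below never walks back up into the parent
    parent: Optional["Group"] = Field(default=None, exclude=True)

    @model_serializer(mode="wrap")
    def _ser(self, handler: SerializerFunctionWrapHandler):
        data = handler(self)  # default serialization; parent is already excluded
        if self.parent is not None:
            data["parent"] = self.parent.object_id  # flatten the reference to an ID
        return data


class Group(BaseModel):
    object_id: str
    datasets: list[Dataset] = Field(default_factory=list)


Dataset.model_rebuild()

group = Group(object_id="g1")
group.datasets.append(Dataset(object_id="d1", parent=group))
print(group.model_dump_json())
# {"object_id":"g1","datasets":[{"object_id":"d1","parent":"g1"}]}
```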
Some of these are genuinely cyclical recursion errors, but some of them (like object ID columns, which never reference other objects) are getting independent cyclical errors of their own (not the error from the parent class that may have already been seen, but an additional one). May be related to: pydantic/pydantic#9670
We could just... load objects twice. There isn't really a reason to only load a single object except that it simplifies writing if, e.g., incompatible edits are made to the attributes of the referenced object (the array should be fine, since it refers to the same underlying HDF5 dataset).
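As a toy sketch of what "load it twice" would look like, purely illustrative: `ReferencedObject`, the attribute names, and the module-level cache are made up here, not the real loader.

```python
from typing import Dict

import h5py
from pydantic import BaseModel


class ReferencedObject(BaseModel):
    # hypothetical stand-in for whatever model wraps a referenced group/dataset
    object_id: str
    path: str


_cache: Dict[str, ReferencedObject] = {}


def load_shared(h5f: h5py.File, path: str) -> ReferencedObject:
    # current approach: one python object per HDF5 path, so every referrer
    # holds the same instance, which is what trips the repeated-object-ID errors
    if path not in _cache:
        _cache[path] = ReferencedObject(
            object_id=str(h5f[path].attrs["object_id"]), path=path
        )
    return _cache[path]


def load_fresh(h5f: h5py.File, path: str) -> ReferencedObject:
    # "load twice" approach: a new instance per reference; duplicates in memory,
    # but each referrer serializes independently, and the underlying HDF5
    # dataset on disk is still the same one
    return ReferencedObject(
        object_id=str(h5f[path].attrs["object_id"]), path=path
    )
```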