refactor: pydantic lite rewrite #154
Conversation
Codecov Report
```diff
@@            Coverage Diff            @@
##             main     #154       +/- ##
==========================================
+ Coverage   43.46%   56.55%   +13.09%
==========================================
  Files          31       32        +1
  Lines        3978     4238      +260
==========================================
+ Hits         1729     2397      +668
+ Misses       2249     1841      -408
```
Continue to review full report at Codecov.
remove unused imports
feat: added ieeg session to jsonable
feat: serialize connectivity feature
chore: unify receptor test
chore: add typing to features.query
Without calling super().__init_subclass__(), the other superclass's __init_subclass__ would not be called. In principle, *args and **kwargs should also be added to the __init_subclass__ method.
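A minimal sketch of the point being made (class names here are hypothetical, not siibra's actual classes): when a class inherits from two mixins that each define `__init_subclass__`, the hook of the first mixin in the MRO must forward via `super().__init_subclass__()`, and accepting `**kwargs` lets class keyword arguments propagate along the chain.

```python
class FeatureMixin:
    registry = []

    def __init_subclass__(cls, **kwargs):
        # Forward to the next class in the MRO; without this call,
        # SerializableMixin's hook below would silently be skipped.
        super().__init_subclass__(**kwargs)
        cls.registry.append(cls.__name__)


class SerializableMixin:
    serializable = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.serializable.append(cls.__name__)


class ConnectivityFeature(FeatureMixin, SerializableMixin):
    pass


# Both hooks ran for the new subclass:
print(FeatureMixin.registry)          # ['ConnectivityFeature']
print(SerializableMixin.serializable) # ['ConnectivityFeature']
```

If `FeatureMixin` dropped the `super()` call, only `registry` would be populated, which is exactly the bug the comment warns about.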
chore: add id to connectivity matrix model
chore: update NpArray to allow for kwarg init
@xgui3783 Changes in 176 files are a bit much; the GitHub diff viewer can't show them all without crashing (at least in Safari, need to check this in another browser).
Most are statically generated pydantic models (in siibra/openminds). Is there anything specific I should look out for? Is there maybe a test I could run locally to check whether the output JSONs are openMINDS conform? I have implemented some tests, for example:

```python
import siibra

jba29 = siibra.parcellations['2.9']
jba29_model = jba29.to_model()
list_jba29_vers = [jba29_ver.dict() for jba29_ver in jba29_model.brain_atlas_versions]
```

I will see if I can hide changes to specific directories.
@skoehnen I added a; can you let me know if the PR still crashes the browser?
Weird, how come the tests suddenly fail?
Looks like a problem in test_receptor_to_model in test_receptors.py |
FAILED test/features/test_receptors.py::test_receptor_to_model[receptor57]
Needs to be fixed.
fix: np.int and np.float deprecation
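A sketch of what this kind of fix typically looks like (the values and variable names below are illustrative, not taken from the PR): the `np.int` and `np.float` aliases were deprecated in NumPy 1.20 and removed in 1.24, and the replacement is the builtin types or explicit sized dtypes.

```python
import numpy as np

values = [1.5, 2.5, 3.0]

# Before (raises AttributeError on NumPy >= 1.24):
#   arr = np.array(values, dtype=np.float)
#   idx = np.array([0, 2], dtype=np.int)

# After: use the builtins (or np.float64 / np.int64 for explicit widths).
arr = np.array(values, dtype=float)
idx = np.array([0, 2], dtype=int)

print(arr[idx].tolist())  # [1.5, 3.0]
```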
Seems to be sporadically caused by being unable to fetch KG metadata. Rerunning the test resolved the issue.
Okay, that is good, at least in the sense that it is not an error on our side ;)
No, it didn't change anything here. But I guess new PRs will not show the generated files. I fixed it for me by using "Jump To" and looking at specific files and then reloading ;)
Green across the board, looks good to me. I will take a deeper look into the generated models and their "openMINDS conformity" when I am finished with the curation migration.
Seems fine to me