Commit: preliminary commit
Preliminary commit to use the scverse-airr structure, as per #261.

TODO:

- wherever *.data is mentioned in dandelion, it needs to handle the awkward arrays
- create a slot/function that stores the cell_id
- update the tutorials
- create new to_scirpy/from_scirpy functions that transfer the .obs, .obsm and .uns slots
- create to_ak/from_ak functions so that the rearrangement data can still be repopulated as a pandas DataFrame
- it would be cool to also make this dask-compatible; the only issue is that dask.dataframe.DataFrame.compute always seems slow, so I need to learn more about dask. vaex is also cool but not really suited to contig-to-cell-level wrangling.
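The to_ak/from_ak round trip described above can be sketched without the awkward library itself (the function names and field values here are hypothetical, not dandelion's actual API): grouping flat per-contig AIRR rows by cell_id gives the nested cells-by-chains layout an awkward array would hold, and flattening back out lets the rearrangement table be repopulated as a pandas DataFrame.

```python
from collections import defaultdict


def contigs_to_cells(rows):
    """Group flat AIRR rearrangement rows (one dict per contig) into
    per-cell chain lists -- the nested layout an awkward array would hold."""
    cells = defaultdict(list)
    for row in rows:
        chain = {k: v for k, v in row.items() if k != "cell_id"}
        cells[row["cell_id"]].append(chain)
    return dict(cells)


def cells_to_contigs(cells):
    """Flatten the per-cell layout back into per-contig rows, re-attaching
    cell_id so the result can seed a pandas DataFrame again."""
    return [
        dict(chain, cell_id=cell_id)
        for cell_id, chains in cells.items()
        for chain in chains
    ]


# Toy rearrangement rows: two chains for one cell, one chain for another.
rows = [
    {"cell_id": "AAACCTG", "locus": "IGH", "junction_aa": "CARDYW"},
    {"cell_id": "AAACCTG", "locus": "IGK", "junction_aa": "CQQYNF"},
    {"cell_id": "GGGTTAC", "locus": "IGH", "junction_aa": "CTRGGF"},
]
nested = contigs_to_cells(rows)  # cell_id -> list of chain records
flat = cells_to_contigs(nested)  # round trip back to per-contig rows
```

The real implementation would swap the plain dicts for awkward record arrays (and the flat list for a pandas DataFrame), but the grouping/flattening logic is the same.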
zktuong committed May 9, 2023
1 parent 8a50b91 commit d5fb09f
Showing 3 changed files with 952 additions and 230 deletions.
6 changes: 3 additions & 3 deletions dandelion/tools/_tools.py
@@ -634,7 +634,7 @@ def find_clones(
             layout=layout_,
             graph=graph_,
         )
-        vdj_data.update_metadata(reinitialize=True)
+        vdj_data.update_metadata(initialize=True)
     elif ("clone_id" in vdj_data.data.columns) and (key_added is not None):
         vdj_data.__init__(
             data=dat_,
@@ -643,7 +643,7 @@ def find_clones(
             graph=graph_,
         )
         vdj_data.update_metadata(
-            reinitialize=True,
+            initialize=True,
             clone_key="clone_id",
             retrieve=clone_key,
             retrieve_mode="merge and unique only",
@@ -656,7 +656,7 @@ def find_clones(
             graph=graph_,
             clone_key=clone_key,
         )
-        vdj_data.update_metadata(reinitialize=True, clone_key=clone_key)
+        vdj_data.update_metadata(initialize=True, clone_key=clone_key)
         vdj_data.threshold = threshold_

     else:
