
couldn't run drcme.bin.run_spca_fit #21

Open
hongruhu opened this issue Aug 3, 2021 · 5 comments

Comments

hongruhu commented Aug 3, 2021

hi, I installed the drcme package using
pip install git+git://github.com/AllenInstitute/drcme.git and ran
python -m drcme.bin.run_spca_fit --input_json my_spca_input.json

it failed with Error while finding module specification for 'drcme.bin.run_spca_fit' (ModuleNotFoundError: No module named 'drcme.bin')

so I was wondering whether this happened because of the dependency conflicts shown below from the pip installation?

thanks

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
ipfx 1.0.4 requires marshmallow==3.0.0rc6, but you have marshmallow 2.21.0 which is incompatible.
allensdk 2.9.0 requires marshmallow==3.0.0rc6, but you have marshmallow 2.21.0 which is incompatible.
gouwens (Collaborator) commented Aug 3, 2021

My guess is that drcme isn't actually installed. I don't think the marshmallow version issue should keep you from installing it, though I have typically installed it after the dependencies ipfx & allensdk are already present, so it may be different for you. If you're using Anaconda to manage your environment, you can check what the output of conda list is to see what modules (and versions) are present on your system.

Can you see if you actually have drcme installed by trying to import drcme from a Python prompt (and if that works, make sure that it's pointing to the right place by checking drcme.__path__)?
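The check above can be sketched with the standard library's importlib (illustrated here on a stdlib module, since drcme may not be installed in your environment; for drcme you would pass "drcme" and "drcme.bin"):

```python
import importlib.util

def module_location(name):
    """Return the path a module would be imported from, or None if not installed."""
    spec = importlib.util.find_spec(name)
    if spec is None:
        return None
    # Packages report their directory via submodule_search_locations;
    # plain modules report a single file via spec.origin.
    if spec.submodule_search_locations:
        return list(spec.submodule_search_locations)[0]
    return spec.origin

# For the issue above, call module_location("drcme") and then
# module_location("drcme.bin") to see whether the subpackage is visible.
print(module_location("json"))                 # installed: prints its path
print(module_location("no_such_package_xyz"))  # not installed: prints None
```

If `module_location("drcme")` returns a path but `"drcme.bin"` does not, the package is present but the `bin` subpackage was not installed, which matches the ModuleNotFoundError above.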

hongruhu (Author) commented Aug 3, 2021

@gouwens hi, I just tried to import the package locally and I could run the functions in drcme. However, each time I run it in interactive Python using

run_spca_fit.main(params_file="spca_params.json", 
                  output_dir="./output", 
                  output_code="EXAMPLE", 
                  datasets=[{"fv_h5_file":"my_feature_vectors.h5"}])

it always gives me the error KeyError: 'limit_to_cortical_layers'

gouwens (Collaborator) commented Aug 4, 2021

Hmm, I don't know why you could import and run the package locally but not through the command line (unless maybe you are in the directory of the package when running locally, and somehow your Python installation isn't seeing it otherwise?).

The reason your run_spca_fit.main() call is failing is that the datasets parameter needs more information. The argschema object specifies the fields it expects (and fills in the default values when you run it from the command line), so you need to pass everything that isn't marked required = False in that object. Your call will need to look more like:

run_spca_fit.main(params_file="spca_params.json", 
                  output_dir="./output", 
                  output_code="EXAMPLE", 
                  datasets=[{
                      "fv_h5_file":"my_feature_vectors.h5",
                      "metadata_file": None,
                      "dendrite_type": "all",
                      "limit_to_cortical_layers": [],
                  }])
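To see why the shorter call failed, here is a minimal illustrative sketch (not drcme's actual code; the field list is taken from the example call above, and the real schema is defined with argschema) of how indexing a dataset dict raises KeyError for any omitted field:

```python
# Illustrative only: mimics how direct dict indexing fails loudly on a
# missing field. The field names come from the example call above; drcme's
# real validation is done by argschema and may differ.
DATASET_FIELDS = ["fv_h5_file", "metadata_file", "dendrite_type",
                  "limit_to_cortical_layers"]

def read_dataset(dataset):
    # dataset[field] (as opposed to dataset.get(field)) raises KeyError on
    # the first missing field -- the same failure mode seen above.
    return {field: dataset[field] for field in DATASET_FIELDS}

incomplete = {"fv_h5_file": "my_feature_vectors.h5"}
try:
    read_dataset(incomplete)
except KeyError as exc:
    print("missing field:", exc)   # missing field: 'metadata_file'

complete = {"fv_h5_file": "my_feature_vectors.h5", "metadata_file": None,
            "dendrite_type": "all", "limit_to_cortical_layers": []}
print(read_dataset(complete)["dendrite_type"])   # all
```

This is why supplying every non-optional field, as in the corrected call above, makes the error go away even when some values are just empty defaults.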

hongruhu (Author) commented Aug 4, 2021

@gouwens thank you, I got it running. I would like to compute the sparse PCs of the mouse Patch-seq ephys data from the 2020 Cell paper, and I would also want to QC the sweeps and possibly take dendrite type into account. What I have done is download all the NWB files from DANDI 000020 and process them. However, it seems that the "filesystem"/local NWB files do not support sweep QC, and I guess only LIMS supports the QC? If so, I was wondering how to access LIMS, or is that only for AIBS internal use? Thanks

gouwens (Collaborator) commented Aug 10, 2021

It is true that LIMS is only accessible internally at the Allen Institute. However, the QC code is part of IPFX, so it is possible to run it even with the files from DANDI.

To get going now, you could incorporate code from this example Jupyter notebook.

However, it probably makes sense for us to add that QC process to the IPFX feature vector calculation code when using the "filesystem" option. I'll look into doing that - thanks for pointing that out.
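Until that lands, the overall shape of a sweep-QC filtering step can be sketched in plain Python. This is illustrative only: the metric names and thresholds below are made up for the example, and the real criteria and API live in IPFX's QC code (see the notebook linked above):

```python
# Hypothetical sketch of dropping sweeps that fail QC before feature
# extraction. Metric names ("leak_pa", "noise_mv") and limits are invented
# for illustration; IPFX defines the actual QC criteria.
def passes_qc(sweep, max_leak_pa=100.0, max_noise_mv=0.5):
    """Return True if a sweep's QC metrics fall within the given limits."""
    return (abs(sweep["leak_pa"]) <= max_leak_pa
            and sweep["noise_mv"] <= max_noise_mv)

def filter_sweeps(sweeps):
    """Keep only sweeps that pass QC, mirroring a drop-failed-sweeps step."""
    return [s for s in sweeps if passes_qc(s)]

sweeps = [
    {"sweep_number": 1, "leak_pa": 20.0, "noise_mv": 0.2},
    {"sweep_number": 2, "leak_pa": 250.0, "noise_mv": 0.2},  # fails leak limit
]
print([s["sweep_number"] for s in filter_sweeps(sweeps)])  # [1]
```

The idea is simply that QC runs per sweep against fixed criteria and the failing sweeps are excluded before downstream analysis, whether the data came from LIMS or from local NWB files.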
