Inference procedure failures; how to pick learning rate? #184
Hi Sam! This is definitely a problem and should not be happening. I think there might actually be a bug with `--exclude-antibody-capture`. Can you do me a favor and try running (with default learning rate) but omit the `--exclude-antibody-capture` flag?
Hi @sjfleming! Omitting the `--exclude-antibody-capture` flag fixed it. In that case, would you recommend pulling the antibody capture counts from the raw cellranger output instead of from the cellbender output? Thanks for the help!
Yes, I would recommend grabbing the "uncorrected" antibody capture matrix from the raw cellranger output, if you don't want to use the cellbender outputs for those values. (Actually though, cellbender usually does a good job with the antibody features. See #114.) I do not think that including those features in the input will make a very big difference. The other way you could exclude them (since there is currently a bug) would be to first load the data in scanpy, delete the antibody features, and save an h5ad file to use as input to cellbender. CellBender can take an anndata h5ad file as input currently. I would not think there would be a big difference in terms of the denoised gene counts, no matter if antibody features are included in the input or not. (In fact, I think including them might help rather than hurt.)
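For anyone following along, here is a minimal sketch of that scanpy workaround. The file names are placeholders, and it assumes the feature type annotation lands in `adata.var["feature_types"]`, as it does for 10x h5 files read by scanpy:

```python
# Drop Antibody Capture features in scanpy, then save an h5ad file that
# cellbender remove-background can take via --input. File names are
# placeholders, not the thread's actual paths.
import scanpy as sc

# Read the raw CellRanger matrix, keeping all feature types.
adata = sc.read_10x_h5("raw_feature_bc_matrix.h5", gex_only=False)
adata.var_names_make_unique()

# 10x h5 files read by scanpy record the feature type in
# adata.var["feature_types"]; keep only Gene Expression features.
adata = adata[:, adata.var["feature_types"] == "Gene Expression"].copy()

adata.write("raw_no_antibody.h5ad")
```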
Hi Stephen,
Thanks for the input here, and good to know that cellbender can help clean up antibody features.
Best,
Sam
Hi! I also run into this problem, and get an error when I use the `--exclude-antibody-capture` flag. Is there a release/branch where this has been fixed that you would recommend we install?
Hi @racng, this issue should be fixed currently on the dev branch (the changes are in #238). I plan to merge this today and then make an official v0.3.0 release.
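In the meantime, a hedged sketch of installing straight from a GitHub branch; `<branch-name>` is a placeholder, since the thread does not name the dev branch:

```bash
# Install CellBender directly from a GitHub branch; <branch-name> is a
# placeholder, not a branch confirmed in this thread.
pip install git+https://github.com/broadinstitute/CellBender.git@<branch-name>
```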
With the new version, instead of an `--exclude-antibody-capture` flag there is a more general `--exclude-feature-types` flag. Typically I always include the Antibody Capture features myself. But I have talked to people who want to exclude ATAC features from a multiome analysis, because it's a bit hard for cellbender to handle 200k+ features. That can be achieved via `--exclude-feature-types Peaks`, as in the sketch below.
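A minimal sketch of that usage (file names are placeholders, not a command confirmed in this thread):

```bash
# Exclude the multiome ATAC features ("Peaks") from the CellBender run;
# input/output names are placeholders.
cellbender remove-background \
    --input raw_feature_bc_matrix.h5 \
    --output output_denoised.h5 \
    --exclude-feature-types Peaks
```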
Closed by #238
Hi CellBender team!
I'm using CellBender again for the first time since the last major commit (d82893c), and am running into a persistent problem during the Inference procedure within the first minute of the workflow (via Terra).
I get an error during inference (tried on multiple different count matrices). I have tried adjusting the `learning_rate` argument as suggested, trying 1e-4, 1e-5, 1e-6, and 1e-7, but I am still getting the same error. After this error, the workflow continues to run for ~60+ minutes and ultimately fails, but sometimes will still produce outputs. In the workflows that don't produce outputs, I see an additional error.
Should I keep scaling down `learning_rate` until the error no longer occurs? Or could there be something else going on? Just for context, these are highly overloaded reactions, so I'm running with the following parameters (sketched as a full command below):
`expected_cells = 25000`
`total_droplets_included = 50000`
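For concreteness, a hedged sketch of the invocation these parameters imply; the file names and the `--cuda` flag are assumptions, and 1e-4 stands in for the learning rates tried:

```bash
# One of the attempted runs, reconstructed as a sketch; paths and --cuda
# are assumptions, and 1e-4 is one of the learning rates tried above.
cellbender remove-background \
    --input raw_feature_bc_matrix.h5 \
    --output output_denoised.h5 \
    --cuda \
    --expected-cells 25000 \
    --total-droplets-included 50000 \
    --learning-rate 1e-4
```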
Thanks in advance for your help!
Best,
Sam