-
PS: The solve is done on […]
-
What version of […]?
-
Thanks for checking! Unfortunately, you just missed the "window of opportunity" with that configuration, as the missing file on the associated torchvision package was patched about 4 hours ago. To test this, you can just set […]. On my machine, setting […]. In my head, if the CUDA version cannot be matched, the solver should abort and not fall back to anything else. However, I'm sure I'm missing something here about the rationale of that setting. Or is it a bug?
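For reference, the setting under discussion is the CUDA system requirement in the manifest. A minimal sketch, assembled from the configurations posted later in this thread (the dependency list is trimmed for illustration):

```toml
# The expectation stated above: if no build matching this CUDA version exists,
# the solve should fail rather than silently fall back to another build.
[feature.cuda]
platforms = ["linux-64"]
system-requirements = { cuda = "12" }

[feature.cuda.dependencies]
pytorch = "*"

[environments]
cuda = { features = ["cuda"] }
```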
-
Here is yet another configuration that reproduces the same problem:

```toml
[project]
name = "pytorch-example"
channels = ["conda-forge"]
platforms = ["osx-arm64", "linux-64"]
[dependencies]
python = "3.*"
pytorch = "*"
[feature.cuda]
platforms = ["linux-64"]
system-requirements = { cuda = "12" }
[feature.cuda.dependencies]
py-xgboost-gpu = "*"
[environments]
cuda = { features = ["cuda"] }
```

I'm using pixi 0.31.0, and running a solve on osx-arm64.
-
@baszalmstra: I think you perhaps meant […]

Setting the build like this also works:

```toml
[project]
name = "pytorch-example"
channels = ["conda-forge"]
platforms = ["osx-arm64", "linux-64"]
[dependencies]
python = "3.*"
pytorch = "*"
torchvision = "*"
[feature.cuda]
platforms = ["linux-64"]
system-requirements = { cuda = "12" }
[feature.cuda.dependencies.pytorch]
build = "cuda120*"
# or this, but then we end up with a double-declaration of pytorch version...
# [feature.cuda.dependencies]
# pytorch = { version = "*", build = "cuda120*" }
[environments]
cuda = { features = ["cuda"] }
```

However, when I add the `cuda-version` dependency, we go back to an older version of pytorch (note, however, that the CUDA version is respected correctly):

```toml
[project]
name = "pytorch-example"
channels = ["conda-forge"]
platforms = ["osx-arm64", "linux-64"]
[dependencies]
python = "3.*"
pytorch = "*"
torchvision = "*"
[feature.cuda]
platforms = ["linux-64"]
system-requirements = { cuda = "12" }
[feature.cuda.dependencies]
cuda-version = "12.0"
pytorch-gpu = "*"
[environments]
cuda = { features = ["cuda"] }
```

```
$ pixi list --platform linux-64 -e cuda | grep pytorch
Environment: cuda
pytorch 2.1.0 cuda120_py38h75dc8b7_303 25.5 MiB conda pytorch-2.1.0-cuda120_py38h75dc8b7_303.conda
pytorch-gpu 2.1.0 cuda120py38h7540baa_303 21.5 KiB conda pytorch-gpu-2.1.0-cuda120py38h7540baa_303.conda
```

If I force the configuration to set `pytorch = "2.4.1"`, then pixi complains it cannot find a matching package.
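For clarity, the forced pin described above would look something like this; it is only a sketch, and whether the pin belongs in the top-level `[dependencies]` table or in the cuda feature is an assumption here:

```toml
# Sketch of forcing the pytorch version (placement is an assumption);
# with this pin in place, pixi reports that no matching candidate is found.
[dependencies]
python = "3.*"
pytorch = "2.4.1"
torchvision = "*"
```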
If I search for pytorch through pixi, I can see it correctly depends on `cuda-version >=12.0`:

```
$ pixi search --platform linux-64 pytorch
Using channels: conda-forge
pytorch cuda120_py39h13e8a3a_300
--------------------------------
Name pytorch
Version 2.4.1
Build cuda120_py39h13e8a3a_300
Size 32204153
License BSD-3-Clause
Subdir linux-64
File Name pytorch-2.4.1-cuda120_py39h13e8a3a_300.conda
URL https://conda.anaconda.org/conda-forge/linux-64/pytorch-2.4.1-cuda120_py39h13e8a3a_300.conda
MD5 fae1e0c6ca8f0e3d83b875d68458a23b
SHA256 27d9d0db25e257ac877c0218d54130a555812283b46929f903f4adbb04a69b89
Dependencies:
- __cuda
- __glibc >=2.17,<3.0.a0
- _openmp_mutex >=4.5
- cuda-cudart >=12.0.107,<13.0a0
- cuda-nvrtc >=12.0.76,<13.0a0
- cuda-nvtx >=12.0.76,<13.0a0
- cuda-version >=12.0,<13
- cudnn >=9.3.0.75,<10.0a0
- filelock
- fsspec
- jinja2
- libabseil * cxx17*
- libabseil >=20240116.2,<20240117.0a0
- libcblas >=3.9.0,<4.0a0
- libcublas >=12.0.1.189,<13.0a0
- libcufft >=11.0.0.21,<12.0a0
- libcurand >=10.3.1.50,<11.0a0
- libcusolver >=11.4.2.57,<12.0a0
- libcusparse >=12.0.0.76,<13.0a0
- libgcc >=12
- libmagma >=2.8.0,<2.8.1.0a0
- libmagma_sparse >=2.8.0,<2.8.1.0a0
- libprotobuf >=4.25.3,<4.25.4.0a0
- libstdcxx >=12
- libtorch 2.4.1.*
- libuv >=1.49.0,<2.0a0
- mkl >=2023.2.0,<2024.0a0
- nccl >=2.23.4.1,<3.0a0
- networkx
- numpy >=1.19,<3
- python >=3.9,<3.10.0a0
- python_abi 3.9.* *_cp39
- sleef >=3.7,<4.0a0
- sympy >=1.13.1
- typing_extensions >=4.8.0
```

So, some things work and some do not. It seems that the pixi solver is a bit flaky in these conditions. Would you agree? Or are these packaging issues?
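One untested idea would be to combine the two constraints that worked individually above (the explicit `cuda-version` pin and the `cuda120*` build string). This is only a sketch assembled from the snippets in this comment, not something verified in this thread:

```toml
# Untested sketch: pin the CUDA version and the build string together,
# reusing the constraints that each worked on their own above.
[feature.cuda.dependencies]
cuda-version = "12.0"
pytorch = { version = "*", build = "cuda120*" }
```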
-
More info: I was checking the builds of the packages concerned, and they all seem to be in order. Here are a couple of useful commands to debug package availability without browsing the server and downloading packages, since […]:

```
$ pixi global install micromamba
# add --json to get a full list with all found packages
$ micromamba search -c conda-forge --platform linux-64 'pytorch * *cuda120*py312*'
$ micromamba search -c conda-forge --platform linux-64 'torchvision * *cuda120*py312*'
```
-
I'm not sure what the current best practices are surrounding pytorch and conda-forge, but previously I always put together environments using a combination of the […]

but (1) I think using the […]
-
@synapticarbors: Combining the pytorch/nvidia channels with conda-forge is unsupported, as it can create unstable environments (ref1, ref2). If you look closely, the pytorch web page recommends installation against […]
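To make the contrast concrete, here is a sketch of the two channel setups being discussed; the project name and channel order are illustrative, not taken from this thread:

```toml
[project]
name = "pytorch-example"   # placeholder
platforms = ["linux-64"]

# Unsupported per this thread: mixing the pytorch/nvidia channels with conda-forge.
# channels = ["pytorch", "nvidia", "conda-forge"]

# The approach used throughout this thread: conda-forge only.
channels = ["conda-forge"]
```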
-
In this discussion you can find what is supposed to work in terms of channel combinations.
-
I have the following `pixi.toml`:

[…]

I run `pixi install` (to generate a lock file). Then, by inspecting it, I see the following in the `cuda` environment:

[…]

After debugging it a bit, I found out there is indeed a missing file on the torchvision feedstock. However, I would not expect pixi to fall back to CUDA=11.8 with the `system-requirements` as specified.

Am I missing something?