JIT cpp extension fails to find cl.exe on windows? #62

Open · tueboesen opened this issue Aug 31, 2020 · 5 comments

@tueboesen
I am trying to use the JIT C++ extension with a script from a friend, who can run it on Linux, but whenever I run it on Windows it fails. The error is quite clear: it seems it cannot find the cl.exe compiler.

The script fails on the load statement below:

import torch
from torch.utils.cpp_extension import load
from torch.nn.modules.utils import _triple

# load the PyTorch extension
cudnn_convolution = load(name="cudnn_convolution", sources=["src/reversible_network/cudnn_convolution.cpp"], verbose=True)

With the following message:

C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\Scripts\python.exe "C:\Program Files\JetBrains\PyCharm Community Edition 2020.1\plugins\python-ce\helpers\pydev\pydevd.py" --multiproc --qt-support=auto --client 127.0.0.1 --port 51413 --file C:/Users/Tue/PycharmProjects/DistogramPredictor/run.py
pydev debugger: process 10460 is connecting
Connected to pydev debugger (build 201.6668.115)
Using C:\Users\Tue\AppData\Local\Temp\torch_extensions as PyTorch extensions root...
C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\lib\site-packages\torch\utils\cpp_extension.py:237: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified
  warnings.warn('Error checking compiler version for {}: {}'.format(compiler, error))
Emitting ninja build file C:\Users\Tue\AppData\Local\Temp\torch_extensions\cudnn_convolution\build.ninja...
INFO: Could not find files for the given pattern(s).
Traceback (most recent call last):
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "C:\Users\Tue\PycharmProjects\DistogramPredictor\src\main.py", line 12, in <module>
    from src.reversible_network.hypernet import HyperNet
  File "C:\Users\Tue\PycharmProjects\DistogramPredictor\src\reversible_network\hypernet.py", line 6, in <module>
    from src.reversible_network.doublesym import DoubleSymLayer3D
  File "C:\Users\Tue\PycharmProjects\DistogramPredictor\src\reversible_network\doublesym.py", line 5, in <module>
    from src.reversible_network.grad import conv3d_weight
  File "C:\Users\Tue\PycharmProjects\DistogramPredictor\src\reversible_network\grad.py", line 6, in <module>
    cudnn_convolution = load(name="cudnn_convolution", sources=["src/reversible_network/cudnn_convolution.cpp"], verbose=True)
  File "C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\lib\site-packages\torch\utils\cpp_extension.py", line 888, in load
    return _jit_compile(
  File "C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\lib\site-packages\torch\utils\cpp_extension.py", line 1077, in _jit_compile
    _write_ninja_file_and_build_library(
  File "C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\lib\site-packages\torch\utils\cpp_extension.py", line 1171, in _write_ninja_file_and_build_library
    _write_ninja_file_to_build_library(
  File "C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\lib\site-packages\torch\utils\cpp_extension.py", line 1509, in _write_ninja_file_to_build_library
    _write_ninja_file(
  File "C:\Users\Tue\PycharmProjects\Epitopes_segmentation\venv\lib\site-packages\torch\utils\cpp_extension.py", line 1615, in _write_ninja_file
    cl_paths = subprocess.check_output(['where',
  File "C:\Users\Tue\AppData\Local\Programs\Python\Python38\lib\subprocess.py", line 411, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "C:\Users\Tue\AppData\Local\Programs\Python\Python38\lib\subprocess.py", line 512, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['where', 'cl']' returned non-zero exit status 1.

I’m running Python 3.8, and pip list gives the following:

(venv) C:\Users\Tue\PycharmProjects\DistogramPredictor>pip list
Package             Version
------------------- ------------
beautifulsoup4      4.9.1
biopython           1.77
blis                0.4.1
bs4                 0.0.1
catalogue           1.0.0
certifi             2020.6.20
chardet             3.0.4
click               7.1.2
cssselect           1.1.0
cycler              0.10.0
cymem               2.0.3
dill                0.3.2
en-core-web-sm      2.3.0
filelock            3.0.12
fire                0.3.1
fr-core-news-sm     2.3.0
future              0.18.2
h5py                2.10.0
hnswlib             0.3.4
idna                2.10
joblib              0.16.0
Keras               2.4.3
Keras-Applications  1.0.8
Keras-Preprocessing 1.1.2
kiwisolver          1.2.0
lmdb                0.98
lxml                4.5.2
matplotlib          3.2.1
msgpack             1.0.0
murmurhash          1.0.2
ninja               1.10.0.post1
nltk                3.5
numpy               1.18.5
packaging           20.4
pandas              1.0.4
parse               1.15.0
Pillow              7.1.2
pip                 20.2.2
plac                1.1.3
preshed             3.0.2
pyarrow             0.17.1
pybind11            2.5.0
pyparsing           2.4.7
pyquery             1.4.1
python-dateutil     2.8.1
pytz                2020.1
pywebcopy           6.3.0
PyYAML              5.3.1
regex               2020.6.8
requests            2.24.0
sacremoses          0.0.43
scipy               1.4.1
seaborn             0.10.1
sentencepiece       0.1.91
seqeval             0.0.12
setuptools          47.2.0
six                 1.15.0
soupsieve           2.0.1
spacy               2.3.0
srsly               1.0.2
termcolor           1.1.0
thinc               7.4.1
tokenizers          0.8.0rc4
torch               1.5.0
torchtext           0.6.0
torchvision         0.6.0
tqdm                4.47.0
urllib3             1.25.9
w3lib               1.22.0
wasabi              0.7.0
wget                3.2

How can I fix this issue?

@erwincoumans

erwincoumans commented Nov 10, 2020

You need a Visual Studio command prompt. For Visual Studio 2019, for example, you would run (Windows key + R):

%comspec% /k "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars64.bat"

or start typing 'x64 Native Tools Command Prompt for VS 2019' in the search bar.

You also need the ninja binary on your PATH: download it from https://github.com/ninja-build/ninja/releases, unzip it, and copy it into a directory that is already on your PATH (run 'path' to see the candidate directories). A quick way to verify both tools from Python is sketched below.
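A minimal sanity check, as a sketch only (not part of the extension-cpp repository), that the interpreter doing the build can see both cl and ninja on PATH:

import shutil

# Both tools must be discoverable from the same environment that runs the build.
for tool in ("cl", "ninja"):
    found = shutil.which(tool)
    print(f"{tool}: {found if found else 'NOT FOUND - fix PATH or use the VS command prompt'}")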

Then run

set DISTUTILS_USE_SDK=1
python setup.py build
python setup.py install

If you get an error about lltm.o not being found, you can rename lltm.obj to lltm.o, in my case in the build\temp.win-amd64-3.7\Release folder.
Then, after the extension builds and installs fine, importing it may still fail because the torch DLLs cannot be found, with an error like:

ImportError: DLL load failed: The specified module could not be found.

Just copy the DLLs from Python37\Lib\site-packages\torch\lib, or start Python from that directory (an alternative is sketched after the session below). Then you can run python, import lltm_cpp, and find the forward and backward functions of the extension:

C:\Python37\Lib\site-packages\torch\lib>python
Python 3.7.9 (tags/v3.7.9:13c94747c7, Aug 17 2020, 18:58:18) [MSC v.1900 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import lltm_cpp
>>> print(dir(lltm_cpp))
['__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', 'backward', 'forward']
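On Python 3.8+ there is an alternative to copying the DLLs: register torch\lib as an extra DLL search directory before importing the extension. A minimal sketch, assuming torch itself imports fine and the built extension is named lltm_cpp:

import os
import torch  # importing torch first also brings its DLLs into the process

# Register torch\lib as a DLL search directory (Windows, Python 3.8+),
# so the extension can resolve the torch DLLs without copying them.
torch_lib = os.path.join(os.path.dirname(torch.__file__), "lib")
os.add_dll_directory(torch_lib)

import lltm_cpp
print(dir(lltm_cpp))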

The developers of extension-cpp probably care mostly about Linux, not Windows.

@Arsmart123

For me, the solution was to buy an Ubuntu PC and install the NVIDIA packages; then I could run the code!

@elephaint

For those stumbling upon this thread, the following works for me:

  1. Install the Build Tools for Visual Studio.
  2. Make sure you add the compiler cl to your PATH environment variable (see here). Steps:
    • Go to your environment variables in Windows
    • Select the PATH environment variable
    • Add the path to your MSVC compiler. Example for the VS2019 Build Tools: C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\VC\Tools\MSVC\14.29.30133\bin\Hostx64\x64
  3. Verify that Windows can find cl by executing where cl in a Windows command-line terminal. A per-process alternative to editing the global PATH is sketched after these steps.
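As a sketch of a per-process variant of step 2 (the MSVC toolset version below is only an example and will differ per install; depending on your setup you may also need the INCLUDE and LIB variables that vcvars64.bat sets):

import os
from torch.utils.cpp_extension import load

# Prepend the MSVC host x64 bin directory for this process only;
# adjust the toolset version to whatever is installed on your machine.
msvc_bin = r"C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\VC\Tools\MSVC\14.29.30133\bin\Hostx64\x64"
os.environ["PATH"] = msvc_bin + os.pathsep + os.environ["PATH"]

cudnn_convolution = load(
    name="cudnn_convolution",
    sources=["src/reversible_network/cudnn_convolution.cpp"],
    verbose=True,
)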

@JoelJohera

JoelJohera commented Dec 14, 2022

(quoting @elephaint's steps above)

I have the cl path in both the User variables and the System variables, but where cl still returns INFO: Could not find files for the given pattern(s).

What else can I do?

@martinResearch

CuPy has a mechanism to find cl.exe automatically, without any manual steps such as calling vcvars64.bat or editing the PATH variable (see here). It would be super convenient to have a similar mechanism added to PyTorch. A sketch of what that could look like follows.
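For illustration only, a minimal sketch of such a mechanism, assuming vswhere.exe sits at its default Visual Studio Installer location and an MSVC C++ toolset is installed (the helper name is hypothetical):

import os
import subprocess

def add_msvc_to_path():
    # vswhere.exe ships with the Visual Studio installer and can report the
    # newest installation that has the C++ x86/x64 build tools.
    vswhere = os.path.join(
        os.environ.get("ProgramFiles(x86)", r"C:\Program Files (x86)"),
        "Microsoft Visual Studio", "Installer", "vswhere.exe",
    )
    install_dir = subprocess.check_output(
        [vswhere, "-latest", "-products", "*",
         "-requires", "Microsoft.VisualStudio.Component.VC.Tools.x86.x64",
         "-property", "installationPath"],
        text=True,
    ).strip()
    # The default toolset version is recorded in a text file under the install.
    version_file = os.path.join(install_dir, "VC", "Auxiliary", "Build",
                                "Microsoft.VCToolsVersion.default.txt")
    with open(version_file) as f:
        tools_version = f.read().strip()
    cl_dir = os.path.join(install_dir, "VC", "Tools", "MSVC", tools_version,
                          "bin", "Hostx64", "x64")
    os.environ["PATH"] = cl_dir + os.pathsep + os.environ["PATH"]

add_msvc_to_path()  # call before torch.utils.cpp_extension.load(...)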
