IREE Turbine

Turbine is IREE's frontend for PyTorch.

Turbine provides a collection of tools:

  • AOT Export: For exporting one or more nn.Modules to compiled, deployment-ready artifacts. This works through a simple one-shot export API (already upstreamed to torch-mlir) for straightforward models, and an underlying advanced API for complicated models and for accessing the full features of the runtime. A minimal export sketch follows this list.
  • Eager Execution: A torch.compile backend is provided and a Turbine Tensor/Device is available for more native, interactive use within a PyTorch session.
  • Custom Ops: Integration for defining custom PyTorch ops and implementing them in terms of IREE's backend IR or a Pythonic kernel language.
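
As a rough illustration of the one-shot AOT path, the sketch below exports a small nn.Module and compiles it to a deployable artifact. It assumes the iree.turbine.aot namespace and the aot.export / save_mlir / compile helpers; treat those names as assumptions and check the project documentation for the current API.

import torch
import iree.turbine.aot as aot  # assumed package namespace for iree-turbine

class LinearModule(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 3)

    def forward(self, x):
        return self.linear(x)

# One-shot export of a single nn.Module, traced with example inputs.
exported = aot.export(LinearModule(), torch.randn(1, 4))

# Inspect the generated Torch-dialect MLIR (method name assumed).
exported.print_readable()

# Save the MLIR for offline compilation, or compile directly to an
# IREE-deployable artifact (paths are illustrative).
exported.save_mlir("linear.mlir")
exported.compile(save_to="linear.vmfb")

The saved MLIR can also be compiled with the IREE tools directly, as described under Quick Start below.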

Contact Us

Turbine is under active development. Feel free to reach out on one of IREE's communication channels (specifically, we monitor the #pytorch channel on the IREE Discord server).

Quick Start for Users

  1. Install release packages:
pip install iree-turbine
# Or for an editable install: see the instructions under Developers

The above installs some unnecessary CUDA/cuDNN packages for CPU-only use. To avoid them, install the CPU-only PyTorch requirements first:

pip install -r pytorch-cpu-requirements.txt
pip install iree-turbine

(or follow the "Developers" instructions below for installing from head/nightly)

  2. Try one of the samples:

Generally, we use Turbine to produce valid, dynamically shaped Torch IR (from the torch-mlir torch dialect, with various approaches to handling globals). Depending on the use case and the status of the compiler, this IR should be compilable via IREE with --iree-input-type=torch for end-to-end execution. Dynamic shape support in torch-mlir is a work in progress, and not everything works at head with release binaries at present.
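
As a hedged sketch of that compile step, assuming the iree-compiler Python package is installed and that iree.compiler.compile_file accepts input_type and target_backends options (mirroring the --iree-input-type=torch flag above; the file names are illustrative):

import iree.compiler as ireec  # from the iree-compiler pip package

# Compile Torch-dialect MLIR produced by Turbine into an IREE module
# for the CPU backend; returns the compiled module as bytes.
vmfb = ireec.compile_file(
    "linear.mlir",
    input_type="torch",
    target_backends=["llvm-cpu"],
)

with open("linear.vmfb", "wb") as f:
    f.write(vmfb)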

Developers

Use this as a guide to get started developing the project using pinned, pre-release dependencies. You are welcome to deviate as you see fit, but these canonical directions mirror what the CI does.

Set up a venv

We recommend setting up a virtual environment (venv). The project is configured to ignore .venv directories, and editors like VS Code pick them up by default.

python -m venv --prompt iree-turbine .venv
source .venv/bin/activate

Install PyTorch for Your System

If you take no explicit action, the default PyTorch version will be installed, which is currently a CUDA-enabled build. To use a different variant, install it explicitly first:

CPU:

pip install -r pytorch-cpu-requirements.txt

ROCm:

pip install -r pytorch-rocm-requirements.txt

Install Development Packages

# Install editable local projects.
pip install -r requirements.txt -e .

Running Tests

pytest .

Optional: Pre-commits and developer settings

This project is set up to use the pre-commit tooling. To install it in your local repo, run: pre-commit install. After that, the configured hooks will run whenever you make commits locally. See https://pre-commit.com/ for details.

Using a development compiler

If you are doing native development of the compiler, it can be useful to switch to source builds for the iree-compiler and iree-runtime packages.

In order to do this, check out IREE and follow the instructions to build from source, making sure to specify additional options for the Python bindings:

-DIREE_BUILD_PYTHON_BINDINGS=ON -DPython3_EXECUTABLE="$(which python)"

Configuring Python

Uninstall existing packages:

pip uninstall iree-compiler
pip uninstall iree-runtime

Copy the .env file from iree/ to this source directory to get IDE support, and source it to put the Python bindings on your PYTHONPATH for use from your shell:

source .env && export PYTHONPATH
