[Bug Report] We should never import torch outside of golden functions #17059

Closed · opened by eyonland on Jan 23, 2025 · 3 comments · Fixed by #17133
Labels: bug, MCW, op_cat: eltwise

Comments

@eyonland (Contributor) commented:

Describe the bug
We should never import torch outside of golden functions.

https://github.com/tenstorrent/tt-metal/blob/main/ttnn/ttnn/operations/binary_backward.py#L15

File "/home/blozano/anaconda3/envs/py310/lib/python3.10/site-packages/ttnn/operations/binary_backward.py", line 8, in <module>
    from ttnn.operations.complex_binary_backward import (
  File "/home/blozano/anaconda3/envs/py310/lib/python3.10/site-packages/ttnn/operations/complex_binary_backward.py", line 6, in <module>
    import torch

To Reproduce
Steps to reproduce the behavior:
Import ttnn in an environment where torch is not installed; the import fails with the traceback above.
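A minimal reproduction, as a sketch (assuming an environment with ttnn installed but torch absent):

# ttnn's import chain reaches complex_binary_backward.py, which runs
# `import torch` at module level, so this fails immediately:
import ttnn  # ModuleNotFoundError: No module named 'torch'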

@eyonland added the bug label on Jan 23, 2025
@patrickroberts (Contributor) commented:

How are the top-level statements like this supposed to reference torch then?

ttnn.attach_golden_function(
    ttnn.sub_bw,
    golden_function=lambda grad, a, b, *args, **kwargs: _golden_function_backward(
        torch.sub, grad, a, b, *args, **kwargs
    ),
)

@eyonland (Contributor, Author) commented:

https://github.com/tenstorrent/tt-metal/blob/main/ttnn/ttnn/operations/binary_backward.py#L23

def _golden_function_backward(torch_op, grad_tensor, input_tensor_a, input_tensor_b, *args, **kwargs):
    import torch

    if torch.is_complex(input_tensor_a):
        if torch_op == torch.add:
            alpha = kwargs.pop("alpha")

@patrickroberts (Contributor) commented:

Okay, so don't use lambdas then. Gotcha.
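A sketch of the lambda-free pattern implied here, built from the snippets above (the helper name _sub_bw_golden is hypothetical; the actual fix landed in #17133):

def _sub_bw_golden(grad, a, b, *args, **kwargs):
    # torch is imported lazily inside the golden function, so merely
    # importing ttnn never requires torch to be installed
    import torch

    return _golden_function_backward(torch.sub, grad, a, b, *args, **kwargs)

ttnn.attach_golden_function(ttnn.sub_bw, golden_function=_sub_bw_golden)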

@mouliraj-mcw linked a pull request on Jan 29, 2025 that will close this issue.
mouliraj-mcw added a commit that referenced this issue on Jan 31, 2025:

### Ticket
Link to GitHub Issue #17059

### Problem description
We should never import torch outside of golden functions.

### What's changed
Updated the golden functions.

### Checklist
- [x] [All Post commit CI](https://github.com/tenstorrent/tt-metal/actions/runs/12986959759)
yieldthought pushed a commit that referenced this issue on Jan 31, 2025, and nikileshx pushed a commit to nikileshx/tt-metal that referenced this issue on Feb 3, 2025, both with the same message as above.