I am unable to `torch.compile(encoder, fullgraph=True)`: it raises an error when calling `block` on line 2131. The failure seems to come from calling `next()` on a list iterator, possibly related to the block arguments being passed as `*args`. I could not figure out what `block` is or where it comes from, so I would appreciate any help with this issue. I switched from the PyTorch implementation due to numerical instability and would really like a good alternative; this repository seems very promising.
python==3.10.15
torch==2.4.1+cu124
GPU: A100-80GB
File "/miniforge/envs/iris/lib/python3.10/site-packages/torch/_dynamo/variables/builtin.py", line 780, in call_self_handler
unimplemented(
File "/miniforge/envs/iris/lib/python3.10/site-packages/torch/_dynamo/exc.py", line 221, in unimplemented
raise Unsupported(msg)
torch._dynamo.exc.Unsupported: invalid handler args <bound method BuiltinVariable.call_next of BuiltinVariable()> [ListIteratorVariable(length=0, index=0), ConstantVariable()] {}
from user code:
File "/mnt/task_runtime/src/modules/encoder.py", line 55, in forward
feats = self.encoder.forward(x, mask=mask)
File "/miniforge/envs/iris/lib/python3.10/site-packages/x_transformers/x_transformers.py", line 2131, in forward
out, inter = block(x, mask = mask, context_mask = self_attn_kv_mask, attn_mask = attn_mask, rel_pos = self.rel_pos, pos = pos, rotary_pos_emb = rotary_pos_emb, prev_attn = prev_attn, cache = next(iter_attn_cache, None), mem = layer_mem, mem_mask = layer_mem_mask, attn_bias = attn_bias, value_residual = maybe_self_attn_value_residual, return_intermediates = True)
Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
You can suppress this exception and fall back to eager by setting:
import torch._dynamo
torch._dynamo.config.suppress_errors = True
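
For reference, here is a minimal sketch that I think isolates the same pattern. This is only my assumption about the root cause: that Dynamo on 2.4.x rejects `next()` with a default value on a list iterator inside a `fullgraph=True` graph, as on line 2131 of x_transformers.py, rather than anything specific to my encoder.

```python
import torch

def layer_loop(x, attn_caches):
    # Mimics what the x_transformers forward appears to do per layer:
    # pull the next cache entry, falling back to None when the list
    # iterator is exhausted.
    iter_attn_cache = iter(attn_caches)
    cache = next(iter_attn_cache, None)  # next() with a default argument
    if cache is not None:
        x = x + cache
    return x * 2

compiled = torch.compile(layer_loop, fullgraph=True)

x = torch.randn(4)
# On torch 2.4.1 I would expect this call to raise the same
# torch._dynamo.exc.Unsupported error as in the traceback above.
compiled(x, [])
```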