One of the most common optimizations that ML frameworks perform is to fuse common pointwise operations such as Relu, Gelu, or Silu into the preceding convolution / matmul operation. This reduces overhead by applying the pointwise operation while the data is still in cache (or registers), instead of making a second pass over the output buffer. This is not yet implemented in RTen.
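To illustrate the idea (this is a hypothetical sketch, not RTen's actual API or kernel structure): the unfused path computes the matmul, then makes a second pass over the whole output to apply ReLU; the fused path applies the pointwise function to each accumulator while it is still in a register, so the output buffer is only written once.

```rust
// Unfused: matmul writes the full output, then a second pass applies ReLU,
// touching every element again after it may have been evicted from cache.
fn matmul(a: &[f32], b: &[f32], m: usize, k: usize, n: usize) -> Vec<f32> {
    let mut out = vec![0.0f32; m * n];
    for i in 0..m {
        for j in 0..n {
            let mut acc = 0.0f32;
            for p in 0..k {
                acc += a[i * k + p] * b[p * n + j];
            }
            out[i * n + j] = acc;
        }
    }
    out
}

fn relu(data: &mut [f32]) {
    for x in data.iter_mut() {
        *x = x.max(0.0);
    }
}

// Fused (hypothetical): the pointwise op is passed into the matmul kernel
// and applied to each accumulator before it is stored, avoiding the
// second traversal of the output buffer.
fn matmul_fused(
    a: &[f32],
    b: &[f32],
    m: usize,
    k: usize,
    n: usize,
    pointwise: impl Fn(f32) -> f32,
) -> Vec<f32> {
    let mut out = vec![0.0f32; m * n];
    for i in 0..m {
        for j in 0..n {
            let mut acc = 0.0f32;
            for p in 0..k {
                acc += a[i * k + p] * b[p * n + j];
            }
            out[i * n + j] = pointwise(acc);
        }
    }
    out
}

fn main() {
    let a = [1.0, -2.0, 3.0, 4.0]; // 2x2, row-major
    let b = [1.0, 0.0, 0.0, -1.0]; // 2x2, row-major

    let mut unfused = matmul(&a, &b, 2, 2, 2);
    relu(&mut unfused);

    let fused = matmul_fused(&a, &b, 2, 2, 2, |x| x.max(0.0));

    // Both paths must produce identical results.
    assert_eq!(unfused, fused);
    assert_eq!(fused, vec![1.0, 2.0, 3.0, 0.0]);
    println!("{:?}", fused);
}
```

In a real implementation the fusion decision would typically happen at graph-optimization time (pattern-matching a `MatMul`/`Conv` followed by a supported pointwise op and rewriting it into a single fused node), and the kernel would apply the pointwise op per output tile rather than per scalar, but the cache-locality benefit is the same.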