In `forward()` of class `RGLRU`, the input `x` has shape `[batch_size, dim]`, but `self.Wa` has shape `[hidden_dim, dim]`, where `hidden_dim = mul * dim`. With that mismatch, how can `RGLRU` work?

Also, the line `linear_1 = self.lru(linear_1)` is commented out in the code.
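A minimal shape check (pure Python, with arbitrary example sizes; the names `dim`, `mul`, and `Wa` mirror the issue, the values are my own assumptions) makes the first mismatch concrete:

```python
# Hypothetical sizes chosen only for illustration.
batch_size, dim, mul = 2, 4, 3
hidden_dim = mul * dim  # 12, as described: hidden_dim = mul * dim

x_shape = (batch_size, dim)       # input x to forward()
Wa_shape = (hidden_dim, dim)      # self.Wa

# x @ Wa.T is a valid matmul and yields [batch_size, hidden_dim] ...
gate_shape = (x_shape[0], Wa_shape[0])
assert gate_shape == (2, 12)

# ... but an element-wise RG-LRU-style recurrence (a * h + gate * x)
# needs the gate's feature size to match x's, which it does not:
assert gate_shape[1] != x_shape[1]  # hidden_dim (12) != dim (4)
```

So either `x` must already be projected up to `hidden_dim` before reaching `RGLRU.forward()`, or `self.Wa` should be `[dim, dim]`; as written, the two shapes cannot combine element-wise.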