Releases: lucidrains/x-transformers

0.5.1 (31 Dec 22:16)
save the user

0.5.0 (31 Dec 20:58)
add position infused attention, from DETR and Shortformer papers
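A minimal sketch of turning this on, assuming the `position_infused_attn` keyword on the attention layers (position infused attention adds positional embeddings to the queries and keys at every layer, leaving the values position-free):

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        position_infused_attn = True  # add positions to queries and keys at each layer, not to values
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```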

0.4.4 (28 Dec 04:00)
add ability to residualize cross attention as well, after noting an i…
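A hedged sketch of combining both kinds of residual attention; `cross_residual_attn` is assumed to sit alongside the existing `residual_attn` keyword, with `context` as the usual cross attention input:

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        cross_attend = True,        # include cross attention blocks
        residual_attn = True,       # residualize the self attention scores (RealFormer)
        cross_residual_attn = True  # assumed flag: residualize the cross attention scores as well
    )
)

x = torch.randint(0, 20000, (1, 1024))
context = torch.randn(1, 512, 512)
logits = model(x, context = context)
```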

0.4.3 (28 Dec 03:39)
allow user to use a projection to logits (vs weight tying) with tie_e…
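In other words, weight tying becomes opt-in, and with it off the logits come from a learned linear projection. A minimal sketch, assuming the truncated flag is `tie_embedding` on `TransformerWrapper`:

```python
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    tie_embedding = True,  # assumed flag name; reuses the token embedding matrix as the output projection, False (default) learns a separate projection to logits
    attn_layers = Decoder(dim = 512, depth = 6)
)
```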

0.4.2 (28 Dec 03:25)
fix bug with residual attention

0.4.1 (28 Dec 03:18)
for post-normalization, let wrapper take care of last normalization

0.4.0 (28 Dec 00:10)
add residual attention, from RealFormer paper
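Residual attention passes each layer's pre-softmax attention scores on to the next layer as a residual. A minimal sketch of enabling it, assuming the `residual_attn` keyword:

```python
import torch
from x_transformers import TransformerWrapper, Encoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Encoder(
        dim = 512,
        depth = 6,
        heads = 8,
        residual_attn = True  # add the previous layer's attention scores to this layer's, pre-softmax
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```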

0.3.5 (15 Dec 00:21)
do GLU gating for attention layer output, without queries
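"Without queries" presumably distinguishes this from attention-on-attention style gating, where the gate is computed from the queries; here the gate comes from the attention output alone. A conceptual sketch with illustrative names, not the library's internals:

```python
import torch
from torch import nn

class GLUGatedAttentionOutput(nn.Module):
    # illustrative: project the attention output to twice the width,
    # split into value and gate, and gate with a sigmoid (GLU style) -
    # no query tensor participates in the gate
    def __init__(self, dim):
        super().__init__()
        self.to_out = nn.Linear(dim, dim * 2)

    def forward(self, attn_out):
        out, gate = self.to_out(attn_out).chunk(2, dim = -1)
        return out * gate.sigmoid()
```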

0.3.4 (14 Dec 07:35)
create a cross attention only attention layer (CrossAttender)
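Usage follows the README's pattern, with `context` as the sequence being attended to:

```python
import torch
from x_transformers import CrossAttender

model = CrossAttender(dim = 512, depth = 6)

x = torch.randn(1, 1024, 512)       # sequence being updated
context = torch.randn(1, 512, 512)  # sequence being attended to
out = model(x, context = context)   # (1, 1024, 512)
```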

0.3.3 (14 Dec 07:29)
allow for only cross attention in attention layers
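Presumably this is the option that `CrossAttender` above packages up. A hedged sketch, assuming `only_cross` and `cross_attend` keywords on the underlying `AttentionLayers`:

```python
import torch
from x_transformers.x_transformers import AttentionLayers  # import path is an assumption

layers = AttentionLayers(
    dim = 512,
    depth = 6,
    cross_attend = True,  # insert cross attention blocks...
    only_cross = True     # ...and skip the self attention blocks entirely (assumed flag names)
)

x = torch.randn(1, 1024, 512)
context = torch.randn(1, 512, 512)
out = layers(x, context = context)
```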