Releases · lucidrains/x-transformers
0.5.1
save the user
0.5.0
add position infused attention, from DETR and Shortformer papers
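A minimal sketch of turning this on, assuming the feature is exposed through a `position_infused_attn` keyword on the attention layers (Shortformer-style: positional embeddings are added to queries and keys at every layer rather than once at the input):

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        position_infused_attn = True  # add positional embeddings to queries and keys at each layer
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```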
0.4.4
add ability to residualize cross attention as well, after noting an i…
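A sketch of the configuration this release seems to enable; `cross_residual_attn` is assumed here as the flag mirroring `residual_attn` for the cross attention branch:

```python
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        cross_attend = True,        # enable cross attention to an external context
        cross_residual_attn = True  # assumed keyword; carry cross attention scores residually across layers
    )
)
```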
0.4.3
allow user to use a projection to logits (vs weight tying) with tie_e…
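The truncated keyword above is left as in the original; the sketch below assumes it refers to a `tie_embedding` flag on `TransformerWrapper`:

```python
from x_transformers import TransformerWrapper, Decoder

# without the (assumed) tie_embedding flag, a separate linear projection maps hidden states
# to logits; setting it shares the output projection with the token embedding weights
model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    tie_embedding = True,  # assumed keyword name
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8
    )
)
```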
0.4.2
fix bug with residual attention
0.4.1
for post-normalization, let wrapper take care of last normalization
0.4.0
add residual attention, from Realformer paper
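A minimal sketch, assuming the feature is enabled with a `residual_attn` keyword on the attention layers. Residual attention (Realformer) passes each layer's attention score matrix forward and adds it to the pre-softmax scores of the next layer:

```python
from x_transformers import TransformerWrapper, Encoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Encoder(
        dim = 512,
        depth = 6,
        heads = 8,
        residual_attn = True  # assumed keyword enabling the residual attention connection
    )
)
```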
0.3.5
do GLU gating for attention layer output, without queries
0.3.4
create a cross attention only attention layer (CrossAttender)
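A short usage sketch of `CrossAttender` as a stack that only cross attends from its input to a provided context (tensor shapes here are illustrative):

```python
import torch
from x_transformers import CrossAttender

model = CrossAttender(dim = 512, depth = 6)

nodes = torch.randn(1, 1, 512)       # queries
neighbors = torch.randn(1, 5, 512)   # context to attend to

out = model(nodes, context = neighbors)  # (1, 1, 512)
```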
0.3.3
allow for only cross attention in attention layers
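The same behavior expressed on the attention layers directly, which the `CrossAttender` above appears to wrap; `only_cross` is assumed to be the keyword that drops self attention and keeps only the cross attention blocks:

```python
import torch
from x_transformers import Encoder

attn_layers = Encoder(
    dim = 512,
    depth = 6,
    heads = 8,
    cross_attend = True,  # add cross attention blocks
    only_cross = True     # assumed keyword; remove the self attention blocks
)

x = torch.randn(1, 1, 512)
context = torch.randn(1, 5, 512)
out = attn_layers(x, context = context)  # (1, 1, 512)
```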