Releases: lucidrains/x-transformers
0.9.0
add RMSNorm, given the transformer modifications paper out of Google
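A minimal sketch of switching the normalization to RMSNorm, assuming the library exposes it via a `use_rmsnorm` keyword on the attention layers (the flag name is an assumption, not taken from this release note):

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        use_rmsnorm = True  # assumed flag name: swaps LayerNorm for RMSNorm in each block
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```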
0.8.4
allow for setting the embedding dimension to be different from the model dimension
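A sketch of the decoupled embedding size, assuming an `emb_dim` keyword on `TransformerWrapper` whose output is projected up to the model dimension `dim` (the keyword name is assumed here):

```python
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    emb_dim = 128,           # assumed kwarg: token embeddings at 128, projected up to the model dim
    attn_layers = Decoder(
        dim = 512,           # model dimension stays at 512
        depth = 6,
        heads = 8
    )
)
```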
0.8.3
add an assert for relative positional keyword arguments
0.8.2
add an assert for relative positional keyword arguments
0.8.1
fix residual gating
0.8.0
add gating at residuals, from DeepMind's paper on stabilizing the transformer-xl for reinforcement learning
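A sketch of turning on the residual gating, assuming it is exposed as a `gate_residual` keyword on the attention layers (name assumed):

```python
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        gate_residual = True  # assumed flag: GRU-style gating on the residual branch, as in GTrXL
    )
)
```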
0.7.4
fix bug with prenorm, introduced when adding residual attention
0.7.3
allow floats for ff_mult
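`ff_mult` scales the feedforward inner dimension relative to the model dimension; a sketch of passing a non-integer multiple, assuming the keyword is forwarded to the feedforward blocks as in the usual `Decoder(...)` usage:

```python
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        ff_mult = 2.5  # feedforward inner dim ~ int(512 * 2.5) = 1280; floats now accepted
    )
)
```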
0.7.2
fix bug with default empty memories for txl
0.7.1
fix some more issues with T5 rel pos bias, thanks to @adrian-spataru