Rename library to `lee_transformers`
The short import is now `leet`. This was done because of a name conflict with Transformer iN Transformer, and because I did not like the name top-notch that much.
Showing 20 changed files:
- README.md (5 additions, 5 deletions)
- docs/Makefile (1 addition, 1 deletion)
- docs/index.rst (1 addition, 1 deletion)
- lee_transformers/__init__.py (0 additions, 0 deletions)
- lee_transformers/attention_masks.py (0 additions, 0 deletions)
- lee_transformers/common.py (0 additions, 0 deletions)
- lee_transformers/initialization.py (0 additions, 0 deletions)
- lee_transformers/layers/__init__.py (0 additions, 0 deletions)
- lee_transformers/layers/adaptive_computation_time.py (0 additions, 0 deletions)
- lee_transformers/layers/glu_layers.py (0 additions, 0 deletions)
- lee_transformers/layers/positional_encodings.py (0 additions, 0 deletions)
- lee_transformers/layers/resampling.py (0 additions, 0 deletions)
- lee_transformers/layers/rms_norm.py (0 additions, 0 deletions)
- lee_transformers/layers/rpe_mha.py (0 additions, 0 deletions)
- lee_transformers/layers/seq_pool.py (0 additions, 0 deletions)
- lee_transformers/layers/utils.py (0 additions, 0 deletions)
- lee_transformers/models/__init__.py (0 additions, 0 deletions)
- lee_transformers/models/common.py (0 additions, 0 deletions)
- lee_transformers/models/lpe_transformer.py (0 additions, 0 deletions)
- lee_transformers/models/rpe_transformer.py (0 additions, 0 deletions)