keras-self-attention 0.51.0
Attention mechanism for processing sequential data that considers the context for each timestamp
Stars: 655, Watchers: 655, Forks: 120, Open Issues: 3
The CyberZHG/keras-self-attention repository was created 6 years ago, and the last code push was 2 years ago.
How to Install keras-self-attention
You can install keras-self-attention using pip:
pip install keras-self-attention
or add it to a project with Poetry:
poetry add keras-self-attention
Package Details
- Author: CyberZHG
- License: MIT
- Homepage: https://github.com/CyberZHG/keras-self-attention
- PyPI: https://pypi.org/project/keras-self-attention/
- GitHub Repo: https://github.com/CyberZHG/keras-self-attention
Errors
A list of common keras-self-attention errors.
Code Examples
Here are some keras-self-attention code examples and snippets.
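The sketch below is a minimal usage example following the pattern shown in the project README: a Bidirectional LSTM encoder followed by a SeqSelfAttention layer. The vocabulary size, embedding dimension, LSTM width, and number of output classes are placeholder values, and depending on the installed versions you may need tensorflow.keras (for example via the TF_KERAS=1 environment variable) rather than standalone Keras.

```python
import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential()
# Embed integer token ids; mask_zero=True lets downstream layers ignore padding.
model.add(keras.layers.Embedding(input_dim=10000,   # placeholder vocabulary size
                                 output_dim=300,    # placeholder embedding size
                                 mask_zero=True))
# Return the full sequence so the attention layer can weigh every timestep.
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=128,
                                                       return_sequences=True)))
# Self-attention over the sequence; output keeps the (batch, timesteps, features) shape.
model.add(SeqSelfAttention(attention_activation='sigmoid'))
# Per-timestep predictions (e.g. sequence tagging with 5 placeholder classes).
model.add(keras.layers.Dense(units=5))
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])
model.summary()
```

Because the attention layer preserves the (batch, timesteps, features) shape, it can be followed by per-timestep layers as above, or pooled for sequence-level classification; see the repository README for the full set of options supported by your version.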