
keras-self-attention 0.51.0


Attention mechanism for processing sequential data that considers the context for each timestamp

Stars: 655, Watchers: 655, Forks: 122, Open Issues: 3

The CyberZHG/keras-self-attention repository was created 5 years ago, and the last code push was 2 years ago. The project is popular, with 655 GitHub stars.

How to Install keras-self-attention

You can install keras-self-attention using pip:

pip install keras-self-attention

or add it to a project with Poetry:

poetry add keras-self-attention
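
A quick sanity check that the install worked is to import the package's main layer (SeqSelfAttention, used in the example further below):

from keras_self_attention import SeqSelfAttention
print(SeqSelfAttention)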

Package Details

Author: CyberZHG
License: MIT
Homepage: https://github.com/CyberZHG/keras-self-attention
PyPI: https://pypi.org/project/keras-self-attention/
GitHub Repo: https://github.com/CyberZHG/keras-self-attention

Classifiers

No classifiers are listed for keras-self-attention just yet.

Errors

A list of common keras-self-attention errors.

Code Examples

Here are some keras-self-attention code examples and snippets.
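
As a starting point, here is a minimal sketch of a sequence model that places the package's SeqSelfAttention layer after a bidirectional LSTM; the vocabulary size, layer widths, and number of output classes are illustrative placeholders:

import keras
from keras_self_attention import SeqSelfAttention

# The attention layer re-weights each timestep using context from the whole
# sequence, so the recurrent layer must return full sequences.
model = keras.models.Sequential()
model.add(keras.layers.Embedding(input_dim=10000, output_dim=128, mask_zero=True))
model.add(keras.layers.Bidirectional(keras.layers.LSTM(units=64, return_sequences=True)))
model.add(SeqSelfAttention(attention_activation='sigmoid'))
model.add(keras.layers.Dense(units=5, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()

If you work with tf.keras rather than standalone Keras, the project's README notes that setting the TF_KERAS=1 environment variable before importing binds the layer to tensorflow.keras.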

Related Packages & Articles

keras-crf 0.3.0

A more elegant and convenient CRF built on tensorflow-addons.

horovod 0.28.1

Horovod is a distributed training framework for Python that lets you train deep learning models across multiple GPUs and servers quickly and efficiently. Built on top of TensorFlow, PyTorch, and other popular deep learning frameworks, Horovod simplifies scaling up model training by handling the complexities of distributed training under the hood.
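
For a sense of how little code that integration takes, here is a minimal sketch using Horovod's tf.keras bindings (the model and hyperparameters are illustrative placeholders):

import tensorflow as tf
import horovod.tensorflow.keras as hvd

# One process per GPU: initialize Horovod, then pin this process to its GPU.
hvd.init()
gpus = tf.config.experimental.list_physical_devices('GPU')
if gpus:
    tf.config.experimental.set_visible_devices(gpus[hvd.local_rank()], 'GPU')

model = tf.keras.Sequential([tf.keras.layers.Dense(10, activation='softmax')])

# Wrap the optimizer so gradients are averaged across all workers; scaling
# the learning rate by the worker count is the usual Horovod convention.
opt = hvd.DistributedOptimizer(tf.keras.optimizers.Adam(0.001 * hvd.size()))
model.compile(optimizer=opt, loss='sparse_categorical_crossentropy')

# Broadcast initial weights from rank 0 so all workers start in sync.
callbacks = [hvd.callbacks.BroadcastGlobalVariablesCallback(0)]

Such a script would then be launched with Horovod's runner, e.g. horovodrun -np 4 python train.py.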