# PyTorch Multi-Head Attention

Multi-head attention implemented as a reusable PyTorch module.

## Install

```bash
pip install torch-multi-head-attention
```

## Usage

```python
from torch_multi_head_attention import MultiHeadAttention

MultiHeadAttention(in_features=768, head_num=12)
```
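For context, the computation this layer performs is also available in PyTorch itself as `torch.nn.MultiheadAttention`, which covers the same use case. A minimal sketch (assuming `embed_dim` plays the role of `in_features` and `num_heads` the role of `head_num`):

```python
import torch

# Built-in PyTorch multi-head attention with comparable settings:
# embed_dim=768 split across num_heads=12 heads of size 64 each.
attn = torch.nn.MultiheadAttention(embed_dim=768, num_heads=12, batch_first=True)

x = torch.rand(2, 16, 768)     # (batch, seq_len, embed_dim) with batch_first=True
out, weights = attn(x, x, x)   # self-attention: query = key = value

print(out.shape)               # torch.Size([2, 16, 768])
```

The output keeps the input shape because each head's result is concatenated and projected back to `embed_dim`.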