# MAT

The official implementation of the Molecule Attention Transformer (see the arXiv paper).

<p align='center'> <img src="https://github.com/gmum/MAT/blob/master/assets/MAT.png" alt="architecture" width="600"/> </p>

## Code

More functionality will be available soon!

## Pretrained weights

Pretrained weights are available here.
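Once downloaded, a PyTorch checkpoint is typically restored with `torch.load` and `load_state_dict`. The sketch below shows that generic pattern only; the model class and file name are placeholders, not this repository's actual API.

```python
# Sketch: restoring pretrained weights with PyTorch.
# TinyModel and the checkpoint path are placeholders, NOT the
# actual model class or weight file from this repository.
import os
import tempfile

import torch
import torch.nn as nn


class TinyModel(nn.Module):  # placeholder stand-in for the real model
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)


# Simulate a downloaded checkpoint by saving a state dict to disk.
model = TinyModel()
path = os.path.join(tempfile.gettempdir(), "demo_weights.pt")
torch.save(model.state_dict(), path)

# Restore: build the model, then load the saved parameters into it.
restored = TinyModel()
restored.load_state_dict(torch.load(path, map_location="cpu"))
restored.eval()  # switch to inference mode before evaluation
```

`map_location="cpu"` lets a checkpoint saved on GPU load on a CPU-only machine.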

## Results

Below we report the average rank of each method across the 7 datasets in our benchmark.
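To make the metric concrete: on each dataset the methods are ranked by score (1 = best), and each method's ranks are then averaged across datasets. The scores below are made-up illustrative values, not results from the paper.

```python
# Average rank: rank methods per dataset (1 = best score), then
# average each method's rank across all datasets.
from statistics import mean

# Illustrative scores per method per dataset (higher = better);
# these numbers are invented for the example.
scores = {
    "dataset_1": {"MAT": 0.91, "baseline_a": 0.88, "baseline_b": 0.85},
    "dataset_2": {"MAT": 0.74, "baseline_a": 0.77, "baseline_b": 0.70},
    "dataset_3": {"MAT": 0.83, "baseline_a": 0.80, "baseline_b": 0.82},
}

ranks = {method: [] for method in next(iter(scores.values()))}
for per_method in scores.values():
    # Sort methods by score, best first; position in the order is the rank.
    ordered = sorted(per_method, key=per_method.get, reverse=True)
    for position, method in enumerate(ordered, start=1):
        ranks[method].append(position)

avg_rank = {method: mean(r) for method, r in ranks.items()}
# With the toy scores above, MAT's ranks are [1, 2, 1], averaging 4/3.
```

A lower average rank means a method is consistently closer to the top across datasets.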

## Requirements

## Acknowledgments

The Transformer implementation is inspired by The Annotated Transformer.