The GitHub repository for the paper "Informer", accepted at AAAI 2021.
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
The implementation of DeBERTa
CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019).
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
A list of efficient attention modules
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Text classification using deep learning models in PyTorch
(ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
A PyTorch implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation" (a minimal sketch of the core attention computation follows this list)
A Structured Self-attentive Sentence Embedding
Official PyTorch implementation of Fully Attentional Networks
Implementing Stand-Alone Self-Attention in Vision Models using Pytorch
[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
DSMIL: Dual-stream multiple instance learning networks for tumor detection in Whole Slide Images
[NeurIPS'22] Tokenized Graph Transformer (TokenGT), in PyTorch
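The repositories above build on the same core mechanism: scaled dot-product self-attention, as introduced in "Attention is All You Need". Below is a minimal, illustrative PyTorch sketch of that computation; the function and weight names are hypothetical and do not come from any listed repository, and it omits multi-head projection, masking, and dropout.

```python
import math
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """Minimal single-head scaled dot-product self-attention (illustrative only).

    x:             (batch, seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_k) projection matrices (hypothetical names)
    """
    q = x @ w_q  # queries (batch, seq_len, d_k)
    k = x @ w_k  # keys    (batch, seq_len, d_k)
    v = x @ w_v  # values  (batch, seq_len, d_k)
    # similarity of every position to every other, scaled by sqrt(d_k)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = F.softmax(scores, dim=-1)  # attention distribution per query position
    return weights @ v                   # weighted sum of values

# Usage: one batch of 5 tokens with 16-dim embeddings projected to 8 dims.
x = torch.randn(1, 5, 16)
w_q, w_k, w_v = (torch.randn(16, 8) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)  # shape: (1, 5, 8)
```

Variants in the listed projects differ mainly in how the attention weights are computed or restricted (e.g., criss-cross paths in CCNet, neighborhoods in graph attention, hierarchical windows in FasterViT), not in this basic weighted-sum structure.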