Probabilistic Self Attention with PageRank
[MIR-2023-Survey] A continuously updated paper list for multi-modal pre-trained big models
Speech Transformer Exploratory Visual Environment
SA-DETR: Saliency Attention-based DETR for Salient Object Detection
This repository provides a basic implementation of self-attention, demonstrating how attention mechanisms work in predicting the next word in a sequence. It captures the core concept of attention but lacks the complexity of more advanced models such as Transformers.
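The core mechanism described above can be sketched in a few lines of NumPy: scaled dot-product self-attention with a causal mask, so each position attends only to itself and earlier tokens, as needed for next-word prediction. The weight matrices, dimensions, and random inputs here are illustrative placeholders, not taken from any of the repositories listed.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv, causal=True):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Similarity of every query to every key, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    if causal:
        # Mask future positions (upper triangle) so attention is left-to-right.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Toy example with random projections (illustrative only).
rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(5, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, w = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8): one context vector per input position
```

In a full Transformer this is wrapped with multiple heads, output projections, residual connections, and layer normalization; the sketch above is only the attention step itself.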
Developed a music generation deep learning model using WGAN-GP and self-attention, aimed at creating melodic compositions.
Code for paper: DGR-MIL: Exploring Diverse Global Representation in Multiple Instance Learning for Whole Slide Image Classification [ECCV 2024]
Annotated Notebooks to dive into Self-Attention, In-Context Learning, RAG, Knowledge-Graphs, Fine-Tuning, Model Optimization, and many more.
Predict materials properties using only the composition information!
Official PyTorch Implementation of MambaVision: A Hybrid Mamba-Transformer Vision Backbone
Lee Hung-yi's Deep Learning Tutorial (《李宏毅深度学习教程》, recommended by Prof. Hung-yi Lee 👍, known as the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Semantic segmentation is an important task in computer vision, and its applications have grown in popularity over the last decade. This repository groups publications that use various forms of segmentation; notably, every paper included is built on a transformer.
Implementation of self attention in transformer architecture with the help of numpy
The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516
Transformer based chatbot based on "Attention is all you need"
The official repo for the paper "HyperSIGMA: Hyperspectral Intelligence Comprehension Foundation Model"
Self-Attentive Contrastive Learning for Conditioned Periocular and Face Biometrics
Re-Implementation of "A Structured Self-Attentive Sentence Embedding" by Lin et al., 2017
PyTorch implementation of U-TAE and PaPs for satellite image time series panoptic segmentation.