
self-attention

Here are 298 public repositories matching this topic...

This repository provides a basic implementation of self-attention, demonstrating how an attention mechanism helps predict the next word in a sequence. It illustrates the core concept of attention but lacks the complexity of more advanced models such as full Transformers.

  • Updated Sep 23, 2024
  • Python
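The core mechanism such a repository demonstrates can be sketched in a few lines: each position compares itself against every earlier position, turns the similarities into weights, and mixes the sequence accordingly. This is a minimal single-head sketch with no learned projections (real implementations add query/key/value matrices); the function name and shapes are illustrative, not taken from the repository.

```python
import numpy as np

def self_attention(X):
    """Causal self-attention over X of shape (seq_len, d).

    Illustrative sketch: uses the raw embeddings as queries, keys,
    and values instead of learned projections.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)            # pairwise similarity, scaled
    # Causal mask: each position may attend only to itself and earlier tokens,
    # which is what next-word prediction requires.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    # Softmax over each row (numerically stabilized).
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ X, w                          # context vectors, attention weights

out, w = self_attention(np.random.default_rng(0).normal(size=(4, 8)))
print(out.shape)  # (4, 8): one context-aware vector per position
```

A next-word predictor would feed the final row of `out` into a classifier over the vocabulary; the masked weights guarantee it never peeks at future tokens.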

Semantic segmentation is an important task in computer vision, and its applications have grown in popularity over the last decade. This repository groups publications that use various forms of segmentation; in particular, every paper is built on a Transformer.

  • Updated Sep 6, 2024
SAITS

The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast, state-of-the-art (SOTA) deep-learning model for efficient time-series imputation: it fills in multivariate, incomplete time series containing NaN missing values. https://arxiv.org/abs/2202.08516

  • Updated Sep 2, 2024
  • Python
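The idea behind attention-based imputation can be illustrated without the full model: mask out the NaNs, let self-attention over the time steps produce a reconstruction, and keep the observed values while filling the gaps from the reconstruction. This is only a conceptual sketch under simplified assumptions (no learned weights, no SAITS-specific blocks or training losses); the function name is hypothetical and not part of the SAITS codebase.

```python
import numpy as np

def attention_impute(x):
    """Fill NaNs in a (seq_len, features) series via self-attention over time.

    Illustrative only: SAITS uses trained Transformer layers and joint
    reconstruction/imputation losses; here attention is computed directly
    on the zero-filled observations.
    """
    observed = ~np.isnan(x)
    filled = np.where(observed, x, 0.0)      # zero out missing entries
    d = x.shape[-1]
    scores = filled @ filled.T / np.sqrt(d)  # similarity between time steps
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)       # softmax over time steps
    recon = w @ filled                       # attention-weighted estimates
    # Keep observed values untouched; use the reconstruction only for gaps.
    return np.where(observed, x, recon)

x = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [4.0, np.nan]])
y = attention_impute(x)
print(np.isnan(y).any())  # False: all gaps are filled
```

The key property, shared with SAITS, is that each missing entry is estimated from the time steps most similar to its own, rather than by simple forward-fill or column means.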
HyperSIGMA
