Stars
[CVPR 2024] PriViLege: Pre-trained Vision and Language Transformers Are Few-Shot Incremental Learners
Generative Multi-modal Models are Good Class Incremental Learners, CVPR 2024 [PyTorch Code]
[WSDM'2023] "HGCL: Heterogeneous Graph Contrastive Learning for Recommendation"
Causal depthwise conv1d in CUDA, with a PyTorch interface
Code repo for the paper "BiT: Robustly Binarized Multi-distilled Transformer"
An introductory LLM tutorial for developers: the Chinese edition of Andrew Ng's large language model course series
[ICML 2024] Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model
ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions. In ECCV 2020.
Proper implementation of ResNets for CIFAR-10/100 in PyTorch that matches the description in the original paper.
[CVPR 2020] PyTorch implementation of our CVPR 2020 paper "Forward and Backward Information Retention for Accurate Binary Neural Networks".
Official implementation of Rectified Straight Through Estimator (ReSTE).
A structured summary of the knowledge an NLP engineer needs to accumulate, including interview questions, fundamentals, and engineering skills, to strengthen core competitiveness
two-tiger / Interactive-continual-Learning-Fast-and-Slow-Thinking
Forked from Biqing-Qi/Interactive-continual-Learning-Fast-and-Slow-Thinking. Code for the CVPR 2024 paper "Interactive Continual Learning: Fast and Slow Thinking"
Accepted to CVPR 2024, "Interactive Continual Learning: Fast and Slow Thinking"
PyTorch implementation of Simplified Structured State-Spaces for Sequence Modeling (S5)
Structured state space sequence models
A simple implementation of [Mamba: Linear-Time Sequence Modeling with Selective State Spaces](https://arxiv.org/abs/2312.00752)
[CVPR 2023] Regularizing Second-Order Influences for Continual Learning
[NeurIPS 2023] Bilevel Coreset Selection in Continual Learning: A New Formulation and Algorithm
A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning. arXiv:2307.09218.
An Incremental Learning, Continual Learning, and Life-Long Learning Repository
✨✨Latest Advances on Multimodal Large Language Models