Stars
Stable Diffusion web UI
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
A collection of design patterns/idioms in Python
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference,…
Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in PyTorch
Implementation of the Denoising Diffusion Probabilistic Model in PyTorch
A concise but complete full-attention transformer with a set of promising experimental features from various papers
Python Sorted Container Types: Sorted List, Sorted Dict, and Sorted Set
⚡LLM Zoo is a project that provides data, models, and evaluation benchmarks for large language models.⚡
A Comprehensive Benchmark to Evaluate LLMs as Agents (ICLR'24)
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
This may be the simplest implementation of DDPM. You can run Main.py directly to train the UNet on the CIFAR-10 dataset and watch the denoising process in action.
Prefix-Tuning: Optimizing Continuous Prompts for Generation
[ICLR'23] DiffuSeq: Sequence to Sequence Text Generation with Diffusion Models
Optimus: the first large-scale pre-trained VAE language model
This script saves you time in Hearthstone's Mercenaries mode