Stars
Make your models invariant to changes in scale.
The Clay Foundation Model (in development)
Implementation of I-JEPA from "Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture"
The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
📋 A list of open LLMs available for commercial use.
Massively parallel rigid-body physics simulation on accelerator hardware.
DSPy: The framework for programming—not prompting—foundation models
SGLang is a fast serving framework for large language models and vision language models.
A generative AI extension for JupyterLab
Simple, minimal implementation of the Mamba SSM in one file of PyTorch.
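The recurrence at the heart of an SSM is small enough to sketch. This toy NumPy snippet (illustrative sizes and values, not the repo's code) shows only the fixed linear scan; Mamba's contribution is making the discretized parameters input-dependent ("selective") and computing the scan with a hardware-efficient kernel.

```python
import numpy as np

# Toy diagonal state-space model (SSM) recurrence -- the linear scan Mamba
# builds on. Sizes and parameter values here are illustrative only.
N = 4                     # state dimension
A_bar = np.full(N, 0.9)   # discretized state decay, one value per channel
B_bar = np.full(N, 0.1)   # discretized input projection
C = np.ones(N)            # output projection

def ssm_scan(x):
    """h_t = A_bar * h_{t-1} + B_bar * x_t ;  y_t = C . h_t"""
    h = np.zeros(N)
    ys = []
    for x_t in x:
        h = A_bar * h + B_bar * x_t
        ys.append(C @ h)
    return np.array(ys)
```

The single-file repo above is the place to see the full selective version.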
A Gradio web UI for Large Language Models.
A programming framework for agentic AI 🤖
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
[ICCV'23] Official repository of paper SwiftFormer: Efficient Additive Attention for Transformer-based Real-time Mobile Vision Applications
[ICLR 2022] Official implementation of cosformer-attention in cosFormer: Rethinking Softmax in Attention
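cosFormer's replacement for softmax is easy to state directly: non-negative features φ(·) = ReLU(·) plus a position-based reweighting cos(π(i−j)/2M). Below is the quadratic-time form as a sketch with made-up sizes; the paper's point is that the cosine factors via cos(a−b) = cos a cos b + sin a sin b, letting the Q-side and K-side terms accumulate separately for linear-time attention.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 8                                  # sequence length, head dim (illustrative)
Q, K, V = (rng.normal(size=(T, d)) for _ in range(3))

phi = lambda x: np.maximum(x, 0.0)           # ReLU feature map: scores stay non-negative
idx = np.arange(T)
# cos(pi * (i - j) / (2M)) with M = T: 1 on the diagonal, decaying with distance
reweight = np.cos(np.pi * (idx[:, None] - idx[None, :]) / (2 * T))

S = (phi(Q) @ phi(K).T) * reweight           # non-negative similarity, no softmax
attn = S / (S.sum(axis=1, keepdims=True) + 1e-9)
out = attn @ V
```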
Reformer, the efficient Transformer, in PyTorch
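Reformer's efficiency comes from LSH attention: queries and keys are bucketed by an angular locality-sensitive hash built from random projections, and each position attends only within its bucket. The hashing step alone can be sketched in NumPy (bucket count and dimensions below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_buckets = 8, 4                        # illustrative sizes
R = rng.normal(size=(d, n_buckets // 2))   # shared random projection

def lsh_bucket(x):
    # Angular LSH: project, then argmax over [xR; -xR].
    # Vectors pointing in similar directions land in the same bucket.
    p = x @ R
    return int(np.argmax(np.concatenate([p, -p])))

v = rng.normal(size=d)
```

Because the hash depends only on direction, rescaling a vector never changes its bucket, while an opposite-pointing vector always lands elsewhere.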
Create customized software from a natural-language idea (through LLM-powered multi-agent collaboration)
Democratization of RT-2, from "RT-2: New model translates vision and language into action"
Objectron is a dataset of short, object-centric video clips. The videos also contain AR session metadata, including camera poses, sparse point clouds, and planes. In each video, the came…
Official codebase for I-JEPA, the Image-based Joint-Embedding Predictive Architecture. First outlined in the CVPR paper "Self-supervised learning from images with a joint-embedding predictive arch…
A curated list of practical guides and resources for LLMs (LLMs tree, examples, papers)
PyTorch code and models for the DINOv2 self-supervised learning method.
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
A simple text generator using OpenAI's GPT-2, implemented in PyTorch
Code for the paper "Language Models are Unsupervised Multitask Learners"