Stars
Lightning ⚡️ fast forecasting with statistical and econometric models.
HKUST Thesis LaTeX3 Template (Also available on Overleaf)
Parallelformers: An Efficient Model Parallelization Toolkit for Deployment
Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models
A library to inspect and extract intermediate layers of PyTorch models.
System design patterns for machine learning
The standard data-centric AI package for data quality and machine learning with messy, real-world data and labels.
The prototype for NSDI paper "NetHint: White-Box Networking for Multi-Tenant Data Centers"
FFCV: Fast Forward Computer Vision (and other ML workloads!)
A PyTorch repo for data loading and utilities to be shared by the PyTorch domain libraries.
Mistral: A strong, northwesterly wind: Framework for transparent and accessible large-scale language model training, built with Hugging Face 🤗 Transformers.
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Train…
DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference
Official PyTorch Implementation of Length-Adaptive Transformer (ACL 2021)
Kats is a kit to analyze time series data: a lightweight, easy-to-use, generalizable, and extendable framework to perform time series analysis, from understanding the key statistics and characteristi…
FedScale is a scalable and extensible open-source federated learning (FL) platform.
Ἀνατομή is a PyTorch library to analyze representations of neural networks
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Feature extraction made simple with torchextractor
Investment Research for Everyone, Everywhere.