Popular repositories
- DeepSpeed (Python; forked from microsoft/DeepSpeed): A deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- nanoGPT (Python; forked from karpathy/nanoGPT): The simplest, fastest repository for training/finetuning medium-sized GPTs.
- stanford_alpaca (Python; forked from tatsu-lab/stanford_alpaca): Code and documentation to train Stanford's Alpaca models and generate the data.
- alpaca-lora (Jupyter Notebook; forked from tloen/alpaca-lora): Instruct-tune LLaMA on consumer hardware.
- Chinese-LLaMA-Alpaca-2 (Python; forked from ymcui/Chinese-LLaMA-Alpaca-2): Phase 2 of the Chinese LLaMA-2 & Alpaca-2 large language model project, including 16K long-context models.
- Chinese-LLaMA-Alpaca (Python; forked from ymcui/Chinese-LLaMA-Alpaca): Chinese LLaMA & Alpaca large language models with local CPU/GPU training and deployment.