# LLaMA-List
## LLaMA

### OpenSource
- 2023-llama-dl : High-speed download of LLaMA, Facebook’s 65B parameter GPT model.
- 2023-LlamaIndex : LlamaIndex (GPT Index) provides a central interface to connect your LLMs with external data (a minimal usage sketch appears after this list).
- 2023-dalai : The simplest way to run LLaMA on your local machine.
- 2023-Alpaca.cpp : Run a fast ChatGPT-like model locally on your device; the project’s demo screencast, not sped up, runs on an M2 MacBook Air with 4GB of weights.
- 2023-Alpaca-LoRA : Instruct-tuning LLaMA on consumer hardware (see the LoRA sketch after this list).
- 2023-llama-rs : LLaMA-rs is a Rust port of the llama.cpp project. It runs inference for Facebook’s LLaMA model on a CPU with good performance, using full-precision, f16, or 4-bit quantized versions of the model.
- 2023-Serge : A chat interface based on llama.cpp for running Alpaca models. Entirely self-hosted, no API keys needed. Fits in 4GB of RAM and runs on the CPU.
- 2023-llama.cpp : The main goal is to run the LLaMA model using 4-bit quantization on a MacBook (see the quantized-inference sketch after this list).
- 2023-gpt4all : Demo, data, and code to train an assistant-style large language model with ~800k GPT-3.5-Turbo generations based on LLaMA.
- 2023-open_llama : A permissively licensed open-source reproduction of Meta AI’s LLaMA large language model. This release provides a public preview of the 7B OpenLLaMA model trained on 200 billion tokens, along with PyTorch and JAX weights of the pre-trained models, evaluation results, and a comparison against the original LLaMA models.
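
The LlamaIndex entry above is about connecting an LLM to external data. Below is a minimal sketch of that pattern, assuming the early-2023 `llama_index` (GPT Index) API with `GPTSimpleVectorIndex` and a local `data/` folder of documents; class names have changed in later releases, so treat this as illustrative rather than the project's current API.

```python
# Minimal LlamaIndex (GPT Index) sketch: index local documents, then query them.
# Assumes the early-2023 llama_index API (GPTSimpleVectorIndex) and an OpenAI
# API key in the environment; newer releases renamed these classes.
from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader

# Load every file in ./data as a document.
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index over the documents.
index = GPTSimpleVectorIndex.from_documents(documents)

# Ask a question; the index retrieves relevant chunks and passes them to the LLM.
response = index.query("What does this project do?")
print(response)
```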
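
The Alpaca-LoRA entry refers to instruct-tuning LLaMA on consumer hardware via low-rank adapters. The sketch below shows the general LoRA setup with Hugging Face `transformers` and `peft`; the checkpoint name, target modules, and hyperparameters are illustrative assumptions, not the repo’s exact configuration.

```python
# LoRA setup sketch for instruct-tuning a LLaMA checkpoint, in the spirit of
# Alpaca-LoRA. Model id, rank, and target modules are illustrative assumptions.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import LoraConfig, get_peft_model

base_model = "decapoda-research/llama-7b-hf"  # assumed checkpoint name
tokenizer = LlamaTokenizer.from_pretrained(base_model)
model = LlamaForCausalLM.from_pretrained(
    base_model, torch_dtype=torch.float16, device_map="auto"
)

# Wrap the frozen base model with trainable low-rank adapters on the attention
# projections; only these small matrices are updated during tuning.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights

# From here, train with a standard transformers.Trainer on instruction data
# (e.g. the Alpaca JSON); the small adapter footprint is what makes
# consumer-GPU tuning feasible.
```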
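
llama.cpp (and Alpaca.cpp above) run 4-bit quantized models on the CPU. One minimal way to exercise such a model from Python is the `llama-cpp-python` bindings, sketched below; the model path is a placeholder and assumes the weights have already been converted and quantized with llama.cpp’s own tooling.

```python
# Sketch of CPU inference over a 4-bit quantized LLaMA model using the
# llama-cpp-python bindings to llama.cpp. The model path is a placeholder;
# the quantized file must already exist (produced by llama.cpp's convert and
# quantize tools).
from llama_cpp import Llama

llm = Llama(model_path="./models/7B/ggml-model-q4_0.bin", n_ctx=512)

# Plain completion call; the bindings return an OpenAI-style response dict.
output = llm("Q: Name three uses of a 4-bit quantized LLM. A:", max_tokens=128)
print(output["choices"][0]["text"])
```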
### Chinese
- 2023-Chinese-LLaMA-Alpaca : To promote open research on large models in the Chinese NLP community, this project open-sources a Chinese LLaMA model and an instruction-tuned Chinese Alpaca model. These models extend the original LLaMA vocabulary with Chinese tokens and are further pre-trained on Chinese data, improving basic Chinese semantic understanding. Building on the Chinese LLaMA model, the project then performs instruction fine-tuning with Chinese instruction data, significantly improving the model’s ability to understand and follow instructions.
- 2023-Baize : Baize is an open-source chat model trained with LoRA. It uses 100k dialogs generated by letting ChatGPT chat with itself, plus Alpaca’s data to improve performance. 7B, 13B, and 30B models have been released; see the paper for details.
## Alpaca
- 2023-deep-diver/Alpaca-LoRA-Serve : Demonstrates Alpaca-LoRA as a chatbot service built with Alpaca-LoRA and Gradio (see the Gradio sketch after this list).
- 2023-LianjiaTech/BELLE : This project is based on Stanford Alpaca, whose goal is to build and open-source a LLaMA-based model. Stanford Alpaca’s seed tasks and collected data are all in English, so the resulting model is not optimized for Chinese.
- 2023-Chinese-alpaca-lora : CamelBell (驼铃), which tunes Chinese data on the Chinese-based model GLM, is now a separate repo; the original Luotuo may also be moved to a new repo.
- 2023-Dolly : Fine-tunes the GPT-J 6B model on the Alpaca dataset using a Databricks notebook. Note that while GPT-J 6B is Apache 2.0 licensed, the Alpaca dataset is licensed under Creative Commons NonCommercial (CC BY-NC 4.0). A fine-tuning sketch appears after this list.
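
The Alpaca-LoRA-Serve entry above wraps an Alpaca-LoRA model in a Gradio UI. The sketch below shows the general shape of such a service with a stubbed generation function; the real repo wires Gradio to the actual model and adds batching and streaming features not shown here.

```python
# Minimal Gradio chat-style service sketch in the spirit of Alpaca-LoRA-Serve.
# generate() is a stub standing in for real Alpaca-LoRA inference.
import gradio as gr

def generate(instruction: str) -> str:
    # In the real service this would tokenize the instruction, run the
    # LoRA-adapted LLaMA model, and decode the response.
    return f"(model response to: {instruction})"

demo = gr.Interface(
    fn=generate,
    inputs=gr.Textbox(lines=4, label="Instruction"),
    outputs=gr.Textbox(label="Response"),
    title="Alpaca-LoRA chatbot (sketch)",
)

if __name__ == "__main__":
    demo.launch()  # serves a local web UI
```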
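
The Dolly entry describes fine-tuning GPT-J 6B on the Alpaca dataset. A compressed sketch of that kind of causal-LM fine-tune with `transformers` is shown below; the dataset id, prompt template, and training arguments are assumptions for illustration, and a real run needs the multi-GPU/DeepSpeed setup used in the Databricks notebook.

```python
# Sketch: fine-tune GPT-J 6B on Alpaca-style instruction data with the
# Hugging Face Trainer. Dataset id, prompt template, and hyperparameters are
# illustrative assumptions; the real Dolly recipe runs in a Databricks
# notebook with DeepSpeed.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-j-6B"  # check the exact id on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Alpaca-style records have instruction / input / output fields.
dataset = load_dataset("tatsu-lab/alpaca", split="train")  # assumed dataset id

def to_prompt(example):
    text = (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Input:\n{example.get('input', '')}\n\n"
        f"### Response:\n{example['output']}{tokenizer.eos_token}"
    )
    return tokenizer(text, truncation=True, max_length=512)

tokenized = dataset.map(to_prompt, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gptj-alpaca",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        fp16=True,
        logging_steps=50,
    ),
    train_dataset=tokenized,
    # Causal-LM collator copies input_ids into labels for next-token loss.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```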