Alpaca-Turbo
Alpaca-Turbo is a frontend for running large language models locally with minimal setup. It is a user-friendly web UI for llama.cpp, with unique features that set it apart from other implementations. The goal is to provide a seamless chat experience that is easy to configure and use, without sacrificing speed or functionality.
📝 Example views
demo.mp4
📦 Installation Steps
📺 Video Instructions
- ToDo
🐳 Using Docker (Linux only)
- ToDo
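This section is still a ToDo; as a rough sketch only (the image name, port, and mount path below are illustrative assumptions, not a published image or documented configuration), a containerized run on Linux could look like:

```shell
# Build an image from the repository root (assumes a Dockerfile exists there)
docker build -t alpaca-turbo .

# Run the web UI, bind-mounting a local models directory into the container
# (container port and mount path are assumptions, not confirmed by this README)
docker run -it --rm \
  -p 5000:5000 \
  -v "$(pwd)/models:/app/models" \
  alpaca-turbo
```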
🪟 Using Windows (standalone or miniconda) AND Mac M1/M2 (using miniconda)
For Windows users we have a one-click standalone launcher - Alpaca-Turbo.exe.
- Links for installing miniconda:
- Download the latest alpaca-turbo.zip from the release page.
- Extract alpaca-turbo.zip to a directory named Alpaca-Turbo. Make sure you have enough space for the models in the extracted location.
- Copy your alpaca models to the alpaca-turbo/models/ directory.
- Open cmd as Admin and type:

  ```
  conda init
  ```

- Close that window.
- Open a new cmd window in your Alpaca-Turbo directory and type:

  ```
  conda create -n alpaca_turbo python=3.10 -y
  conda activate alpaca_turbo
  pip install -r requirements.txt
  python app.py
  ```

- Ready to interact.
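The conda-based steps above can be sketched end-to-end as a single shell session (on Windows, run cmd as Administrator for the initial `conda init`; the environment name comes from the steps above):

```shell
# One-time conda shell setup, then reopen the terminal
conda init

# From the Alpaca-Turbo directory: create and activate an isolated environment
conda create -n alpaca_turbo python=3.10 -y
conda activate alpaca_turbo

# Install dependencies and launch the web UI
pip install -r requirements.txt
python app.py
```

Using a dedicated conda environment keeps Alpaca-Turbo's pinned dependencies from conflicting with other Python projects on the same machine.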
Directly installing with Pip
Just download the latest release, unzip it, and then run:

```
pip install -r requirements.txt
python app.py
```
💁 Contributing
As an open-source project in a rapidly developing field, I am open to contributions, whether in the form of a new feature, improved infrastructure, or better documentation.
For detailed information on how to contribute, see the contributing guidelines.
🙌 Credits
- ggerganov/llama.cpp for their amazing C++ library
- antimatter15/alpaca.cpp for initial versions of their chat app
- cocktailpeanut/dalai for the inspiration
- MetaAI for the LLaMA models
- Stanford for the Alpaca models