• Stars: 2,043
• Rank: 22,647 (Top 0.5%)
• Language: Jupyter Notebook
• License: MIT License
• Created: over 1 year ago
• Updated: 11 months ago


Repository Details

Simple UI for LLM Model Finetuning
Hugging Face Spaces metadata:

title: Simple LLM Finetuner
emoji: 🦙
colorFrom: yellow
colorTo: orange
sdk: gradio
app_file: app.py
pinned: false

🦙 Simple LLM Finetuner

Open In Colab Open In Spaces

Simple LLM Finetuner is a beginner-friendly interface designed to facilitate fine-tuning various language models using the LoRA method via the PEFT library on commodity NVIDIA GPUs. With a small dataset and a sample length of 256, you can even run this on a regular Colab Tesla T4 instance.

With this intuitive UI, you can easily manage your dataset, customize parameters, train, and evaluate the model's inference capabilities.
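
Under the hood, LoRA via PEFT freezes the base model and trains small low-rank adapter matrices on top of it. The sketch below shows that setup in Python; the base model name, rank, and target modules are assumptions for illustration, not the app's exact defaults:

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "decapoda-research/llama-7b-hf"  # assumed base model; any causal LM works
model = AutoModelForCausalLM.from_pretrained(base, load_in_8bit=True, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(base)

config = LoraConfig(
    r=8,                                  # low-rank dimension (assumed value)
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # attention projections in LLaMA-style models
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()        # only the small adapter weights are trainable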

Features

  • Simply paste datasets in the UI, separated by double blank lines (see the example after this list)
  • Adjustable parameters for fine-tuning and inference
  • Beginner-friendly UI with explanations for each parameter
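
For example, a pasted dataset is just plain text with two blank lines between samples; the question/answer wording below is a made-up placeholder, not a required format:

Question: What is the capital of France?
Answer: The capital of France is Paris.


Question: What is 2 + 2?
Answer: 2 + 2 equals 4.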

Getting Started

Prerequisites

  • Linux or WSL
  • Modern NVIDIA GPU with >= 16 GB of VRAM (but it might be possible to run with less for smaller sample lengths)

Usage

I recommend using a virtual environment to install the required packages; Conda is preferred.

conda create -n simple-llm-finetuner python=3.10
conda activate simple-llm-finetuner
conda install -y cuda -c nvidia/label/cuda-11.7.0
conda install -y pytorch=2 pytorch-cuda=11.7 -c pytorch

On WSL, you might need to install CUDA manually by following these steps, then running the following before you launch:

export LD_LIBRARY_PATH=/usr/lib/wsl/lib

Clone the repository and install the required packages.

git clone https://github.com/lxe/simple-llm-finetuner.git
cd simple-llm-finetuner
pip install -r requirements.txt

Launch it

python app.py

Open http://127.0.0.1:7860/ in your browser. Prepare your training data by separating each sample with 2 blank lines. Paste the whole training dataset into the textbox. Specify the new LoRA adapter name in the "New PEFT Adapter Name" textbox, then click Train. You might need to adjust the max sequence length and batch size to fit your GPU memory. The trained adapter will be saved in the lora/ directory.

After training is done, navigate to the "Inference" tab, select your LoRA adapter, and play with it.
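
If you later want to use the trained adapter outside the UI, a minimal sketch with PEFT might look like this; the base model name and adapter path are assumptions (use whichever base model you fine-tuned and the name you gave the adapter):

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = "decapoda-research/llama-7b-hf"   # assumed: the base model you fine-tuned
adapter = "lora/my-adapter"              # assumed: adapter saved by the UI under lora/

model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
model = PeftModel.from_pretrained(model, adapter)   # attach the LoRA weights
tokenizer = AutoTokenizer.from_pretrained(base)

inputs = tokenizer("Question: What is the capital of France?\nAnswer:", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))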

Have fun!

YouTube Walkthrough

https://www.youtube.com/watch?v=yM1wanDkNz8

License

MIT License

More Repositories

1. llavavision - A simple "Be My Eyes" web app with a llama.cpp/llava backend (JavaScript, 477 stars)
2. require-navigator - A Google Chrome extension for navigating Node.js `require()`s in Github (JavaScript, 118 stars)
3. cerebras-lora-alpaca - LoRA weights for Cerebras-GPT-2.7b finetuned on Alpaca dataset with shorter prompt (Jupyter Notebook, 62 stars)
4. llama-tune - LLaMa Tuning with Stanford Alpaca Dataset using Deepspeed and Transformers (Python, 52 stars)
5. no-bugs - Comprehensive JavaScript test framework for perfect code (JavaScript, 39 stars)
6. llm-companion - Mobile web app for audio "push-to-talk" + TTS chat interface with OpenAI-like APIs (JavaScript, 34 stars)
7. tts-server - A simple TTS server for generating speech using StyleTTS2 (Python, 22 stars)
8. supermoon - Caching "npm install" replacement (JavaScript, 13 stars)
9. array-to-object-with-property-names-that-map-to-array-values-and-property-values-that-are-objects-wi - Convert an array to an object whose property names map to the array items, and whose values are objects whose properties in turn you can specify, and those property values are also mapped to the current array item (JavaScript, 13 stars)
10. stream-replace - Replace text in a stream (JavaScript, 11 stars)
11. reglite - A web-scale enterprise private npm registry (JavaScript, 9 stars)
12. nodemagic - Install and use npm and node versions from package.json/engines (Shell, 9 stars)
13. stackflow - Generate a software stack that markets itself! (JavaScript, 7 stars)
14. yet-another-session-module - We need another session module! Onwards toward 10,000 session modules on npm! (JavaScript, 7 stars)
15. whatsup - Continuously log what callbacks a node process is waiting for (JavaScript, 6 stars)
16. io.coffee - Evented IO for V8 CoffeeScript (CoffeeScript, 5 stars)
17. node-objhash - Retrieve a node.js object's v8 identity hash (JavaScript, 3 stars)
18. tagcheck - Check whether your git+ssh:// dependencies are up to date (JavaScript, 3 stars)
19. uber-api-express-sample - Simple and basic web app skeleton utilizing node.js, express, and the Uber API (JavaScript, 3 stars)
20. clickbin - Clickb.in application (CSS, 3 stars)
21. unstream - Buffer, transform, and re-stream (JavaScript, 3 stars)
22. tunnelify - Spawns whatever `ssh` client/command is available on your system and opens a tunnel to (from?) a remote host and port(s) (JavaScript, 3 stars)
23. simple-llm-chatbot - Playground LLM chatbot app using stable-lm (but you can use any model) (Python, 3 stars)
24. onehundred - Extremely optimistic code coverage tool (JavaScript, 3 stars)
25. onerror-remote-origin - Demonstrates the difference in window.onerror behavior across local and remote scripts (HTML, 2 stars)
26. photoservo (JavaScript, 2 stars)
27. tailwind-demo - Created with CodeSandbox (HTML, 1 star)
28. lxe.github.io (HTML, 1 star)
29. qr - Hastily-made client-only browser-based QR code scanner (JavaScript, 1 star)
30. usb (C, 1 star)
31. shattered - Demonstration of SHA1 collision outlined in http://shattered.io/static/shattered.pdf (JavaScript, 1 star)
32. cprf - Asynchronously, recursively copy directories and files (JavaScript, 1 star)
33. fusion-redux-example (JavaScript, 1 star)
34. fusionjs-hello-world - Created with CodeSandbox (JavaScript, 1 star)
35. fusion-typescript-example - Using https://fusionjs.com/ and https://baseweb.design/ with TypeScript (TypeScript, 1 star)