• Stars: 9,388
  • Rank: 3,806 (Top 0.08%)
  • Language: Python
  • License: Apache License 2.0
  • Created: almost 2 years ago
  • Updated: 4 months ago


Repository Details

ChatRWKV is like ChatGPT but powered by the RWKV (100% RNN) language model, and open source.

ChatRWKV (pronounced as "RwaKuv", from 4 major params: R W K V)

ChatRWKV is like ChatGPT but powered by my RWKV (100% RNN) language model, which is (as of now) the only RNN that can match transformers in quality and scaling, while being faster and saving VRAM. Training sponsored by Stability and EleutherAI :) For the Chinese tutorial, see the bottom of this page.

Raven 14B (finetuned on Alpaca+ShareGPT+...) Demo: https://huggingface.co/spaces/BlinkDL/ChatRWKV-gradio

World 7B (supports 100+ world languages) Demo: https://huggingface.co/spaces/BlinkDL/RWKV-World-7B

Download RWKV-4 weights: https://huggingface.co/BlinkDL (Use RWKV-4 models. DO NOT use RWKV-4a and RWKV-4b models.)

Note: RWKV-4-World is the best model: generation & chat & code in 100+ world languages, with the best English zero-shot & in-context learning ability too.

Use v2/convert_model.py to convert a model for a given strategy, for faster loading and lower CPU RAM usage.

Note: setting RWKV_CUDA_ON to '1' builds a CUDA kernel (much faster, and saves VRAM). Here is how to build it (run "pip install ninja" first):

# How to build on Linux: set these and run v2/chat.py
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH

# How to build on Windows:
# 1. Install the VS2022 build tools (https://aka.ms/vs/17/release/vs_BuildTools.exe, select Desktop C++).
# 2. Reinstall CUDA 11.7 (make sure to install the VC++ extensions).
# 3. Run v2/chat.py from the "x64 Native Tools Command Prompt".

RWKV pip package: https://pypi.org/project/rwkv/ (please always check for the latest version and upgrade)

World demo script: https://github.com/BlinkDL/ChatRWKV/blob/main/API_DEMO_WORLD.py

Raven Q&A demo script: https://github.com/BlinkDL/ChatRWKV/blob/main/v2/benchmark_more.py

[Image: ChatRWKV strategy guide]

RWKV Discord: https://discord.gg/bDSBUMeFpc (let's build together)

Twitter: https://twitter.com/BlinkDL_AI

RWKV LM: https://github.com/BlinkDL/RWKV-LM (explanation, fine-tuning, training, etc.)

RWKV in 150 lines (model, inference, text generation): https://github.com/BlinkDL/ChatRWKV/blob/main/RWKV_in_150_lines.py

Building your own RWKV inference engine: begin with https://github.com/BlinkDL/ChatRWKV/blob/main/src/model_run.py which is easier to understand (used by https://github.com/BlinkDL/ChatRWKV/blob/main/chat.py).

RWKV preprint https://arxiv.org/abs/2305.13048


Cool Community RWKV Projects:

https://github.com/saharNooby/rwkv.cpp fast INT4 / INT8 / FP16 / FP32 CPU inference using ggml

https://github.com/harrisonvanderbyl/rwkv-cpp-cuda fast Windows/Linux GPU inference via CUDA/ROCm/Vulkan (no need for Python or PyTorch)

https://github.com/Blealtan/RWKV-LM-LoRA LoRA fine-tuning

https://github.com/josStorer/RWKV-Runner cool GUI

More RWKV projects: https://github.com/search?o=desc&q=rwkv&s=updated&type=Repositories

ChatRWKV v2: with "stream" and "split" strategies, and INT8 quantization. 3 GB of VRAM is enough to run RWKV 14B :) https://github.com/BlinkDL/ChatRWKV/tree/main/v2

import os
os.environ["RWKV_JIT_ON"] = '1'
os.environ["RWKV_CUDA_ON"] = '0' # if '1' then use the CUDA kernel for seq mode (much faster)

from rwkv.model import RWKV                         # pip install rwkv
model = RWKV(model='/fsx/BlinkDL/HF-MODEL/rwkv-4-pile-1b5/RWKV-4-Pile-1B5-20220903-8040', strategy='cuda fp16')

out, state = model.forward([187, 510, 1563, 310, 247], None)   # token ids from 20B_tokenizer.json
print(out.detach().cpu().numpy())                   # logits for the next token

# the RNN carries its context in `state` (use copy.deepcopy if you want to clone it)
out, state = model.forward([187, 510], None)
out, state = model.forward([1563], state)
out, state = model.forward([310, 247], state)
print(out.detach().cpu().numpy())                   # same result as above
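
The strategy string controls where each layer runs and at what precision. Below is a minimal sketch of a few alternatives; treat the exact strings as illustrative and consult the ChatRWKV strategy guide above for what fits your hardware (the model path is a placeholder):

from rwkv.model import RWKV   # pip install rwkv

MODEL = '/path/to/RWKV-4-model'   # placeholder: point this at a downloaded .pth file

# pick ONE strategy string; each loads the model differently:
#   'cpu fp32'                      -> everything on CPU in fp32
#   'cuda fp16'                     -> everything on GPU in fp16
#   'cuda fp16i8'                   -> GPU, weights quantized to INT8
#   'cuda fp16i8 *10 -> cuda fp16'  -> first 10 layers in INT8, the rest in fp16
#   'cuda fp16 *10 -> cpu fp32'     -> split: 10 layers on the GPU, the rest on the CPU
model = RWKV(model=MODEL, strategy='cuda fp16i8 *10 -> cuda fp16')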

[Image: RWKV evaluation results]

Here is https://huggingface.co/BlinkDL/rwkv-4-raven/blob/main/RWKV-4-Raven-14B-v7-Eng-20230404-ctx4096.pth in action: [Image: ChatRWKV demo screenshot]

When you build an RWKV chatbot, always check the text corresponding to the state, to prevent bugs.

  1. Never call raw forward() directly. Instead, put it in a function that will record the text corresponding to the state.

  2. The best chat format (check whether your text is of this format): Bob: xxxxxxxxxxxxxxxxxx\n\nAlice: xxxxxxxxxxxxx\n\nBob: xxxxxxxxxxxxxxxx\n\nAlice:

  • There should not be any space after the final "Alice:". The generated text will begin with a space, which you can simply strip.
  • You can use \n inside xxxxx, but avoid \n\n, so sanitize it with xxxxx = xxxxx.strip().replace('\r\n','\n').replace('\n\n','\n') (see the sketch after this list).
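
As a minimal sketch of both points above, here is one way to wrap forward() so the text and the RNN state can never drift apart. The wrapper class, the sanitize helper, and the model path are illustrative assumptions (not the repo's official API); the tokenizer is loaded with the Hugging Face tokenizers library, matching the 20B_tokenizer.json note above:

import os
os.environ["RWKV_JIT_ON"] = '1'
os.environ["RWKV_CUDA_ON"] = '0'

from rwkv.model import RWKV          # pip install rwkv
from tokenizers import Tokenizer     # pip install tokenizers

tokenizer = Tokenizer.from_file('20B_tokenizer.json')               # tokenizer file from this repo
model = RWKV(model='/path/to/RWKV-4-model', strategy='cuda fp16')   # placeholder path

class ChatSession:
    """Keeps track of the exact text that the current RNN state corresponds to."""
    def __init__(self):
        self.text = ''      # every character the model has consumed so far
        self.state = None   # the RWKV RNN state
        self.logits = None

    def feed(self, text):
        # never call model.forward() directly; always go through here,
        # so self.text and self.state stay in sync
        self.text += text
        self.logits, self.state = model.forward(tokenizer.encode(text).ids, self.state)
        return self.logits

def sanitize(x):
    # \n is fine inside a message, but \n\n is reserved as the turn separator
    return x.strip().replace('\r\n', '\n').replace('\n\n', '\n')

session = ChatSession()
session.feed(f'Bob: {sanitize("Hello! What is RWKV?")}\n\nAlice:')   # note: no space after "Alice:"

From here you would sample a token from session.logits, decode it, and feed it back through session.feed() until the model produces \n\n.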

The latest "Raven"-series Alpaca-style-tuned RWKV 14B & 7B models are very good (almost ChatGPT-like, and good at multi-round chat too). Download: https://huggingface.co/BlinkDL/rwkv-4-raven

Previous (older) model results: [Images: ChatRWKV screenshots]

Chinese models

QQ group: 553456870 (please introduce yourself briefly when joining). If you have R&D experience, join group 325154699.

Chinese tutorials: https://zhuanlan.zhihu.com/p/618011122 https://zhuanlan.zhihu.com/p/616351661

Recommended UI: https://github.com/l15y/wenda

More Repositories

1. RWKV-LM (Python, 12,430 stars): RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embedding.

2. AI-Writer (Python, 2,908 stars): AI novel writing: generates xianxia and romance web fiction, etc. A Chinese pretrained generative model based on my RWKV model, similar to GPT-2. RWKV for Chinese novel generation.

3. Hua (352 stars): Hua is an AI image editor with Stable Diffusion (and more).

4. RWKV-CUDA (Cuda, 211 stars): The CUDA version of the RWKV language model ( https://github.com/BlinkDL/RWKV-LM ).

5. BlinkDL.github.io (88 stars): A collection of state-of-the-art results in AI / ML / DL / RL / CV / NLP.

6. BlinkDL (JavaScript, 82 stars): A minimalist deep learning library in JavaScript using WebGL + asm.js. Run convolutional neural networks in your browser.

7. YYDZ (79 stars): The Ding Zhen universe, a collection of "Yi Yan Ding Zhen" meme images, already over two thousand pictures. The YYDZ (Yi Yan Ding Zhen / One Eye Ding Zhen) dataset.

8. RWKV-v2-RNN-Pile (Python, 65 stars): RWKV-v2-RNN trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details.

9. LinearAttentionArena (Python, 55 stars): Here we will test various linear attention designs.

10. BookCNN (Jupyter Notebook, 54 stars): The book "Deep Convolutional Networks: Principles and Practice" is now on sale on Taobao, Tmall, JD, and Dangdang. This is the code download for the book.

11. SmallInitEmb (Python, 45 stars): LayerNorm(SmallInit(Embedding)) in a Transformer to improve convergence.

12. WorldModel (40 stars): Let us make Psychohistory (as in Asimov) a reality, and accessible to everyone. Useful for LLM grounding and games / fiction / business / finance / governance, and can align agents with humans too.

13. LM-Trick-Questions (32 stars): Here we collect trick questions and failed tasks for open-source LLMs, to improve them.

14. Basis (Python, 27 stars): The Basis programming language.

15. BlinkToDo (JavaScript, 25 stars): A minimalist ToDo.txt page. If your to-do list has more than a hundred items, try this txt-based minimalist task manager.

16. AntiAging (11 stars): List of anti-aging research.

17. RWKV.com (HTML, 11 stars)

18. Nala (9 stars): The Nala markup, to turn a "natural language" sentence into a code-like statement.

19. PathTracingJS (JavaScript, 7 stars): Path tracing demo with JS in your web browser.

20. BlinkColorTheme (CSS, 4 stars): A colorful theme for HTML + JS + CSS.

21. Model_Leaderboard (HTML, 4 stars): Leaderboard of AI models.

22. MathBook (2 stars): A fairly systematic set of math notes (graduate level).

23. BasisLang.com (HTML, 1 star): BasisLang.com