• Stars: 428
  • Rank: 101,481 (Top 2 %)
  • Language: Jupyter Notebook
  • License: MIT License
  • Created: over 1 year ago
  • Updated: 11 months ago

Repository Details

EasyLLM

EasyLLM is an open source project that provides helpful tools and methods for working with large language models (LLMs), both open source and closed source. Get started immediately or check out the documentation.

EasyLLM implements clients that are compatible with OpenAI's Completion API. This means you can easily replace openai.ChatCompletion, openai.Completion, or openai.Embedding with, for example, huggingface.ChatCompletion, huggingface.Completion, or huggingface.Embedding by changing a single line of code.

Supported Clients

  • huggingface - HuggingFace models
    • huggingface.ChatCompletion - Chat with LLMs
    • huggingface.Completion - Text completion with LLMs
    • huggingface.Embedding - Create embeddings with LLMs
  • sagemaker - Open LLMs deployed on Amazon SageMaker
    • sagemaker.ChatCompletion - Chat with LLMs
    • sagemaker.Completion - Text completion with LLMs
    • sagemaker.Embedding - Create embeddings with LLMs
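
For the sagemaker client listed above, usage is expected to mirror the huggingface client shown in Getting Started. A minimal sketch, assuming sagemaker.ChatCompletion.create takes the same arguments as huggingface.ChatCompletion.create; the endpoint name is a placeholder:

from easyllm.clients import sagemaker

# Assumption: the sagemaker client mirrors the huggingface client,
# including the prompt_builder attribute and the create() signature.
sagemaker.prompt_builder = "llama2"

response = sagemaker.ChatCompletion.create(
    # Placeholder name of a Llama 2 chat endpoint deployed on Amazon SageMaker.
    model="my-llama2-chat-endpoint",
    messages=[
        {"role": "user", "content": "What is Amazon SageMaker?"},
    ],
    max_tokens=256,
)

print(response["choices"][0]["message"]["content"])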

Check out the Examples to get started.

🚀 Getting Started

Install EasyLLM via pip:

pip install easyllm

Then import and start using the clients:

from easyllm.clients import huggingface

# helper to build llama2 prompt
huggingface.prompt_builder = "llama2"

response = huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[
        {"role": "system", "content": "\nYou are a helpful assistant speaking like a pirate. argh!"},
        {"role": "user", "content": "What is the sun?"},
    ],
    temperature=0.9,
    top_p=0.6,
    max_tokens=256,
)

print(response)

The result will look like this:

{
  "id": "hf-lVC2iTMkFJ",
  "object": "chat.completion",
  "created": 1690661144,
  "model": "meta-llama/Llama-2-70b-chat-hf",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": " Arrrr, the sun be a big ol' ball o' fire in the sky, me hearty! It be the source o' light and warmth for our fair planet, and it be a mighty powerful force, savvy? Without the sun, we'd be sailin' through the darkness, lost and cold, so let's give a hearty \"Yarrr!\" for the sun, me hearties! Arrrr!"
      },
      "finish_reason": null
    }
  ],
  "usage": {
    "prompt_tokens": 111,
    "completion_tokens": 299,
    "total_tokens": 410
  }
}
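
The Embedding client follows the same pattern. A minimal sketch, assuming huggingface.Embedding mirrors openai.Embedding with an OpenAI-style input argument; the model name is only an example:

from easyllm.clients import huggingface

# Assumption: huggingface.Embedding.create accepts an OpenAI-style
# `input` argument and returns an OpenAI-style response with "data".
embedding = huggingface.Embedding.create(
    model="sentence-transformers/all-MiniLM-L6-v2",
    input="What is the sun?",
)

# Print the dimensionality of the first returned embedding vector.
print(len(embedding["data"][0]["embedding"]))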

Check out the other examples, and see the documentation for more detailed usage.

💪🏻 Migration from OpenAI to HuggingFace

Migrating from OpenAI to HuggingFace is easy. Just change the import statement and the client you want to use, and optionally set the prompt builder.

- import openai
+ from easyllm.clients import huggingface
+ huggingface.prompt_builder = "llama2"


- response = openai.ChatCompletion.create(
+ response = huggingface.ChatCompletion.create(
-    model="gpt-3.5-turbo",
+    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Knock knock."},
    ],
)

When you switch clients, make sure your hyperparameters are still valid. For example, a temperature that works well for GPT-3 might behave differently with Llama 2.

☑️ Key Features

🤝 Compatible Clients

  • Implementation of clients compatible with the OpenAI API format of openai.ChatCompletion, openai.Completion, and openai.Embedding.
  • Easily switch between different LLMs like openai.ChatCompletion and huggingface.ChatCompletion by changing one line of code.
  • Support for streaming completions; check out the example How to stream completions. A minimal sketch follows this list.
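
A minimal streaming sketch, assuming the clients follow the OpenAI-style stream=True convention and yield incremental "delta" chunks; the model and prompt are just examples:

from easyllm.clients import huggingface

huggingface.prompt_builder = "llama2"

# Assumption: stream=True returns an iterator of OpenAI-style chunks,
# each carrying an incremental "delta" for every choice.
response = huggingface.ChatCompletion.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "Count from 1 to 5."}],
    stream=True,
)

for chunk in response:
    delta = chunk["choices"][0]["delta"]
    print(delta.get("content", ""), end="", flush=True)
print()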

⚙️ Helper Modules ⚙️

  • evol_instruct (work in progress) - Use evolutionary algorithms to create instructions for LLMs.

  • prompt_utils - Helper methods to easily convert between prompt formats like OpenAI Messages to prompts for open source models like Llama 2.
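
For prompt_utils above, a minimal sketch assuming a build_llama2_prompt helper that converts OpenAI-style messages into a Llama 2 prompt string; check the documentation for the exact helper names:

from easyllm.prompt_utils import build_llama2_prompt

# Assumption: build_llama2_prompt turns OpenAI-style chat messages into
# a single Llama 2 prompt string; the exact name may differ in the docs.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the sun?"},
]

prompt = build_llama2_prompt(messages)
print(prompt)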

🙏 Contributing

EasyLLM is an open source project and welcomes contributions of all kinds.

The project uses hatch for development. To get started, fork the repository and clone it to your local machine.

  1. Confirm hatch is installed (pipx is great to make it available globally on your machine)
  2. Once in the project directory, run hatch env create to create a default virtual environment for development.
  3. Activate the virtual environment with hatch shell
  4. Start developing! 🤩
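
The steps above as a command sequence; the fork URL is a placeholder:

git clone https://github.com/<your-username>/easyllm.git
cd easyllm
hatch env create
hatch shell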

📔 Citation & Acknowledgements

If you use EasyLLM, please share it with me on social media or email. I would love to hear about it! You can also cite the project using the following BibTeX:

@software{Philipp_Schmid_EasyLLM_2023,
  author = {Philipp Schmid},
  license = {Apache-2.0},
  month = jul,
  title = {EasyLLM: Streamlined Tools for LLMs},
  url = {https://github.com/philschmid/easyllm},
  year = {2023}
}

More Repositories

1. deep-learning-pytorch-huggingface (Jupyter Notebook, 547 stars)
2. clipper.js - HTML to Markdown converter and crawler. (TypeScript, 471 stars)
3. document-ai-transformers (Jupyter Notebook, 280 stars)
4. huggingface-sagemaker-workshop-series - Enterprise Scale NLP with Hugging Face & SageMaker Workshop series (Jupyter Notebook, 228 stars)
5. sagemaker-huggingface-llama-2-samples (Jupyter Notebook, 86 stars)
6. cdk-samples (Python, 56 stars)
7. llm-sagemaker-sample (Jupyter Notebook, 48 stars)
8. serverless-bert-huggingface-aws-lambda-docker (Python, 40 stars)
9. terraform-aws-sagemaker-huggingface (HCL, 39 stars)
10. advanced-pii-huggingface-sagemaker (Jupyter Notebook, 34 stars)
11. knowledge-distillation-transformers-pytorch-sagemaker (Jupyter Notebook, 33 stars)
12. serverless-bert-with-huggingface-aws-lambda (Python, 30 stars)
13. deep-learning-habana-huggingface (Jupyter Notebook, 29 stars)
14. amazon-sagemaker-gpt-j-sample (Jupyter Notebook, 28 stars)
15. optimum-transformers-optimizations (Jupyter Notebook, 28 stars)
16. optimum-static-quantization (Jupyter Notebook, 27 stars)
17. efsync (Python, 23 stars)
18. setfit-few-shot-classification-sample (Jupyter Notebook, 21 stars)
19. aws-lambda-with-docker-image (Python, 21 stars)
20. text-generation-inference-tests (Jupyter Notebook, 20 stars)
21. evaluate-llms - Includes examples on how to evaluate LLMs (Jupyter Notebook, 19 stars)
22. fine-tune-GPT-2 (Jupyter Notebook, 17 stars)
23. deepspeed-sagemaker-example (Jupyter Notebook, 17 stars)
24. deep-learning-remote-runner (Python, 16 stars)
25. keras-vision-transformer-huggingface (Jupyter Notebook, 15 stars)
26. serverless-machine-learning - Collection of serverless machine learning use cases and examples including Hugging Face transformers, timm, Gradio (Python, 15 stars)
27. transformers-pytorch-text-classification (Jupyter Notebook, 14 stars)
28. aws-sagemaker-huggingface-llm (Jupyter Notebook, 13 stars)
29. aws-neuron-samples (Python, 12 stars)
30. terraform-aws-llm-sagemaker (HCL, 12 stars)
31. new-serverless-bert-aws-lambda (Python, 11 stars)
32. blog-github-actions-aws-lambda-python (Python, 10 stars)
33. sentence-transformers-huggingface-inferentia (Jupyter Notebook, 9 stars)
34. huggingface-container (Dockerfile, 9 stars)
35. sagemaker-falcon-180b-samples (Jupyter Notebook, 9 stars)
36. multilingual-serverless-qa-aws-lambda (Python, 9 stars)
37. huggingface-inferentia2-samples (Jupyter Notebook, 9 stars)
38. amazon-sagemaker-flan-t5-xxl - Example how to deploy FLAN-T5-XXL on Amazon SageMaker (Jupyter Notebook, 8 stars)
39. aws-marketplace-example (TypeScript, 7 stars)
40. sample-huggingface-sagemaker-cdk (Python, 7 stars)
41. transformers-deepspeed (Jupyter Notebook, 7 stars)
42. huggingface-mongodb-example (7 stars)
43. serverless-efs-and-aws-lambda (Python, 6 stars)
44. github-actions (6 stars)
45. model-recommender (Jupyter Notebook, 6 stars)
46. open-source-function-calling (Jupyter Notebook, 6 stars)
47. blog-custom-github-action (Dockerfile, 6 stars)
48. scale-machine-learning-w-pytorch (Python, 5 stars)
49. transformers-inference-experiments (Jupyter Notebook, 5 stars)
50. keras-financial-summarization-huggingface (Jupyter Notebook, 5 stars)
51. open-llm-stack - Open LLM Stack to easily deploy open source Generative AI application in the cloud and for production (5 stars)
52. rust-machine-learning (Rust, 4 stars)
53. onnx-transformers (Python, 4 stars)
54. keras-layoutlm-transformers (Jupyter Notebook, 4 stars)
55. blog-github-action-cicd-aws-s3 (Vue, 4 stars)
56. amazon-sagemaker-flan-ul2 (Jupyter Notebook, 4 stars)
57. aws-bedrock-titan-mteb - Repository to evaluate Amazon Bedrock Titan text-embeddings on MTEB (Python, 4 stars)
58. sentence-transformers-tensorflow (Jupyter Notebook, 4 stars)
59. langchain-tests (Jupyter Notebook, 3 stars)
60. rust-hf-hub-loader (Rust, 3 stars)
61. rust-stuff (Rust, 3 stars)
62. philschmid.de (TypeScript, 3 stars)
63. philschmid-de-v2 (JavaScript, 3 stars)
64. huggingface-sagemaker-llm-private-vpc (Jupyter Notebook, 3 stars)
65. prosus-sagemaker-huggingface-workshop (Jupyter Notebook, 3 stars)
66. huggingface-sagemaker-multi-container-endpoint (Jupyter Notebook, 2 stars)
67. pytorch-bert-e2e-model (Jupyter Notebook, 2 stars)
68. aws-devcontainer-test (Dockerfile, 2 stars)
69. tmls-sagemaker-huggingface-workshop (Jupyter Notebook, 2 stars)
70. llama3-aws-trainium-sample (Jupyter Notebook, 2 stars)
71. gradio-docker (2 stars)
72. sagemaker-huggingface-idefics-sample (Jupyter Notebook, 2 stars)
73. langchain-samples-and-experiments (Jupyter Notebook, 2 stars)
74. sagemaker-cdk-samples (TypeScript, 2 stars)
75. rust-vs-python (Python, 2 stars)
76. rust-lambda-example - Rust AWS Lambda API Gateway CDK example (Rust, 2 stars)
77. transformers-keras-e2e-ner (Jupyter Notebook, 2 stars)
78. accelerate-transformers-example (2 stars)
79. llmperf-bench - Toolkit to benchmark Hugging Face TGI with llmperf easily (Python, 2 stars)
80. sagemaker-debug-xla (Python, 1 star)
81. transformers-inferentia (Python, 1 star)
82. german-sentiment-bert (Jupyter Notebook, 1 star)
83. huggingface-course-sagemaker-talk (Jupyter Notebook, 1 star)
84. huggingface_sagemaker_tensorflow_distributed (Python, 1 star)
85. download-release-assets (Shell, 1 star)
86. sagemaker-beta-inference (Jupyter Notebook, 1 star)
87. transformers-deepspeed-expermiments (Python, 1 star)
88. python-project-template (Python, 1 star)
89. sentence-transformers-optimizations (Jupyter Notebook, 1 star)
90. train-6-b-gpt-j-amazon-sagemaker (Jupyter Notebook, 1 star)
91. lambda-apollo-dynamodb-template (TypeScript, 1 star)
92. speculative-decoding-medusa-example - Example repository on how to train and benchmark Medusa based Speculative Decoding with Hugging Face (1 star)
93. stable-diffusion-tests (Jupyter Notebook, 1 star)
94. BYOC-Amazon-Sagemaker (Python, 1 star)
95. sample-custom-inference-sagemaker-huggingface (Python, 1 star)
96. epfllm-megatron-llm (Jupyter Notebook, 1 star)
97. personal-ai-image - Fine-tune FLUX 1.dev for personal AI photos (Python, 1 star)
98. philschmid-blog (TypeScript, 1 star)
99. sdxl-inf2-demo-spaces-gradio (Python, 1 star)
100. nividia-triton-distilbert-bls-classification-example (Python, 1 star)