  • Stars: 1,671
  • Rank: 27,932 (Top 0.6 %)
  • Language: Java
  • License: Apache License 2.0
  • Created: over 1 year ago
  • Updated: 7 days ago


Repository Details

Open-source end-to-end LLM Development Platform

cometLLM


CometLLM is a tool to log and visualize your LLM prompts and chains. Use CometLLM to identify effective prompt strategies, streamline your troubleshooting, and ensure reproducible workflows!

[Image: CometLLM Preview]

⚡️ Quickstart

Install the comet_llm Python library with pip:

pip install comet_llm

If you don't have one already, create your free Comet account and grab your API key from the account settings page.

Now you are all set to log your first prompt and response:

import comet_llm

comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
)

🎯 Features

  • Log your prompts and responses, including the prompt template, variables, timestamps, duration, and any metadata that you need.
  • Visualize your prompts and responses in the UI.
  • Log your chain execution down to the level of granularity that you need.
  • Visualize your chain execution in the UI.
  • Automatically track your prompts when using OpenAI chat models.
  • Track and analyze user feedback.
  • Diff your prompts and chain execution in the UI.

👀 Examples

To log a single LLM call as an individual prompt, use comet_llm.log_prompt. If you require more granularity, you can log a chain of executions that may include more than one LLM call, context retrieval, or data pre- or post-processing with comet_llm.start_chain.

Log a full prompt and response

import comet_llm

comet_llm.log_prompt(
    prompt="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: What is your name?\nAnswer:",
    prompt_template="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: {{question}}?\nAnswer:",
    prompt_template_variables={"question": "What is your name?"},
    metadata= {
        "usage.prompt_tokens": 7,
        "usage.completion_tokens": 5,
        "usage.total_tokens": 12,
    },
    output=" My name is Alex.",
    duration=16.598,
)
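The dotted metadata keys above (`usage.prompt_tokens` and friends) can be produced from a nested response payload with a small helper. `flatten_metadata` below is a hypothetical utility, not part of comet_llm, shown only to illustrate the key convention:

```python
def flatten_metadata(data, prefix=""):
    """Flatten nested dicts into dot-separated keys,
    e.g. {"usage": {"prompt_tokens": 7}} -> {"usage.prompt_tokens": 7}."""
    flat = {}
    for key, value in data.items():
        dotted = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten_metadata(value, dotted))
        else:
            flat[dotted] = value
    return flat

# An OpenAI-style usage payload becomes the metadata dict shown above
metadata = flatten_metadata(
    {"usage": {"prompt_tokens": 7, "completion_tokens": 5, "total_tokens": 12}}
)
```

The result can be passed directly as the `metadata` argument of `comet_llm.log_prompt`.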

Read the full documentation for more details about logging a prompt.
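The `{{question}}`-style placeholders in `prompt_template` pair with the keys of `prompt_template_variables`. A minimal stand-in renderer for that convention (hypothetical; the actual substitution and diffing happen in the Comet UI) could look like:

```python
import re

def render_template(template, variables):
    # Replace each {{name}} placeholder with the matching value from `variables`
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(variables[m.group(1)]), template)

prompt = render_template(
    "Question: {{question}}\nAnswer:",
    {"question": "What is your name?"},
)
# prompt == "Question: What is your name?\nAnswer:"
```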

Log an LLM chain

from comet_llm import Span, end_chain, start_chain
import datetime
from time import sleep


def retrieve_context(user_question):
    if "open" in user_question:
        return "Opening hours: 08:00 to 17:00 all days"
    return "No relevant context found"  # avoid returning None for unmatched questions


def llm_answering(user_question, current_time, context):
    prompt_template = """You are a helpful chatbot. You have access to the following context:
    {context}
    The current time is: {current_time}
    Analyze the following user question and decide if you can answer it, if the question can't be answered, say \"I don't know\":
    {user_question}
    """

    prompt = prompt_template.format(
        user_question=user_question, current_time=current_time, context=context
    )

    with Span(
        category="llm-call",
        inputs={"prompt_template": prompt_template, "prompt": prompt},
    ) as span:
        # Call your LLM model here
        sleep(0.1)
        result = "Yes we are currently open"
        usage = {"prompt_tokens": 52, "completion_tokens": 12, "total_tokens": 64}

        span.set_outputs(outputs={"result": result}, metadata={"usage": usage})

    return result


def main(user_question, current_time):
    start_chain(inputs={"user_question": user_question, "current_time": current_time})

    with Span(
        category="context-retrieval",
        name="Retrieve Context",
        inputs={"user_question": user_question},
    ) as span:
        context = retrieve_context(user_question)

        span.set_outputs(outputs={"context": context})

    with Span(
        category="llm-reasoning",
        inputs={
            "user_question": user_question,
            "current_time": current_time,
            "context": context,
        },
    ) as span:
        result = llm_answering(user_question, current_time, context)

        span.set_outputs(outputs={"result": result})

    end_chain(outputs={"result": result})


main("Are you open?", str(datetime.datetime.now().time()))

Read the full documentation for more details about logging a chain.
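`Span` is used as a context manager above, so entering and exiting the `with` block delimits one timed unit of work in the chain. As a toy illustration of that pattern (not comet_llm internals), a minimal timing span might look like:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed_span(name, timings):
    # Record how long the with-block takes, keyed by span name
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[name] = time.perf_counter() - start

timings = {}
with timed_span("llm-call", timings):
    time.sleep(0.05)  # stand-in for the real LLM call
```

The real `Span` additionally captures the `inputs`, `outputs`, and `metadata` you attach, as shown in the chain example above.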

βš™οΈ Configuration

You can configure your Comet credentials and where your data is logged:

Name                   Python parameter name   Environment variable name
Comet API key          api_key                 COMET_API_KEY
Comet workspace name   workspace               COMET_WORKSPACE
Comet project name     project                 COMET_PROJECT_NAME
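For example, the environment variables above can replace explicit keyword arguments; comet_llm picks them up when you log data. The workspace and project values here are placeholders:

```python
import os

# Equivalent to passing api_key= / workspace= / project= in code
os.environ["COMET_API_KEY"] = "<YOUR_COMET_API_KEY>"
os.environ["COMET_WORKSPACE"] = "my-workspace"        # placeholder
os.environ["COMET_PROJECT_NAME"] = "llm-experiments"  # placeholder
```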

πŸ“ License

Copyright (c) Comet 2023-present. cometLLM is free and open-source software licensed under the MIT License.

More Repositories

 1. kangas: 🦘 Explore multimedia datasets at scale (Jupyter Notebook, 1,040 stars)
 2. comet-examples: Examples of Machine Learning code using Comet.ml (Jupyter Notebook, 148 stars)
 3. issue-tracking: Questions, Help, and Issues for Comet ML (85 stars)
 4. comet-for-mlflow: Comet-For-MLFlow Extension (Python, 55 stars)
 5. blog-serving-hugging-face-models (Shell, 19 stars)
 6. comet-content: Content for the Comet Blog (Jupyter Notebook, 9 stars)
 7. comet-sagemaker: Example demonstrating how to use Comet with Amazon SageMaker (Python, 8 stars)
 8. cometr: Comet SDK for R (R, 8 stars)
 9. comet-sdk-extensions: Open-source contributions for integrations with comet_ml (Python, 6 stars)
10. comet-java-sdk: Comet Java SDK (Java, 5 stars)
11. co2-tracker-utils: CO2 Tracker helpers (Python, 5 stars)
12. streamlit-jupyter-magic: A local Streamlit server for Jupyter environments (Python, 5 stars)
13. keras-fruit-classifer: Classifying fruits using a Keras multi-class image classification model and Google Open Images (Jupyter Notebook, 4 stars)
14. ray-tune-example: Example of using Ray Tune's Comet integration (Python, 2 stars)
15. oscontainer: A small Python library to read the resources available to a Linux OS container (Python, 2 stars)
16. mlops-actions (Python, 2 stars)
17. terraform-aws-comet (HCL, 2 stars)
18. rd2md: A converter to transform R docs into Markdown (Python, 2 stars)
19. comet-detectron: Example of working with Detectron and Comet (Python, 1 star)
20. docker_upload: Build and upload Docker images (1 star)
21. docker_build (1 star)
22. comet-ml.github.io: Comet ML code (JavaScript, 1 star)
23. rllib-example: Example of logging RLlib runs to Comet (Python, 1 star)
24. comet-recipes: A collection of cookiecutter recipes for integrating ML frameworks with Comet in Python and R (Python, 1 star)
25. ansible-playbill: Python library for invoking Ansible playbooks from within other Python programs (Python, 1 star)