  • Stars: 191
  • Rank: 202,877 (Top 4%)
  • Language: R
  • License: Other
  • Created: over 1 year ago
  • Updated: about 2 months ago

Repository Details

chattr

Intro

chattr is an interface to LLMs (Large Language Models). It enables interaction with the model directly from the RStudio IDE. chattr allows you to submit a prompt to the LLM from your script or by using the provided Shiny Gadget.

This package's main goal is to aid in exploratory data analysis (EDA) tasks. The additional information appended to your request provides a sort of "guard rails", so that the packages and techniques we usually recommend as best practice are used in the model's responses.

Install

Since this is a very early version of the package, install it from GitHub:

remotes::install_github("mlverse/chattr")

Available models

chattr integrates with multiple LLM back-ends. Each back-end provides access to multiple model types, and the plan is to add more back-ends as time goes by:

  • OpenAI - GPT models accessible via OpenAI's REST API. chattr provides a convenient way to interact with GPT 4 and GPT 3.5. Setup: Interact with OpenAI GPT models

  • LLamaGPT-Chat - LLM models available on your computer, including GPT-J, LLaMA, and MPT. Tested on a GPT4ALL model. LLamaGPT-Chat is a command-line chat program for models written in C++. Setup: Interact with local models

  • GitHub Copilot - AI pair programmer that offers autocomplete-style suggestions as you code. Setup: Interact with GitHub Copilot Chat
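For the OpenAI back-end, chattr is expected to read the API key from the OPENAI_API_KEY environment variable. A minimal sketch of that setup, assuming you have a valid key (the key value shown is a placeholder):

```r
# Make the key available to the current R session.
# "sk-..." is a placeholder; substitute your own OpenAI API key.
Sys.setenv("OPENAI_API_KEY" = "sk-...")

# A more permanent option is to add the equivalent line
# to your .Renviron file instead:
# OPENAI_API_KEY=sk-...
```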

Using

The App

The main way to use chattr is through the Shiny Gadget app. By default, in RStudio the app runs inside the Viewer pane. chattr will prompt you to select the model back-end you wish to use. The list of actual models depends on which of them you have set up. If no model setup is found, chattr returns an error; if you receive this error, refer to the previous section to learn how to set up a model back-end on your machine. Here is an example of what the selection prompt will look like:

chattr::chattr_app()

#> ── chattr - Available models 
#> 
#> 1: GitHub - Copilot Chat -  (copilot) 
#> 2: OpenAI - Chat Completions - gpt-3.5-turbo (gpt35) 
#> 3: OpenAI - Chat Completions - gpt-4 (gpt4) 
#> 4: LlamaGPT - ~/ggml-gpt4all-j-v1.3-groovy.bin (llamagpt) 
#> 
#> Select the number of the model you would like to use:

This prompt only occurs the first time you call chattr_app() or chattr(). If you close the app and open it again, the app will use the model you initially selected. The selection persists for the rest of your R session, or until you manually change it. Please note that if, for example, chattr cannot find the setup for OpenAI, those lines will not show up as options in your actual prompt.

If you wish to avoid the interactive prompt, then call chattr_use() to designate the model you wish to use before calling the app. You can also use chattr_use() to change the model back-end you are interacting with during your R session:

chattr_use("gpt35")
chattr_app()

Screenshot of the Shiny gadget app in a dark mode RStudio theme

After the LLM finishes its response, the chattr app processes all markdown code chunks and places three convenience buttons on each:

  • Copy to clipboard - Writes the code inside the chunk to your clipboard.

  • Copy to document - Copies the code directly to where the app was called from. If the app was started while working on a script, chattr copies the code to that same script.

  • Copy to new script - Creates a new R script in the RStudio IDE and copies the content of the chunk directly into it. Very useful when the LLM writes a Shiny app for you.

A lot of effort was put into making the app's appearance as close as possible to the IDE's, so that it feels more integrated with your workspace. This includes switching the color scheme based on whether the current RStudio theme is light or dark.

The settings screen can be accessed by clicking the "gear" button. The screen that opens will contain the following:

  • Save and Open chats - An early experiment that lets you save and retrieve past chats. chattr saves the file in RDS format. The main objective of this feature is to be able to review past chats, not to continue previous conversations with the LLM.

  • Prompt settings - In this section you can change the additional information attached to your prompt, including the maximum number of data files and data frames sent to the LLM.
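Because chats are saved in RDS format, a saved conversation can be inspected later with base R. A sketch, assuming a hypothetical file path chosen in the save dialog:

```r
# Hypothetical path; use the location you picked when saving the chat
chat <- readRDS("~/chats/eda-session.rds")

# Inspect the top-level structure of the recorded conversation
str(chat, max.level = 1)
```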

Screenshot of the Shiny gadget options

Additional ways to interact

Apart from the Shiny app, chattr provides two more ways to interact with the LLM. For details, see: Other interfaces
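One of those interfaces is calling chattr() directly from the console or a script with a prompt string. A minimal sketch, assuming a back-end has already been set up:

```r
library(chattr)

# Pick a back-end first; this skips the interactive selection prompt
chattr_use("gpt35")

# Submit a prompt straight from the console; the response follows
# the package's prompt "guard rails" (tidyverse, code-first output)
chattr("Create a scatter plot of mpg vs wt using the mtcars data")
```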

How it works

chattr enriches your request with additional instructions, the names and structure of the data frames currently in your environment, and the paths of the data files in your working directory. If supported by the model, chattr will also include the current chat history.

Diagram that illustrates how chattr handles model requests

To see what chattr will send to the model, set the preview argument to TRUE:

library(chattr)

data(mtcars)
data(iris)

chattr_use("gpt4")
#> 
#> ── chattr
#> β€’ Provider: OpenAI - Chat Completions
#> β€’ Path/URL: https://api.openai.com/v1/chat/completions
#> β€’ Model: gpt-4
#> β€’ Label: GPT 4 (OpenAI)

chattr(preview = TRUE)
#> 
#> ── chattr ──────────────────────────────────────────────────────────────────────
#> 
#> ── Preview for: Console
#> β€’ Provider: OpenAI - Chat Completions
#> β€’ Path/URL: https://api.openai.com/v1/chat/completions
#> β€’ Model: gpt-4
#> β€’ Label: GPT 4 (OpenAI)
#> β€’ temperature: 0.01
#> β€’ max_tokens: 1000
#> β€’ stream: TRUE
#> 
#> ── Prompt:
#> role: system
#> content: You are a helpful coding assistant
#> role: user
#> content:
#> * Use the 'Tidy Modeling with R' (https://www.tmwr.org/) book as main reference
#> * Use the 'R for Data Science' (https://r4ds.had.co.nz/) book as main reference
#> * Use tidyverse packages: readr, ggplot2, dplyr, tidyr
#> * For models, use tidymodels packages: recipes, parsnip, yardstick, workflows,
#> broom
#> * Avoid explanations unless requested by user, expecting code only
#> * For any line that is not code, prefix with a: #
#> * Keep each line of explanations to no more than 80 characters
#> * DO NOT use Markdown for the code
#> [Your future prompt goes here]

Keyboard Shortcut

The best way to access chattr's app is by setting up a keyboard shortcut for it. This package includes an RStudio Addin that gives direct access to the app, which in turn allows a keyboard shortcut to be assigned to the addin. The name of the addin is "Open Chat". If you are not familiar with how to assign a keyboard shortcut, see the next section.

How to setup the keyboard shortcut

  • Select Tools in the top menu, and then select Modify Keyboard Shortcuts

    Screenshot that shows where to find the option to modify the keyboard shortcuts
  • Search for the chattr addin by typing "open chat" in the search box

    Screenshot that shows where to input the addin search
  • To select a key combination for your shortcut, click on the Shortcut box and then press the key combination on your keyboard. In my case, I chose Ctrl+Shift+C

    Screenshot that shows what the interface looks like when a shortcut has been selected

More Repositories

  • torch - R Interface to Torch (C++, 495 stars)
  • tabnet - An R implementation of TabNet (R, 109 stars)
  • luz - Higher Level API for torch (R, 82 stars)
  • torchvision - R interface to torchvision (R, 62 stars)
  • libtorch-mac-m1 - LibTorch builds for the M1 Macs (50 stars)
  • tok - Tokenizers from HuggingFace (R, 40 stars)
  • tabulate - Pretty Console Output for Tables (C++, 39 stars)
  • cuda.ml - R interface for cuML (R, 33 stars)
  • torchbook_materials (R, 28 stars)
  • torchaudio - R interface to torchaudio (R, 26 stars)
  • tft - R implementation of Temporal Fusion Transformers (R, 25 stars)
  • torch-learnr - Learnr tutorials for Torch (R, 19 stars)
  • minhub - Minimal implementation of Deep Learning models (R, 17 stars)
  • pysparklyr - Extension to {sparklyr} that allows you to interact with Spark & Databricks Connect (R, 14 stars)
  • torchdatasets - Extra datasets for torch (R, 14 stars)
  • hftokenizers - Hugging Face tokenizers for R using extendr (Rust, 12 stars)
  • hfhub - Download and cache HuggingFace Hub files (R, 11 stars)
  • tfevents - Write events for TensorBoard (C++, 10 stars)
  • lltm - Long Long Term Memory Neural Net Cells (C++, 9 stars)
  • torchvisionlib - torchvision C++ library extensions (C++, 8 stars)
  • torchsparse - R interface for torchsparse (R, 7 stars)
  • safetensors - Safetensors File Format (R, 7 stars)
  • mall - Run multiple LLM predictions against a data frame with R and Python (R, 7 stars)
  • lantern - C Interface to Torch (C++, 5 stars)
  • madgrad - Madgrad optimizer interface for R (R, 5 stars)
  • callq - Promise Task Queues (R, 5 stars)
  • docker - Docker images for R with GPU support (Dockerfile, 4 stars)
  • torchwavelets (R, 3 stars)
  • imagenet - Resources to Train ImageNet from R (Dockerfile, 3 stars)
  • package - An R package to easily install all dependencies (R, 3 stars)
  • duktape - R JavaScript interpreter using duktape (C, 3 stars)
  • distributed - Resources to run deep learning with distributed computing (3 stars)
  • logo - An over-engineered hex logo! (2 stars)
  • pagedtablejs - JavaScript data table for Data Science (JavaScript, 2 stars)
  • ISLR-deeplearning-torch - Deep Learning lab for the ISLR book (HTML, 2 stars)
  • libcuml-builds (1 star)
  • wav - Read and write WAV files (C, 1 star)