  • Stars: 421
  • Rank: 102,977 (Top 3%)
  • Language: Rust
  • License: MIT License
  • Created: 10 months ago
  • Updated: 8 months ago


Repository Details

From anywhere you can type, query and stream the output of an LLM or any other script

Plock

Use an LLM (or anything else that can stream to stdout) directly from literally anywhere you can type. Outputs in real time.

demo

Write a prompt, select it, and (by default) hit Cmd+Shift+. (that's Cmd, Shift, and the period key). It will replace your prompt with the output in a streaming fashion.

Also! You can first put something on your clipboard (as in, copy some text) before writing / selecting your prompt, then hit Cmd+Shift+/ (by default), and it will use the copied text as context to answer your prompt.

For Linux, use Ctrl instead of Cmd.

100% Local by default. (If you want to use an API or something, you can call any shell script you want specified in settings.json)
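
Since a "process" is just anything that streams to stdout, even a trivial shell loop (purely illustrative, not from the repo) could back a trigger:

```shell
#!/usr/bin/env bash
# Purely illustrative stand-in for an LLM: streams words to stdout
# one at a time, which Plock would type out as they arrive.
for word in this could be any script that streams; do
  printf '%s ' "$word"
  sleep 0.1
done
printf '\n'
```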

I show an example settings.json in Settings

Note: Something not working properly? I won't know! Please log an issue, or take a crack at fixing it yourself and submit a PR! Have feature ideas? Log an issue!

Demo showing concept of Triggers, and the new flexible system

Demo using GPT-3.5 and GPT-4

If you are going to use this with remote APIs, consider using environment variables for your API keys. Just make sure they exist in whatever environment you launch Plock from, or directly embed them (just don't push that code anywhere).
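
For example, a script that calls a remote API might guard against a missing key like this (a sketch; OPENAI_API is just the variable name used in the example settings.json in this README):

```shell
#!/usr/bin/env bash
# Sketch: read the API key from the environment instead of hard-coding it.
# OPENAI_API is the variable name from the example settings.json.
require_key() {
  if [ -z "${OPENAI_API:-}" ]; then
    echo "OPENAI_API is not set; export it or add it to settings.json" >&2
    return 1
  fi
}
# ...once require_key succeeds, call the API with "$OPENAI_API"...
```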

(Original) Demo using Ollama

(in the video I mention rem, another project I'm working on)

Getting Started

Install ollama and make sure to run ollama pull openhermes2.5-mistral or swap it out in settings for something else.

Launch "plock"

Shortcuts:

Ctrl / Cmd + Shift + .: Replace the selected text with the output of the model.

Ctrl / Cmd + Shift + /: Feed whatever is on your clipboard as "context", then replace the selected text with the output of the model.

(these two are customizable in settings.json)

Escape: Stop any streaming output

Mac will request access to keyboard accessibility.

Linux (untested) may require X11 libs for clipboard access and key simulation using enigo. Helpful instructions

Also, system tray icons require some extras
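
On Debian/Ubuntu, the packages typically needed look something like this (an assumption based on enigo's and Tauri's documented Linux dependencies; the exact set may differ on your distro):

```shell
# Untested sketch: X11/key-simulation libs for enigo, plus the
# appindicator lib Tauri uses for system tray icons.
sudo apt-get install -y libx11-dev libxdo-dev libayatana-appindicator3-dev
```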

Windows (untested): you'll need to swap out Ollama for something else, as Ollama doesn't support Windows yet.

Settings

There is a settings.json file which you can edit to change shortcuts, the model, prompts, whether to use shell scripts and what they are, and other settings.

After updating, click the tray icon and select "Load Settings" or restart it.

At any time you can click the tray icon and it will list the settings location. For what it's worth:

On Mac, it's at ~/Library/Application Support/today.jason.plock/settings.json.

On Linux, I think it's $XDG_DATA_HOME/today.jason.plock/settings.json (usually ~/.local/share/today.jason.plock/settings.json).

On Windows, I think it's ~\AppData\Local\today.jason.plock\settings.json.

But clicking the icon is the best way.

Correct me if any of these are wrong.
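
If you'd rather script it, something like this should resolve the path on Mac and Linux (a sketch based on the locations above; the tray icon remains the authoritative source):

```shell
#!/usr/bin/env bash
# Sketch: compute the likely settings.json location per platform,
# using the paths listed above.
case "$(uname -s)" in
  Darwin) settings="$HOME/Library/Application Support/today.jason.plock/settings.json" ;;
  Linux)  settings="${XDG_DATA_HOME:-$HOME/.local/share}/today.jason.plock/settings.json" ;;
  *)      settings="%LOCALAPPDATA%\\today.jason.plock\\settings.json" ;;
esac
echo "$settings"
```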

Using Settings

Take a look at the shortcut keys. A "trigger" can be started with a shortcut. It points to a process (by a 0-based index) and a prompt (by a 0-based index) into the lists defined in the processes and prompts fields.

A process is either "ollama" or a command (run via shell on Mac). You can use that to call your own script.
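
A custom process script might look like the sketch below. It assumes the rendered prompt arrives on the script's stdin and that whatever the script writes to stdout gets typed out; that contract is my assumption here, so check the bundled scripts (e.g. scripts/gpt.sh) for the exact one:

```shell
#!/usr/bin/env bash
# Hypothetical custom process script. Assumes the rendered prompt
# arrives on stdin; output written to stdout is streamed to the screen.
prompt="$(cat)"
# Stand-in for a real model call, such as: ollama run "$OLLAMA_MODEL"
printf 'echoing: %s\n' "$prompt"
```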

Prompts can use one of the two built-in variables, $CLIPBOARD and $SELECTION, or any others you define using a set_env_var trigger.

next_steps defines what happens to the output: it can be written to the screen (streaming or all at once), saved to a variable, and/or kick off another trigger.

In the future I want to make it easy to trigger flexibly (cron, push), output wherever / however, and easily chain things together.

Very, very open to feedback!

Example settings.json:
{
  "environment": {
    "PERPLEXITY_API": "",
    "OLLAMA_MODEL": "openhermes2.5-mistral",
    "OPENAI_API": ""
  },
  "processes": [
    {
      "name": "Use GPT",
      "command": [
        "bash",
        "/Users/jason/workspace/plock/scripts/gpt.sh"
      ]
    },
    {
      "name": "Execute text directly as script",
      "command": []
    },
    {
      "name": "Use perplexity",
      "command": [
        "bash",
        "/Users/jason/workspace/plock/scripts/p.sh"
      ]
    },
    {
      "name": "Use Dall-E",
      "command": [
        "bash",
        "/Users/jason/workspace/plock/scripts/dalle.sh"
      ]
    },
    "ollama"
  ],
  "prompts": [
    {
      "name": "default basic",
      "prompt": "$SELECTION"
    },
    {
      "name": "default with context",
      "prompt": "I will ask you to do something. Below is some extra context to help do what I ask. --------- $CLIPBOARD --------- Given the above context, please, $SELECTION. DO NOT OUTPUT ANYTHING ELSE."
    },
    {
      "name": "step",
      "prompt": "$STEP"
    },
    {
      "name": "say gpt",
      "prompt": "say \"$GPT\""
    }
  ],
  "triggers": [
    {
      "trigger_with_shortcut": "Command+Shift+,",
      "process": 1,
      "prompt": 0,
      "next_steps": [
        {
          "store_as_env_var": "STEP"
        },
        {
          "trigger": 4
        }
      ],
      "selection_action": null
    },
    {
      "trigger_with_shortcut": "Command+Shift+.",
      "process": 0,
      "prompt": 0,
      "next_steps": [
        "stream_text_to_screen"
      ],
      "selection_action": "newline"
    },
    {
      "trigger_with_shortcut": "Command+Shift+/",
      "process": 1,
      "prompt": 0,
      "next_steps": [
        "write_final_text_to_screen"
      ],
      "selection_action": "newline"
    },
    {
      "trigger_with_shortcut": "Command+Shift+'",
      "process": 3,
      "prompt": 0,
      "next_steps": [
        "write_image_to_screen"
      ],
      "selection_action": null
    },
    {
      "trigger_with_shortcut": null,
      "process": 0,
      "prompt": 2,
      "next_steps": [
        "stream_text_to_screen",
        {
          "store_as_env_var": "GPT"
        },
        {
          "trigger": 5
        }
      ],
      "selection_action": null
    },
    {
      "trigger_with_shortcut": null,
      "process": 0,
      "prompt": 3,
      "next_steps": [],
      "selection_action": null
    }
  ]
}

Building Plock

If you don't want to blindly trust binaries (you shouldn't), here's how you can build it yourself!

Prerequisites

  • Node.js (v14 or later)
  • Rust (v1.41 or later)
  • NPM (latest version)

Installation Steps

Node.js

Download from: https://nodejs.org/

Rust

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source $HOME/.cargo/env

NPM (not Bun)

Whattt?? Why? Well, Windows doesn't support Bun in GitHub Actions, afaict. So I'm using NPM instead. NPM ships with Node (see "Node.js" above).

Project Setup

git clone <repo_url>
cd path/to/project
npm install
npm run tauri dev

Build

npm run tauri build

Inspiration / Another Great Project

Another demo

Another demo where I use the perplexity shell script to generate an answer super fast. Not affiliated, was just replying to a thread lol

Screen.Recording.2024-01-21.at.7.21.53.PM.mov

Secrets

Curious folks might be wondering what the ocr feature is. I took a crack at taking a screenshot, running OCR, and using that for context, instead of copying text manually. Long story short, rusty-tesseract really disappointed me, which is awkward b/c it's core to xrem.

If someone wants to figure this out... this could be really cool, especially with multi-modal models.

More Repositories

1. rem: An open source approach to locally record and enable searching everything you view on your Mac. (Swift, 2,096 stars)
2. xrem: (Cross-Platform) An open source approach to locally record and enable searching everything you view on any computer. (Rust, 203 stars)
3. portable-hnsw: What if an HNSW index was just a file, and you could serve it from a CDN, and search it directly in the browser? (HTML, 76 stars)
4. little-worlds: Generate little pixelated worlds (HTML, 44 stars)
5. viz-studio: Build visualizations live! (TypeScript, 21 stars)
6. compute-shaders: Learning compute shaders in public, in Godot 4 (C#, 16 stars)
7. edujason (JavaScript, 11 stars)
8. vlearn: With a few words and a click of a button, quickly get an engaging, high quality video. (And optionally save and share it!) (CSS, 10 stars)
9. typical (TypeScript, 10 stars)
10. hivemind (TypeScript, 2 stars)
11. hit-the-spot: A dead simple game, made with Godot (C#, 1 star)
12. ragpipe: For quick rag tests (Python, 1 star)
13. rem-privacy-policy: Page to host on GitHub Pages that states we don't collect data (HTML, 1 star)
14. terrariaAI: Just playing with and attempting to improve the AI in an indie game called Terraria (C#, 1 star)
15. PyVagrantScripts: Old code and a past version of the repo called vagrantWorkspace (Python, 1 star)
16. issueFinder: A simple way to find open source issues waiting for your contribution! (Ruby, 1 star)
17. LightsGame: Simple JS / CSS game with the objective of turning off all the lights. (JavaScript, 1 star)
18. hivemindeditor (TypeScript, 1 star)
19. cssHelper: Simple CSS generator for stylizing text. (CSS, 1 star)
20. trackpad_haptic (Rust, 1 star)
21. freshmanGame: The game I made at my first hackathon. (HTML, 1 star)
22. vagrantWorkspace: After installing a few dependencies, (startVM.py) creates a puppet-provisioned VM or lxc, with another puppet-provisioned lxc within said inner layer. (Ruby, 1 star)