Second Mate
An open-source, mini imitation of GitHub Copilot for Emacs, using Replit Code 3B (update) or EleutherAI GPT-Neo-2.7B via the Huggingface Model Hub.
Setup
Inference End / Backend
- Set `device` to "cpu" or "cuda" in `serve/server.py`.
- The "priming" is currently done in Python. If you want, modify it to another language or turn it off (from subjective experience, priming seems to help).
- Launch `serve/server.py`. This will launch a Flask app which will allow us to sample the model via a REST API (see the sketch below).
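For orientation, here is a minimal sketch of what such a backend could look like. It is not the actual `serve/server.py`: the route name `/generate`, the priming string, the port, and the sampling parameters are all assumptions; only the `device` setting, the Python priming idea, and the GPT-Neo-2.7B model come from the steps above.

```python
# Minimal, illustrative sketch of a backend along the lines of serve/server.py.
# Assumptions: a /generate route, GPT-Neo-2.7B via transformers, port 5000.
from flask import Flask, request, jsonify
from transformers import pipeline

device = "cpu"  # set to "cuda" to run on a GPU

# Optional Python "priming" prepended to every prompt; set to "" to turn it off.
PRIMING = "# Python 3\n"

generator = pipeline(
    "text-generation",
    model="EleutherAI/gpt-neo-2.7B",
    device=0 if device == "cuda" else -1,
)

app = Flask(__name__)

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.json.get("prompt", "")
    out = generator(
        PRIMING + prompt,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.6,
    )[0]["generated_text"]
    # Return only the newly generated continuation, not the priming or prompt.
    return jsonify({"completion": out[len(PRIMING + prompt):]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Once the server is up, it is worth sanity-checking the endpoint with a plain HTTP request before wiring up Emacs.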
Emacs
- In `emacs/secondmate.el`, customize the URL in `secondmate-url` to the address the API is running on (see the example below).
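As an illustration, assuming the backend is listening on localhost port 5000 as in the sketch above, the customization could look like the following; the exact URL (host, port, and any path) depends on where and how your server is actually running.

```elisp
;; In your init file, after loading emacs/secondmate.el.
;; The host/port below are assumptions; point secondmate-url at the
;; address your Flask backend is actually serving from.
(setq secondmate-url "http://localhost:5000")
```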