
An innovative open-source Code Interpreter supporting GPT, Gemini, Claude, and LLaMA models.

Interpreter

Welcome to Code-Interpreter 🎉, an innovative open-source and free alternative to traditional Code Interpreters. This powerful tool leverages GPT 3.5 Turbo, PALM 2, Groq, Claude, and HuggingFace models like Code-Llama, Mistral 7B, Wizard Coder, and many more to transform your instructions into executable code, for free and in a safe-to-use environment. It even offers Vision Models for image processing.

Code-Interpreter is more than just a code generator. It's a versatile tool that can execute a wide range of tasks. Whether you need to find files in your system πŸ“‚, save images from a website and convert them into a different format πŸ–ΌοΈ, create a GIF 🎞️, edit videos πŸŽ₯, or even analyze files for data analysis and creating graphs πŸ“Š, Code-Interpreter can handle it all.

After processing your instructions, Code-Interpreter executes the generated code and provides you with the result. This makes it an invaluable tool for developers 💻, data scientists 🧪, and anyone who needs to quickly turn ideas into working code; with Vision Models, it can now also process images and videos.

Designed with versatility in mind, Code-Interpreter works seamlessly on every operating system, including Windows, MacOS, and Linux. So, no matter what platform you're on, you can take advantage of this powerful tool πŸ’ͺ.

Experience the future of code interpretation with Code-Interpreter today! πŸš€

Why is this Interpreter Unique?

The distinguishing feature of this interpreter, compared to others, is its commitment to remain free 🆓. It does not require you to download any model or follow tedious setup processes. It is designed to be simple and free for all users, and it works on all major operating systems: Windows, Linux, and MacOS.

Future Plans:

  • 🎯 We have added support for GPT 3.5 models.
  • 🌐 We have added support for PALM 2 models using LiteLLM.
  • 🔗 We have added support for API base switching using LiteLLM.
  • πŸ€– More Hugging Face models with free-tier.
  • πŸ’» Support for more Operating Systems.
  • πŸ“ Support for Multi-Modal for Text and Vision.
  • πŸ“Š Support for Google and OpenAI Vision Models.
  • πŸ’» Support for Local models via LM Studio.
  • πŸ”— Support for Multi-Modal models from Anthropic AI.

πŸ“₯ Installation

Installation with the Python package manager.

To install Code-Interpreter, run the following command:

pip install open-code-interpreter
  • To run the interpreter with Python:
interpreter -m 'gemini-pro' -md 'code' -dc
  • Make sure you install the required packages before running the interpreter.
  • And make sure your API keys are set up in the .env file.

Installation with Git

To get started with Code-Interpreter, follow these steps:

  1. Clone the repository:
git clone https://github.com/haseeb-heaven/code-interpreter.git
cd code-interpreter
  2. Install the required packages:
pip install -r requirements.txt
  3. Set up the required API keys.

API Key setup for All models.

Follow the steps below to obtain and set up the API keys for each service:

  1. Obtain the API keys:

    • HuggingFace: Visit HuggingFace Tokens and get your Access Token.
    • Google Palm and Gemini: Visit Google AI Studio and click on the Create API Key button.
    • OpenAI: Visit OpenAI Dashboard, sign up or log in, navigate to the API section in your account dashboard, and click on the Create New Key button.
    • Groq AI: Obtain access here, then visit Groq AI Console, sign up or log in, navigate to the API section in your account, and click on the Create API Key button.
    • Anthropic AI: Obtain access here, then visit Anthropic AI Console, sign up or log in, navigate to the API Keys section in your account, and click on the Create Key button.
  2. Save the API keys:

    • Create a .env file in your project root directory.
    • Open the .env file and add the following lines, replacing Your API Key with the respective keys:
export HUGGINGFACE_API_KEY="Your HuggingFace API Key"
export PALM_API_KEY="Your Google Palm API Key"
export GEMINI_API_KEY="Your Google Gemini API Key"
export OPENAI_API_KEY="Your OpenAI API Key"
export GROQ_API_KEY="Your Groq AI API Key"
export ANTHROPIC_API_KEY="Your Anthropic AI API Key"
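The keys above are written as shell-style export lines. As a hedged illustration of how such a file could be consumed in Python (the interpreter itself may load it differently, e.g. via python-dotenv — that detail is an assumption), a minimal parser:

```python
import re

# Match shell-style `export KEY="value"` lines like those in the .env file above.
ENV_LINE = re.compile(r'^(?:export\s+)?([A-Za-z_][A-Za-z0-9_]*)\s*=\s*"?(.*?)"?\s*$')

def parse_env(text):
    """Return a dict of KEY -> value from .env-style text, skipping blanks and comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        match = ENV_LINE.match(line)
        if match:
            env[match.group(1)] = match.group(2)
    return env

sample = 'export GEMINI_API_KEY="Your Google Gemini API Key"'
print(parse_env(sample))
```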

Offline models setup.

This interpreter supports offline models via LM Studio and Ollama. To set them up, follow the steps below.

  • Download any model from LM Studio, such as Phi-2, Code-Llama, or Mistral.
  • Then, in the app, go to the Local Server option and select the model.
  • Start the server and copy the URL (LM Studio will provide it).
  • For Ollama, run the command ollama serve and copy the URL (Ollama will provide it).
  • Open the config file configs/local-model.config and paste the URL into the api_base field.
  • Now set the model name to local-model and run the interpreter.
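As an illustrative example (the URLs below are the usual default local-server addresses for LM Studio and Ollama — these are assumptions, so verify against what your app actually reports), configs/local-model.config could end up containing:

```
# configs/local-model.config (illustrative values only)
api_base = http://localhost:1234/v1
# or, if you are serving with Ollama instead:
# api_base = http://localhost:11434
```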
  • Run the interpreter with Python:

Running with Python.

python interpreter.py -md 'code' -m 'gpt-3.5-turbo' -dc 
  • Run the interpreter directly:

Running the interpreter without Python (executable, MacOS/Linux only).

./interpreter -md 'code' -m 'gpt-3.5-turbo' -dc 

🌟 Features

  • πŸš€ Code Execution: Code-Interpreter can execute the code generated from your instructions.

  • πŸ’Ύ Code Save/Update: It has the ability to save the generated code for future use and edit the code if needed on the go using advanced editor.

  • πŸ“‘ Offline models: It has the ability to use offline models for code generation using LM Studio.

  • πŸ“œ Command History: It has the ability to save all the commands as history.

  • πŸ“œ Command Mode: Commands entered with '/' are executed as commands like /execute or /edit.

  • πŸ”„ Mode Selection: It allows you to select the mode of operation. You can choose from code for generating code, script for generating shell scripts, or command for generating single line commands.

  • 🧠 Model Selection: You can set the model for code generation. By default, it uses the code-llama model.

  • 🌐 Language Selection: You can set the interpreter language to Python or JavaScript. By default, it uses Python.

  • πŸ‘€ Code Display: It can display the generated code in the output, allowing you to review the code before execution.

  • πŸ’» Cross-Platform: Code-Interpreter works seamlessly on every operating system, including Windows, MacOS, and Linux.

  • 🀝 Integration with HuggingFace: It leverages the power of HuggingFace models like Code-llama, Mistral 7b, Wizard Coder, and many more to transform your instructions into executable code.

  • 🎯 Versatility: Whether you need to find files in your system, save images from a website and convert them into a different format, create a GIF, edit videos, or even analyze files for data analysis and creating graphs, Code-Interpreter can handle it all.

πŸ› οΈ Usage

To use Code-Interpreter, use the following command options:

  • List of all programming languages:

    • python - Python programming language.
    • javascript - JavaScript programming language.
  • List of all modes:

    • code - Generates code from your instructions.
    • script - Generates shell scripts from your instructions.
    • command - Generates single line commands from your instructions.
    • vision - Generates description of image or video.
    • chat - Chat with your files and data.
  • List of all models (contributions for more are welcome):

    • gpt-3.5-turbo - Generates code using the GPT 3.5 Turbo model.
    • gpt-4 - Generates code using the GPT 4 model.
    • gemini-pro - Generates code using the Gemini Pro model.
    • palm-2 - Generates code using the PALM 2 model.
    • claude-2 - Generates code using the AnthropicAI Claude-2 model.
    • claude-3 - Generates code using the AnthropicAI Claude-3 model.
    • groq-mixtral - Generates code using the Mixtral model using Groq LPU.
    • groq-llama2 - Generates code using the Groq Llama2 model.
    • groq-gemma - Generates code using the Groq Gemma model.
    • code-llama - Generates code using the Code-llama model.
    • code-llama-phind - Generates code using the Code-llama Phind model.
    • mistral-7b - Generates code using the Mistral 7b model.
    • wizard-coder - Generates code using the Wizard Coder model.
    • star-chat - Generates code using the Star Chat model.
    • local-model - Generates code using the local offline model.
  • Basic usage (with least options)

python interpreter.py -dc
  • Using different models (replace 'model-name' with your chosen model)
python interpreter.py -md 'code' -m 'model-name' -dc
  • Using different modes (replace 'mode-name' with your chosen mode)
python interpreter.py -m 'model-name' -md 'mode-name'
  • Using auto execution
python interpreter.py -m 'wizard-coder' -md 'code' -dc -e
  • Saving the code
python interpreter.py -m 'code-llama' -md 'code' -s
  • Selecting a language (replace 'language-name' with your chosen language)
python interpreter.py -m 'gemini-pro' -md 'code' -s -l 'language-name'
  • Switching to File mode for prompt input (Here providing filename is optional)
python interpreter.py -m 'gemini-pro' -md 'code' --file 'my_prompt_file.txt'
  • Using Upgrade interpreter
python interpreter.py --upgrade
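The flags used in the examples above can be summarized with an argparse sketch. This is a hedged illustration: the flag names and value choices are taken from this README, but the defaults and internal wiring are assumptions, not the project's actual implementation.

```python
import argparse

def build_parser():
    # Flag names mirror the usage examples in this README;
    # defaults here are illustrative assumptions.
    parser = argparse.ArgumentParser(prog="interpreter")
    parser.add_argument("-m", "--model", default="code-llama",
                        help="model used for code generation")
    parser.add_argument("-md", "--mode", default="code",
                        choices=["code", "script", "command", "vision", "chat"],
                        help="mode of operation")
    parser.add_argument("-dc", "--display-code", action="store_true",
                        help="display the generated code before execution")
    parser.add_argument("-e", "--exec", action="store_true",
                        help="auto-execute the generated code")
    parser.add_argument("-s", "--save", action="store_true",
                        help="save the generated code")
    parser.add_argument("-l", "--lang", default="python",
                        choices=["python", "javascript"],
                        help="language of the generated code")
    parser.add_argument("--file", nargs="?", const="prompt.txt",
                        help="read the prompt from a file")
    parser.add_argument("--upgrade", action="store_true",
                        help="upgrade the interpreter")
    return parser

args = build_parser().parse_args(["-m", "gemini-pro", "-md", "code", "-dc"])
print(args.model, args.mode, args.display_code)
```

Because argparse matches registered option strings exactly, the multi-character short flags like -md and -dc coexist with -m without ambiguity.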

Interpreter Commands πŸ–₯️

Here are the available commands:

  • πŸ“ /save - Save the last code generated.
  • ✏️ /edit - Edit the last code generated.
  • ▢️ /execute - Execute the last code generated.
  • πŸ”„ /mode - Change the mode of interpreter.
  • πŸ”„ /model - Change the model of interpreter.
  • πŸ“¦ /install - Install a package from npm or pip.
  • 🌐 /language - Change the language of the interpreter.
  • 🧹 /clear - Clear the screen.
  • πŸ†˜ /help - Display this help message.
  • πŸšͺ /list - List all the models/modes/language available.
  • πŸ“ /version - Display the version of the interpreter.
  • πŸšͺ /exit - Exit the interpreter.
  • 🐞 /debug - Debug the generated code for errors.
  • πŸ“œ /log - Toggle different modes of logging.
  • ⏫ /upgrade - Upgrade the interpreter.
  • πŸ“ /prompt - Switch the prompt mode File or Input modes.
  • πŸ’» /shell - Access the shell.

βš™οΈ Settings

You can customize the settings of the current model from the .config file. It contains all the necessary parameters such as temperature, max_tokens, and more.

Steps to add your own custom API Server

To integrate your own API server for OpenAI instead of the default server, follow these steps:

  1. Navigate to the Configs directory.
  2. Open the configuration file for the model you want to modify. This could be either gpt-3.5-turbo.config or gpt-4.config.
  3. Add the following line at the end of the file:
    api_base = https://my-custom-base.com
    
    Replace https://my-custom-base.com with the URL of your custom API server.
  4. Save and close the file. Now, whenever you select the gpt-3.5-turbo or gpt-4 model, the system will automatically use your custom server.
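The flat `key = value` format shown above can be read with a few lines of Python. This is only a hedged sketch of such a parser — the project's actual config loader is not shown in this README:

```python
def read_config(text):
    """Parse flat `key = value` lines into a dict, skipping blanks and comments."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if "=" in line:
            # Split on the first '=' only, so values may contain '=' themselves.
            key, _, value = line.partition("=")
            config[key.strip()] = value.strip()
    return config

sample = "api_base = https://my-custom-base.com"
print(read_config(sample)["api_base"])
```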

Steps to add new Hugging Face model

Manual Method

  1. πŸ“‹ Copy the .config file and rename it to configs/hf-model-new.config.
  2. πŸ› οΈ Modify the parameters of the model like start_sep, end_sep, skip_first_line.
  3. πŸ“ Set the model name from Hugging Face to HF_MODEL = 'Model name here'.
  4. πŸš€ Now, you can use it like this: python interpreter.py -m 'hf-model-new' -md 'code' -e.
  5. πŸ“ Make sure the -m 'hf-model-new' matches the config file inside the configs folder.

Automatic Method

  1. πŸš€ Go to the scripts directory and run the config_builder script .
  2. πŸ”§ For Linux/MacOS, run config_builder.sh and for Windows, run config_builder.bat .
  3. πŸ“ Follow the instructions and enter the model name and parameters.
  4. πŸ“‹ The script will automatically create the .config file for you.

Star History

Star History Chart

🀝 Contributing

If you're interested in contributing to Code-Interpreter, we'd love to have you! Please fork the repository and submit a pull request. We welcome all contributions and are always eager to hear your feedback and suggestions for improvements.

πŸ“Œ Versioning

πŸš€ v1.0 - Initial release.
πŸ“Š v1.1 - Added Graphs and Charts support.
πŸ”₯ v1.2 - Added LiteLLM Support.
🌟 v1.3 - Added GPT 3.5 Support.
🌴 v1.4 - Added PALM 2 Support.
πŸŽ‰ v1.5 - Added GPT 3.5/4 models official Support.
πŸ“ v1.6 - Updated Code Interpreter for Documents files (JSON, CSV, XML).
🌴 v1.7 - Added Gemini Pro Vision Support for Image Processing.

🌟 v1.8 - Added Interpreter Commands Support:

  • 1.8.1 - Added Interpreter Commands Debugging Support.
  • 1.8.2 - Fixed Interpreter Commands
  • 1.8.3 - Added Interpreter Commands Upgrade and Shell Support.
  • 1.8.4 - Fixed Interpreter Model switcher Bug.

πŸ—¨οΈ v1.9 - Added new Chat mode πŸ—¨οΈ for Chatting with your Files, Data and more.

  • v1.9.1 - Fixed Unit Tests and History Args
  • v1.9.2 - Updated Google Vision to adapt LiteLLM instead of Google GenAI.
  • v1.9.3 - Added Local Models Support via LM Studio.

πŸ”₯ v2.0 - Added Groq-AI Models Fastest LLM with 500 Tokens/Sec with Code-LLaMa, Mixtral models.

  • v2.0.1 - Added AnthropicAI Claude-2, Instant models.

πŸ”₯ v2.1 - Added AnhtorpicAI Claude-3 models powerful Opus,Sonnet,Haiku models.

  • v2.1.1 - Added Groq-AI Model Gemma-7B with 700 Tokens/Sec.
  • v2.1.2 - Added Prompt Modes now you can set prompt from file as well just place your prompt in prompt.txt file inside system directory.

πŸ“œ License

This project is licensed under the MIT License. For more details, please refer to the LICENSE file.

Please note the following additional licensing details:

  • The GPT 3.5/4 models are provided by OpenAI and are governed by their own licensing terms. Please ensure you have read and agreed to their terms before using these models. More information can be found at OpenAI's Terms of Use.

  • The PALM models are officially supported by the Google PALM 2 API. These models have their own licensing terms and support. Please ensure you have read and agreed to their terms before using these models. More information can be found at Google Generative AI's Terms of Service.

  • The Hugging Face models are provided by Hugging Face Inc. and are governed by their own licensing terms. Please ensure you have read and agreed to their terms before using these models. More information can be found at Hugging Face's Terms of Service.

  • The Anthropic AI models are provided by Anthropic AI and are governed by their own licensing terms. Please ensure you have read and agreed to their terms before using these models. More information can be found at Anthropic AI's Terms of Service.

πŸ™ Acknowledgments

  • We would like to express our gratitude to HuggingFace, Google, META, OpenAI, GroqAI, and AnthropicAI for providing the models.
  • A special shout-out to the open-source community. Your continuous support and contributions are invaluable to us.

πŸ“ Author

This project is created and maintained by Haseeb-Heaven.
