A Small Guide to Harnessing the Power of Open Interpreter and Unlocking Productivity with a ChatGPT-Like Terminal Interface

I am still digging into what can ease LLM querying in a confidential and secure way, and at the same time I am collecting material for what I have personally and pompously called the PAPE (Prompt Academy Project E-learning). The PAPE is still a work in progress. Meanwhile, new tools are launched almost every week. They constantly reshape the world of AI at a breakneck pace and in unprecedented proportions, which makes the phenomenon itself difficult to follow. Here is a concise list from my latest investigation!

For this post, you can find all files for each project on my GitHub account. See https://github.com/bflaven/ia_usages/tree/main/ia_open_interpreter

Using open-interpreter

By using interpreter, you are about to unleash the power of ChatGPT on your computer… Nice tagline. Their motto is “Anyone can code”, a kind of geek version of Gusteau’s “Anyone can cook” in Ratatouille.

Basically, there are two usage strategies for open-interpreter:

  1. You can harness ChatGPT to run code on your computer to complete tedious tasks, for example: massive file renaming, extracting content from a bunch of PDFs and making a summary, etc. In short, every task that you already delegate to ChatGPT, except that it is now taking control of your computer! CAUTION: it requires a paid ChatGPT license and it will increase your monthly bill! We don’t get anything for nothing.
  2. If you are thrifty and patient, with the help of interpreter combined with LM Studio, you can even take advantage of local LLMs such as Mistral, Orca, Codellama, Llama2… every LLM available on huggingface.co. It does not cost anything and it is more secure, but it is slow!

You can have a look at the official website: https://openinterpreter.com/ or check the GitHub https://github.com/KillianLucas/open-interpreter.

If you are using Anaconda, you can create a specific environment, e.g. “open_interpreter”, or use the default one, aka “base”. Up to you.

Whether in base or in a specific environment, it requires a paid API key from ChatGPT, e.g. sk-XXX-jkhjkhjkhjkh-FAKE-jkhjhXXX454423FCFG-FAKE-jk

  """
  [env]
  # Conda Environment
  conda create --name open_interpreter python=3.9.13
  conda info --envs
  source activate open_interpreter
  conda deactivate
  
  # if needed to remove
  conda env remove -n [NAME_OF_THE_CONDA_ENVIRONMENT]
  conda env remove -n open_interpreter
  
  # update conda
  conda update -n base -c defaults conda
  
  # to export requirements
  pip freeze > requirements.txt
  
  # to install
  pip install -r requirements.txt
  
  # [path]
  
  cd /Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy
  python 001_open_interpreter.py
  
  See https://docs.openinterpreter.com/setup#python-usage
  
  - Installation
  pip install open-interpreter
  
  
  - Console or Terminal usage
  interpreter
  
  
  """

Commands to use interpreter

Below are the three possible ways to take advantage of Open Interpreter. Once again, remember that you must have a ChatGPT API key and that this may also cost you a little money.

For the ChatGPT API Key, check this resource:
https://platform.openai.com/docs/api-reference/

1. Using the classical way through the terminal

# Go to dir
cd /Users/brunoflaven/Documents/01_work/blog_articles/ia_prompt_academy_project/

# Tip: to save this key for later, run
# export OPENAI_API_KEY=your_api_key on Mac/Linux
# or setx OPENAI_API_KEY your_api_key on Windows.

# must-have
export OPENAI_API_KEY=sk-XXX-jkhjkhjkhjkh-FAKE-jkhjhXXX454423FCFG-FAKE-jk

# Installation
pip install open-interpreter

# Console or Terminal usage
interpreter

# In the console, you can type the prompt
# Prompt
Give me 5 names for Italian cooking recipes?

# output
Sure, here are 5 traditional Italian recipes:

1 "Pasta Carbonara"                                            
2 "Lasagna alla Bolognese"                                     
3 "Pasta Norma"                                                
4 "Osso Buco alla Milanese"                                    
5 "Risotto ai funghi porcini"
# to bypass the confirmation message before each code execution
# use: interpreter -y

# to exit
# press CTRL-C

2. Combine with LM Studio

This requires LM Studio to be installed. Let’s leverage Code Llama (codellama), an LLM that has been trained specifically to write code. See the description below.

Code Llama is a model for generating and discussing code, built on top of Llama 2. It is designed to make workflows faster and more efficient for developers and to make it easier for people to learn how to code. It can generate both code and natural language about code. Code Llama supports many of the most popular programming languages used today, including Python, C++, Java, PHP, Typescript (Javascript), C#, Bash and more.

You will need to run LM Studio in the background.

  1. Download https://lmstudio.ai/ then start it.
  2. Select a model then click ↓ Download.
  3. Click the ↔️ button on the left (below 💬).
  4. Select your model at the top, then click Start Server.
  5. Once the server is running, you can begin your conversation with Open Interpreter.

# once the LM Studio server is running, type in the console
interpreter --local

# prompt
Give me 5 names for Italian cooking recipes?

# output
1 Lasagna
2 Spaghetti Carbonara
3 Penne Arrabiata
4 Margherita pizza
5 Fettuccine Alfredo
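
Before pointing interpreter at LM Studio, you can also check that the local server actually answers by querying its OpenAI-compatible endpoint directly from Python. This is a minimal sketch assuming LM Studio’s default address http://localhost:1234/v1 and the openai package in version 1.x; the model name is simply whatever you have loaded in LM Studio.

# check_lm_studio.py (hypothetical sketch, assumes the LM Studio server is started on its default port)
from openai import OpenAI

# LM Studio exposes an OpenAI-compatible API; the api_key value is ignored but required by the client
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder, LM Studio serves whichever model is currently loaded
    messages=[{"role": "user", "content": "Give me 5 names for Italian cooking recipes?"}],
)
print(response.choices[0].message.content)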

3. Combine with FastAPI

Open Interpreter can also be integrated directly into FastAPI. You can check the example given in 002_open_interpreter.py; a sketch of this kind of integration is shown below.
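
Here is a minimal sketch of such an integration, modelled on the streaming pattern shown in the open-interpreter documentation; the /chat route, the stream=True flag and the event format are assumptions to adapt to the version of the library you use.

# sketch in the spirit of 002_open_interpreter.py (hypothetical, adapt to your open-interpreter version)
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
import interpreter  # older releases; newer ones use: from interpreter import interpreter

app = FastAPI()

@app.get("/chat")
def chat_endpoint(message: str):
    # stream the interpreter output chunk by chunk as server-sent events
    def event_stream():
        for result in interpreter.chat(message, stream=True):
            yield f"data: {result}\n\n"
    return StreamingResponse(event_stream(), media_type="text/event-stream")

# run with: uvicorn server:app --reload (if the file is saved as server.py)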

# Some example prompts for interpreter

- Can you set my system to dark mode?
- Can you make a simple Pomodoro app?
- Can you summarize the document "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/my_notebook_of_recipes_v1.docx" and create a text file with it at the same path "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/my_notebook_of_recipes_v1.txt"?
- Can you list what is on my calendar?
- Can you rename the file "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/264166941-37152071-680d-4423-9af3-64836a6f7b60.mp4" with the following name "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/open_interpreter_commercial.mp4"
- Can you summarize the document "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/quotes_philosophy_v1.pdf" and create a text file with it at the same path "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/quotes_philosophy_v1.txt"?

Other local tools for LLM

I also ran a few tests with Mistral in LM Studio. Besides LM Studio or Ollama, there are other tools for testing LLMs locally, such as llama.cpp, KoboldCpp and llama-cpp-python, but they are very geek-oriented; see the quick llama-cpp-python sketch after the model list below.

# The LLMs used in LM Studio

# Mixtral-8x7B-v0.1-GGUF
# https://huggingface.co/TheBloke/Mixtral-8x7B-v0.1-GGUF

# Mistral 7B Instruct v0.1 - GGUF
# https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF
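
As a quick illustration of this more geek-oriented route, here is a minimal sketch using llama-cpp-python with one of the GGUF models listed above; the model path and file name are placeholders, download the quantization you prefer from huggingface.co first.

# quick local test with llama-cpp-python (hypothetical path and file name)
# pip install llama-cpp-python
from llama_cpp import Llama

# point to a GGUF file downloaded from huggingface.co, e.g. TheBloke/Mistral-7B-Instruct-v0.1-GGUF
llm = Llama(model_path="./models/mistral-7b-instruct-v0.1.Q4_K_M.gguf", n_ctx=2048)

# Mistral Instruct expects the [INST] ... [/INST] prompt format
output = llm("[INST] Give me 5 names for Italian cooking recipes? [/INST]", max_tokens=200)
print(output["choices"][0]["text"])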

More info