A Small Guide to Harnessing the Power of Open Interpreter and Unlocking Productivity with a ChatGPT-Like Terminal Interface
I am still looking into ways to ease LLM querying in a confidential and secure manner. At the same time, I am collecting material for what I have personally and pompously called the PAPE
(Prompt Academy Project E-learning). The PAPE is still a work in progress. Meanwhile, I keep finding new tools launched almost every week. These tools constantly reshape the world of AI at a breakneck pace and in proportions that make the phenomenon itself difficult to follow. Here is a concise list from my latest investigation!
For this post, you can find all files for each project on my GitHub account. See https://github.com/bflaven/ia_usages/tree/main/ia_open_interpreter
Using open-interpreter
By using interpreter, you are about to unleash the power of ChatGPT on your computer… A nice baseline. Their motto is “Anyone can code”, a kind of geek version of Gusteau’s “Anyone can cook” from Ratatouille.
Basically, there are two usage strategies for open-interpreter:
- You can harness ChatGPT to run code on your computer to complete tedious tasks, for example: massive file renaming, extracting content from a bunch of PDFs and summarizing them… etc. In short, every task that you have already delegated to ChatGPT, except that it is now taking control of your computer! CAUTION: it requires a paid OpenAI API key and it will increase your monthly bill! We don’t get anything for nothing.
- If you are thrifty and patient, with the help of interpreter combined with LM Studio, you can even take advantage of local LLMs such as Mistral, Orca, CodeLlama, Llama 2… every LLM available on huggingface.co. It costs nothing and it is more secure, but it is slow!
You can have a look at the official website: https://openinterpreter.com/ or check the GitHub https://github.com/KillianLucas/open-interpreter.
If you are using Anaconda, you can create a specific environment, e.g. “open_interpreter”, or use the default one, aka “base”. Up to you.
Whether in “base” or in a specific environment, it requires a paid OpenAI API key, e.g. sk-XXX-jkhjkhjkhjkh-FAKE-jkhjhXXX454423FCFG-FAKE-jk
""" [env] # Conda Environment conda create --name open_interpreter python=3.9.13 conda info --envs source activate open_interpreter conda deactivate # if needed to remove conda env remove -n [NAME_OF_THE_CONDA_ENVIRONMENT] conda env remove -n open_interpreter # update conda conda update -n base -c defaults conda # to export requirements pip freeze > requirements.txt # to install pip install -r requirements.txt # [path] cd /Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy python 001_open_interpreter.py See https://docs.openinterpreter.com/setup#python-usage - Installation pip install open-interpreter - Console or Terminal usage interpreter """
Commands to use interpreter
Below are the three possible ways to take advantage of open interpreter. Once again, remember that you must have an OpenAI API key and that this can also cost you a little money.
For the ChatGPT API Key, check this resource:
https://platform.openai.com/docs/api-reference/
1. Using the classical way through the terminal
# Go to dir
cd /Users/brunoflaven/Documents/01_work/blog_articles/ia_prompt_academy_project/

# Tip: To save this key for later, run
# export OPENAI_API_KEY=your_api_key on Mac/Linux
# or setx OPENAI_API_KEY your_api_key on Windows.

# must-have
export OPENAI_API_KEY=sk-XXX-jkhjkhjkhjkh-FAKE-jkhjhXXX454423FCFG-FAKE-jk

# Installation
pip install open-interpreter

# Console or Terminal usage
interpreter

# In the console, you can type the prompt
# Prompt
Give me 5 names for Italian cooking recipes?

# output
Sure, here are 5 traditional Italian recipes:
1 "Pasta Carbonara"
2 "Lasagna alla Bolognese"
3 "Pasta Norma"
4 "Osso Buco alla Milanese"
5 "Risotto ai funghi porcini"
# to bypass the confirmation message
# Use interpreter -y to bypass this.

# to get out
# Press CTRL-C to exit.
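If you drive interpreter from a script rather than the terminal, it helps to fail fast when the key is missing. This little check is my own addition (plain stdlib, not part of open-interpreter):

```python
import os

def get_openai_key():
    """Return OPENAI_API_KEY from the environment, or raise a clear error."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set. Run 'export OPENAI_API_KEY=sk-...' first."
        )
    return key

# Example: print(get_openai_key()[:6]) to confirm the key is visible.
```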
2. Combine with LM Studio
It requires LM Studio to be installed. Let’s leverage Code Llama (codellama), an LLM that has been trained specifically to write code. See the description below.
Code Llama is a model for generating and discussing code, built on top of Llama 2. It’s designed to make workflows faster and more efficient for developers and to make it easier for people to learn how to code. It can generate both code and natural language about code. Code Llama supports many of the most popular programming languages used today, including Python, C++, Java, PHP, TypeScript (JavaScript), C#, Bash and more.
You will need to run LM Studio in the background.
- Download https://lmstudio.ai/ then start it.
- Select a model then click ↓ Download.
- Click the ↔️ button on the left (below 💬).
- Select your model at the top, then click Start Server.
- Once the server is running, you can begin your conversation with Open Interpreter.
# type in console
interpreter --local

# prompt
Give me 5 names for Italian cooking recipes?

# output
1 Lasagna
2 Spaghetti Carbonara
3 Penne Arrabiata
4 Margherita pizza
5 Fettuccine Alfredo
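Under the hood, the LM Studio server exposes an OpenAI-compatible chat-completions endpoint, which is what interpreter --local talks to. As a sketch of that protocol (the port 1234 and the path are LM Studio's defaults; adjust them if you changed the server settings):

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions protocol.
# Port 1234 and this path are LM Studio defaults; change them if needed.
LOCAL_ENDPOINT = "http://localhost:1234/v1/chat/completions"

def build_payload(prompt, temperature=0.7):
    """Build an OpenAI-style chat-completions payload for the local server."""
    return {
        "model": "local-model",  # LM Studio serves whichever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def ask_local_llm(prompt):
    """Send the prompt to the running LM Studio server, return the reply text."""
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LOCAL_ENDPOINT, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (requires the LM Studio server to be running):
# print(ask_local_llm("Give me 5 names for Italian cooking recipes?"))
```

Nothing leaves your machine: the request goes to localhost, which is why the local setup is more confidential than the OpenAI-backed one.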
3. Combine with FastAPI
It gives you the opportunity to integrate Open Interpreter directly into a FastAPI application. You can check the example given in 002_open_interpreter.py
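As a minimal sketch of what such an integration can look like (the /chat route name is my own choice, and the streaming call assumes the 0.1.x module-level interpreter API):

```python
# Hypothetical minimal FastAPI wrapper around Open Interpreter.
# Assumes: pip install fastapi uvicorn open-interpreter, and OPENAI_API_KEY set.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
import interpreter

app = FastAPI()

@app.get("/chat")
def chat_endpoint(message: str):
    """Stream Open Interpreter's chunks back to the client as server-sent events."""
    def event_stream():
        for result in interpreter.chat(message, stream=True):
            yield f"data: {result}\n\n"
    return StreamingResponse(event_stream(), media_type="text/event-stream")

# Run with: uvicorn 002_open_interpreter:app --reload
```

Keep in mind that such an endpoint lets callers run code on the host machine, so never expose it beyond localhost without strict safeguards.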
# Some example prompts for interpreter

- Can you set my system to dark mode?
- Can you make a simple Pomodoro app?
- Can you summarize the document "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/my_notebook_of_recipes_v1.docx" and create a text file with it at the same path "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/my_notebook_of_recipes_v1.txt"?
- Can you list what is on my calendar?
- Can you rename the file "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/264166941-37152071-680d-4423-9af3-64836a6f7b60.mp4" with the following name "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/open_interpreter_commercial.mp4"?
- Can you summarize the document "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/quotes_philosophy_v1.pdf" and create a text file with it at the same path "/Users/brunoflaven/Documents/01_work/blog_articles/ia_and_the_fake_prompt_academy/quotes_philosophy_v1.txt"?
Other local tools for LLM
I also made a few tests with Mistral on LM Studio. There are other tools for testing LLMs locally in addition to LM Studio or Ollama, such as llama.cpp, KoboldCpp and llama-cpp-python, but they are very geek-oriented.
# The LLMs used in LM Studio

# Mixtral-8x7B-v0.1-GGUF
# https://huggingface.co/TheBloke/Mixtral-8x7B-v0.1-GGUF

# Mistral 7B Instruct v0.1 - GGUF
# https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF
More infos
- Perplexity AI unlocks the power of knowledge… also a good alternative to ChatGPT: https://www.perplexity.ai/
- Extracting Text from PDF Files with Python: A Comprehensive Guide: https://towardsdatascience.com/extracting-text-from-pdf-files-with-python-a-comprehensive-guide-9fc4003d517
- pdfplumber: https://github.com/jsvine/pdfplumber
- I tested how well ChatGPT can pull data out of messy PDFs (and here’s a script so you can too): https://source.opennews.org/articles/testing-pdf-data-extraction-chatgpt
- Brandon Roberts, brandonrobertz · he/him: https://github.com/brandonrobertz
- JournalismAI Discovery: https://www.journalismai.info/programmes/discovery
- Source from opennews.org: https://source.opennews.org/
- ChatGPT Data Extraction: A quick demonstration: https://www.youtube.com/watch?v=wsSqRv-y1r4
- Amazon Science on GitHub: https://github.com/amazon-science
- Llama.cpp Tutorial: A Complete Guide to Efficient LLM Inference and Implementation: https://www.datacamp.com/tutorial/llama-cpp-tutorial
- Continue, the easiest way to code with any LLM: https://continue.dev/
- Open Interpreter on GitHub: https://github.com/KillianLucas/open-interpreter/#demo
- Getting Started with Open Interpreter: https://docs.openinterpreter.com/introduction
- Prompt Engineering Guide: https://github.com/dair-ai/Prompt-Engineering-Guide
- All You Need to Know About Prompt Engineering: https://www.educative.io/courses/all-you-need-to-know-about-prompt-engineering
- Introduction to Large Language Models: https://developers.google.com/machine-learning/resources/intro-llms
- Prompting Introduction: https://github.com/dair-ai/Prompt-Engineering-Guide/blob/main/guides/prompts-intro.md
- DAIR.AI, Democratizing Artificial Intelligence Research, Education, and Technologies: https://dair.ai/posts/
- DAIR.AI on GitHub: https://github.com/dair-ai
- Source Guides, collections of tutorials, project discussions, and advice on topics of interest to developers and interactive designers in newsrooms: https://source.opennews.org/guides/
- Prompt Engineering for LLMs: https://maven.com/dair-ai/prompt-engineering-llms
- Docs from openai.com: https://platform.openai.com/docs/overview
- Statistical Claim Checking: StatCheck in Action: https://www.youtube.com/watch?v=C3L_aEy58o4
- INRIA on GitHub: https://github.com/orgs/INRIA/repositories
- Tineye: https://tineye.com/
- Pimeyes: https://pimeyes.com/en
- geekflare.com: https://geekflare.com/reverse-image-search-tools/
- Invid-project: https://www.invid-project.eu/tools-and-services/invid-verification-plugin/
- AFP-Medialab: https://github.com/orgs/AFP-Medialab/repositories
- Politifact: https://www.politifact.com/
- Statcheck: https://team.inria.fr/cedar/projects/statcheck/
- Deepware: https://deepware.ai/
- Nunki: https://www.nunki.co/#solution