Open Interpreter

From Notes_Wiki
[[Main Page|Home]] > [[Local system based AI tools]] > [[Open Interpreter]]


=About=
Interpreter can write code and prompts (y/n) for confirmation before running it on the system.


=Installation=
As root, ensure the python3-pip package is installed:
==On Rocky 9.x==
<pre>
dnf -y install python3-pip
</pre>


Then as a non-root / local user we can install interpreter via:
<pre>
pip install open-interpreter
</pre>
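Before installing, a quick sanity check of the Python toolchain can save time (illustrative commands only; the exact minimum Python version open-interpreter needs is not covered here):

```shell
# Sanity-check the toolchain before installing (illustrative, not required):
python3 --version
python3 -m pip --version
# See whether open-interpreter is already present:
python3 -m pip show open-interpreter 2>/dev/null || echo "open-interpreter not installed yet"
```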


==On Ubuntu 24.04==
As the root user:
<pre>
apt -y install python3-pip
apt -y install python3.12-venv
</pre>
Then as a non-root user (e.g. saurabh):
<pre>
python3 -m venv open-interpreter
./open-interpreter/bin/pip install open-interpreter
</pre>
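The venv created above keeps the tool isolated from the system Python. A disposable example of the same layout (using a hypothetical /tmp/oi-demo path, not part of the real install) shows where pip and any installed console scripts end up:

```shell
# Create a throwaway venv just to illustrate the directory layout
# (/tmp/oi-demo is a placeholder path for demonstration only):
python3 -m venv /tmp/oi-demo
ls /tmp/oi-demo/bin              # contains python and pip; console scripts land here too
# Calling bin/pip directly is equivalent to activating the venv first:
/tmp/oi-demo/bin/pip --version
rm -rf /tmp/oi-demo
```

This is why the install step above can run ./open-interpreter/bin/pip without activating the environment.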
=Executing=
'''Ideally, for this, local [[Ollama]] should be set up with at least one non-thinking model, e.g. phi4, for it to work properly.'''
==Rocky 9.x==
This installs interpreter in the ~/.local/bin path.  If that is already in PATH, we can run interpreter via:
<pre>
interpreter --local
</pre>
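pip user installs place console scripts under ~/.local/bin; a small POSIX-shell check (nothing open-interpreter specific) confirms that directory is on PATH and adds it if not:

```shell
# Ensure ~/.local/bin is on PATH so the interpreter console script resolves:
case ":$PATH:" in
  *":$HOME/.local/bin:"*) echo "~/.local/bin already on PATH" ;;
  *) export PATH="$HOME/.local/bin:$PATH" ;;
esac
command -v interpreter || echo "interpreter not found - run pip install first"
```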




==Ubuntu 24.04==
Execute it from the virtual environment via:
<pre>
./open-interpreter/bin/interpreter --local
</pre>


=Sample Queries=
You can ask queries such as
* What operating system is installed on this system?
* Which are top 5 processes using most RAM on this system?
* How much free RAM is there on this computer?
and it quickly writes code to get the answer and prompts (y/n) before running it.
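For reference, the code the model writes for such queries usually boils down to simple commands like these (illustrative Linux equivalents, not the model's actual output):

```shell
# Which OS is installed on this system?
grep PRETTY_NAME /etc/os-release
# Top 5 processes using most RAM (resident set size, in KB):
ps -eo pid,comm,rss --sort=-rss | head -6
# How much free RAM is there?
grep MemAvailable /proc/meminfo
```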


=Usage=
Use "%reset" to reset the old chat and start a new session.
Use 'Ctrl+C' to quit.


Refer:

Latest revision as of 17:24, 20 July 2025
