Open Interpreter
Home > Local system based AI tools > Open Interpreter
As a non-root / local user, we can install interpreter via:
pip install open-interpreter
This installs interpreter in the ~/.local/bin path. Then, if this directory is already in PATH, we can run interpreter via:
interpreter --local
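If the interpreter command is not found after installation, ~/.local/bin is probably not on PATH yet. A minimal sketch, assuming a bash shell, to add it and confirm the install (pip show is only used here to check that the package is present):
export PATH="$HOME/.local/bin:$PATH"                          # make pip --user scripts available in the current shell
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc      # persist for future shells
command -v interpreter                                        # should point inside ~/.local/bin
pip show open-interpreter                                     # confirms the package and its version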
Ideally, for this, local Ollama should be set up with at least one model (e.g. phi4) for it to work properly.
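A minimal sketch of the Ollama side, assuming Ollama is already installed and its service is running; phi4 is only an example model name from the Ollama library, and the exact prompts shown by interpreter --local can vary between versions:
ollama pull phi4        # download the model into the local Ollama store (one-time)
ollama list             # verify phi4 shows up as an available model
interpreter --local     # then pick Ollama as the provider and phi4 as the model when prompted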
Refer: