Using Ollama with Thunderbird ThunderAI plugin
From Notes_Wiki
Home > Local system based AI tools > Ollama > Using Ollama with Thunderbird ThunderAI plugin
We can use a local Ollama setup (refer Ollama installation) in Thunderbird to send queries directly to a particular model as follows:
- Start thunderbird
- Go to Tools -> "Add-ons and Themes" -> Extensions
- Search for "ThunderAI" under "Find more add-ons"
- This should take you to a page with an "Add to Thunderbird" link against the ThunderAI plugin
- Install the add-on and then again go to Tools -> "Add-ons and Themes" -> Extensions
- Click the "Wrench" option against the ThunderAI plugin to open its preferences page
- On this page change the connection type to "Ollama API (Local LLM)"
- Use http://127.0.0.1:11434/ as the Host address
- Click on "Update models list"; this should show the list of models already downloaded on the local machine
- Choose an appropriate model
- Manage prompts by clicking the "Manage Your Prompts" button at the bottom
- Note that with ThunderAI we can also export prompts as JSON for future use
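As a quick sanity check before the "Update models list" step, we can query the same Ollama endpoint that plugin button relies on. The sketch below assumes a default Ollama install listening on 127.0.0.1:11434 and uses Ollama's `/api/tags` endpoint, which returns the locally downloaded models as JSON; the helper names here are our own, not part of ThunderAI:

```python
import json
import urllib.request

OLLAMA_HOST = "http://127.0.0.1:11434"  # same host address entered in ThunderAI


def model_names(tags_payload):
    """Extract model names from an /api/tags style response,
    e.g. {"models": [{"name": "llama3:8b"}, ...]}."""
    return [m["name"] for m in tags_payload.get("models", [])]


def list_local_models(host=OLLAMA_HOST):
    """Fetch the list of locally available models from Ollama."""
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return model_names(json.load(resp))


if __name__ == "__main__":
    # Requires a running Ollama server; prints names of downloaded models
    print(list_local_models())
```

If this prints an empty list, no models have been pulled yet (`ollama pull <model>`), and ThunderAI's model dropdown will also be empty.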