Using Ollama with Thunderbird ThunderAI plugin

From Notes_Wiki

Home > Local system based AI tools > Ollama > Using Ollama with Thunderbird ThunderAI plugin

We can use a local Ollama setup (see Ollama installation) in Thunderbird to send queries directly to a particular model from within Thunderbird as follows:

  1. Start Thunderbird
  2. Go to Tools -> "Add-ons and Themes" -> Extensions
  3. Search for "ThunderAI" in "Find more add-ons".
  4. This should open a page with an "Add to Thunderbird" link against the ThunderAI plugin
  5. Install the add-on and then again go to Tools -> "Add-ons and Themes" -> Extensions
  6. Click the "Wrench" option against the ThunderAI plugin to open its preferences page
  7. On this page, change the connection type to "Ollama API (Local LLM)"
  8. Use the Host address: http://127.0.0.1:11434/
  9. Click "Update models list"; we should get the list of models already downloaded on the local machine.
  10. Choose an appropriate model
  11. We can manage prompts by clicking the "Manage Your Prompts" button at the bottom.
    Note that with ThunderAI we can also export prompts as JSON for future use
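ThunderAI talks to the same local HTTP API that Ollama exposes on port 11434. As a minimal sketch (assuming the standard Ollama endpoints `/api/tags` and `/api/generate`; the model name `llama3` below is only an example), the "Update models list" call and a one-shot query can also be exercised from Python:

```python
import json
import urllib.request

# Host address as configured in ThunderAI (default local Ollama port)
OLLAMA_HOST = "http://127.0.0.1:11434"

def build_generate_payload(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests a single JSON response instead of a stream.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

def list_models(host=OLLAMA_HOST):
    """Return names of locally downloaded models via /api/tags
    (the same information behind ThunderAI's "Update models list" button)."""
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def ask(model, prompt, host=OLLAMA_HOST):
    """Send a one-shot prompt to the chosen model and return its reply."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_generate_payload(model, prompt).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Example usage (requires a running Ollama server and a downloaded model):
#   print(list_models())
#   print(ask("llama3", "Summarise this email in one line: ..."))
```

If "Update models list" returns nothing in ThunderAI, running `list_models()` (or `curl http://127.0.0.1:11434/api/tags`) is a quick way to check whether the local Ollama server is actually reachable.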
