Talking to a local LLM in the Firefox sidebar

code.mendhak.com | 1 point | by modinfo 3 days ago | 4 comments

KetoManx64 2 days ago

I tried this a few months back, but the LLM sidebar doesn't offer any extra integration that I saw: no variables, and no ability to include the page you're looking at as context with something like "@selection" or "@currentpage". I don't understand the point of having it in the sidebar when most people already have shortcuts or workflows set up for their favorite LLM chat applications.

PaulShomo 3 days ago

Despite many browsers planning to ship their own local model, nobody seems to be promoting a practical workflow for in-browser LLM usage. It's a good time to explore designs like this.

  • KetoManx64 2 days ago

    Those who care about local models already have the knowledge and ability to run them with things like LLMStudio or Llama.cpp.

modinfo 3 days ago

TL;DR:

1. Open `about:config`.

2. Set `browser.ml.chat.hideLocalhost` to `false`.

3. The chatbot sidebar will now show "localhost" as a provider when Open WebUI is running there.

4. Set `browser.ml.chat.provider` to the correct URL.
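As a sketch, the same two prefs can also be set persistently from a `user.js` file in the Firefox profile directory instead of clicking through `about:config`. The URL below, `http://localhost:3000`, is an assumption (Open WebUI's common default port); use whatever address your instance actually listens on.

```javascript
// user.js — placed in the Firefox profile directory, applied at startup.

// Allow localhost providers to appear in the AI chatbot sidebar settings.
user_pref("browser.ml.chat.hideLocalhost", false);

// Point the sidebar at a local Open WebUI instance.
// Assumption: it listens on port 3000; adjust to match your setup.
user_pref("browser.ml.chat.provider", "http://localhost:3000");
```

Note that Firefox rewrites `prefs.js` itself, so manual pref overrides belong in `user.js`, which is read-only from Firefox's point of view.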