Who is using Ollama day-to-day?

I maintain a Chrome extension that adds AI assistance to any input field online.

One of my favorite reviews is a 2-star review asking for “local model support”.

My first reaction was: who installs a 100KB Chrome extension to talk to a 10GB model running locally?

But it did make me curious. Are people actually running Ollama or LM Studio as part of their daily workflow?

2 points | brauhaus | a day ago | 6 comments

drsalt an hour ago

if you don't care don't waste your time. that user should make their own 100kb extension.

raxxorraxor 21 hours ago

Yes, I do. It is quite helpful. For main coding tasks I use Claude, but if my credits run out or I have enough time to wait for an answer, I use Ollama extensively. I would recommend that developers maintain their own AI pipeline as well.

For anything dealing with personal data, like browser inputs, I would exclusively use local models too. Probably still niche, but non-local AI would be a deal breaker for me in both the browser and the OS.

  • brauhaus 12 hours ago

    I'm curious what the workflow actually looks like for people running Ollama day-to-day.

    Do you mostly use it through the terminal, a UI like Open WebUI, or via integrations with other tools?

    I'm trying to understand where a browser integration would actually fit - if at all.

manishrana 19 hours ago

I am running it, but it is only useful for very simple tasks; anything beyond that needs very high computing power. I am currently running it on a 32GB Mac Studio M1, and I mostly use it for generating commit messages.
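A commit-message workflow like this can be sketched against Ollama's local HTTP API (`/api/generate` on the default port 11434). This is a minimal sketch, not the commenter's actual setup; the model name `llama3.2` and the prompt wording are assumptions - substitute whatever `ollama list` shows on your machine.

```python
# Sketch: ask a local Ollama server to write a commit message for the
# currently staged diff. Assumes Ollama is running on its default port
# and that the model "llama3.2" has been pulled (an assumption).
import json
import subprocess
import urllib.request

def build_prompt(diff: str) -> str:
    """Wrap the staged diff in a commit-message instruction."""
    return ("Write a concise one-line git commit message "
            "for the following diff:\n\n" + diff)

def commit_message(model: str = "llama3.2") -> str:
    # Collect the staged changes from git.
    diff = subprocess.run(["git", "diff", "--staged"],
                          capture_output=True, text=True).stdout
    # POST to Ollama's generate endpoint; stream=False returns one
    # JSON object with the full completion in its "response" field.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model,
                         "prompt": build_prompt(diff),
                         "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

The same thing works as a one-liner in the terminal, since `ollama run` accepts piped stdin: `git diff --staged | ollama run llama3.2 "Write a concise commit message for this diff"`.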
