You can get offline versions of LLMs.
And gpt-oss is an offline version of ChatGPT.
Indeed https://huggingface.co/openai-community
First thing that came to mind: GPT4All
I’ve been toying with Qwen3.
On my Steam Deck.
The 8-billion-parameter model runs stably.
It’s open source too!
Alpaca is a neat little flatpak that containerizes everything and makes running local models so easy that I can literally do it without a mouse or keyboard.
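For anyone curious, getting Alpaca from Flathub is a couple of commands; a minimal sketch, assuming the Flathub remote is set up and the app ID com.jeffser.Alpaca is current (check Flathub to be sure):

```shell
# Add the Flathub remote (no-op if it already exists)
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo

# Install and launch Alpaca (app ID is an assumption; verify on Flathub)
flatpak install -y flathub com.jeffser.Alpaca
flatpak run com.jeffser.Alpaca
```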
I mean, most people have a local LLM in their pocket right now.
Unless I am missing something:
Most people do not have a local LLM in their pocket right now.
Most people have a client app that talks to a remote LLM, which ‘lives’ in an ecologically and economically dubious mega-datacenter, in their pocket right now.
Plenty of the AI functions on phones are on-device. I know the iPhone can do several text-processing tasks (summarizing, translating) offline, and there’s an API for third-party developers to use the on-device models. And the Pixels have Gemini Nano on-device for certain offline functions.
My phone does speech-to-text flawlessly offline; it’s a crazy useful little LLM tool.
Oh!
Well, I didn’t know that.
I’m too poor to afford such fancy phones.
Gemini Nano, Apple Intelligence on-device models, etc.
https://ollama.org/
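With Ollama installed, pulling and chatting with a local Qwen3 model is similarly simple; a minimal sketch, assuming the tag qwen3:8b exists in the Ollama model library (verify before pulling):

```shell
# Download the ~8B-parameter Qwen3 model (tag is an assumption; check the library)
ollama pull qwen3:8b

# Start an interactive chat session in the terminal
ollama run qwen3:8b

# Or query the local HTTP API that Ollama serves on port 11434
curl http://localhost:11434/api/generate \
  -d '{"model": "qwen3:8b", "prompt": "Hello", "stream": false}'
```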