ZDNET reviewer finds Puma Browser runs local AI effectively on Pixel phone
A ZDNET reviewer installed Puma Browser on a Pixel 9 Pro and found the free, mobile AI‑centric browser can run local large language models on a phone.
Puma Browser is available for Android and iOS and supports running local LLMs, including Qwen 3 1.5B, Qwen 3 4B, LFM2 1.2B, LFM2 700M, and Google Gemma 3n E2B, among others. The reviewer downloaded Qwen 3 1.5B to the Pixel; the download took over 10 minutes over a wireless connection. The browser's local AI features are described as still experimental, and users may encounter issues.
In testing, the reviewer asked the query "What is Linux?" and reported an immediate response, saying the local LLM on the Pixel 9 Pro performed as well as Ollama running on a System76 Thelio desktop and on a MacBook Pro. To verify that inference was happening locally, the reviewer disabled the phone's network connectivity entirely and still received a quick answer while offline.
The reviewer highlighted the trade‑offs: local AI on a phone enables offline use and sidesteps the energy and privacy concerns of cloud‑based AI, but LLMs consume significant storage; Qwen 3 1.5B is cited at nearly 6GB, and some desktop LLMs run to nearly 20GB. The reviewer recommended trying Puma Browser for local mobile AI and noted that it also functions as a regular browser.
Key Topics
Tech, Puma Browser, Ollama, Local AI, Local LLMs