4 comments

  • SwellJoe 55 minutes ago
    I hate the AI slop prose, but I like the idea of doing stuff on-device.

    If you're willing to spend a little more than $100, you can run not just an agent but pretty decent local models on a phone or tablet. Gemma 3n E4B-it runs blazingly fast on an M4 iPad or a current-gen high-end Android device in Google's AI Edge Gallery. I wonder if you could run it in a way that exposes an OpenAI-compatible API, so an agent or some other local tool could query the model. What useful thing you'd do with it, I dunno. But I probably wouldn't want to code on a phone either, so I don't know what I'd do with Claude Code on my phone. I don't know anything about networking limitations within Android or how the Linux container is isolated... maybe a UNIX socket file would work, even if you can't do anything on the local network.
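
    To sketch the UNIX-socket idea: assuming some local server on the device exposed an OpenAI-compatible `/v1/chat/completions` endpoint on a socket file (the socket path and model name below are made up for illustration), a client could talk to it with nothing but the Python standard library, roughly like this:

    ```python
    import http.client
    import json
    import socket


    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTPConnection that connects over a UNIX domain socket instead of TCP."""

        def __init__(self, socket_path):
            super().__init__("localhost")  # host is unused; required by the parent class
            self.socket_path = socket_path

        def connect(self):
            sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            sock.connect(self.socket_path)
            self.sock = sock


    def chat(socket_path, model, prompt):
        """POST one chat message to an OpenAI-compatible endpoint on a UNIX socket."""
        conn = UnixHTTPConnection(socket_path)
        body = json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        })
        conn.request("POST", "/v1/chat/completions", body,
                     {"Content-Type": "application/json"})
        resp = json.loads(conn.getresponse().read())
        conn.close()
        return resp["choices"][0]["message"]["content"]
    ```

    No claim that AI Edge Gallery actually serves anything like this today; the point is just that a socket file shared into the container would dodge the local-network question entirely.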

  • argsnd 1 hour ago
    I’m sorry but that README.md is full of extremely obnoxious AI prose for what is essentially saying “buy an old Pixel and install Claude Code on it”
    • ozlikethewizard 1 hour ago
      It also tells you to SSH in from your PC. Why would I use a phone instead of my PC if that were an option?
  • bitwize 1 hour ago
    Android's "Linux terminal" is still janky compared to Termux + a proot distro.
    • SwellJoe 1 hour ago
      Which agents work in Termux? Maybe Pi? (I'm surprised Claude Code doesn't, but I guess Claude Code is a bonkers weird design for a terminal app, doing a lot of very unusual things.)