I hate the AI slop prose, but I like the idea of doing stuff on-device.
If you're willing to spend a little more than $100, you can run not just an agent but pretty decent local models on a phone or tablet. Gemma 3n E4B-it runs blazingly fast on an M4 iPad or a current-gen high-end Android device in Google's AI Edge Gallery. I wonder if you could run it in a way that exposes an OpenAI-compatible API, so an agent or some other local tool could query the model. What useful thing you'd do with it, I dunno. But I probably wouldn't want to code on a phone either, so I don't know what I'd do with Claude Code on my phone. I don't know anything about networking limitations within Android or how the Linux container is isolated... maybe a Unix socket file would work, even if you can't do anything on the local network.
Which agents work in Termux? Maybe Pi? (I'm surprised Claude Code doesn't, but I guess Claude Code is a bonkers weird design for a terminal app, doing a lot of very unusual things.)