Next: "How I made one whole website to host one HN-featured blog post"
ahaferburg
Fun project. The article is a bit light on details. I find it astonishing that a project like this runs into performance issues. I would have liked to learn more about what these resource-constrained widgets looked like, what they did, and what caused the performance issues.
bnj
Plenty of comments here about using an LLM to assist with this, and I was happy to read about a learning experience where the stakes were pretty low and the feedback loop pretty tight. Thanks for writing it up; for me, it's a reminder that some of the use cases where an LLM might be an efficient tool are also the places where it can be wise to take the opportunity to learn and sharpen new skills.
DreaminDani
Great writeup and also a great example of where LLMs can step in to help fill the gaps in areas where you don't have as much skill or interest. For instance, your wife used ChatGPT to come up with a name and you used AI to generate the admin flows that you weren't interested in building.
Sounds like Flutter was a good technology choice too, given its flexibility across platforms. As a designer, I know how frustrating it can be that the Google and Apple interface guidelines aren't very prescriptive; patterns vary so much across domains that it's better to do what you did and look at how others solve similar problems. Great work!
fmajid
Manual data entry is just too unreliable and time-consuming. I don't see how this could work short of integrating OBD-II fuel consumption data combined with some sort of presence tracking.
avicado0o
Also, FWIW, for small things like this, unless you really want to learn image recognition, just send the image to gemini-flash-3 or something. Sure, there's 0.5-1 s of latency, but that's still faster than entering it manually, and it's pretty cheap; I'd reckon it stays under the free tier, at least for you and your family.
nottorp
Hmm, an app where you can count the users on your fingers, and where it's not a big deal if it's slightly wrong.
Safe to LLM-generate it, unless you want to learn something in the process, in which case do whatever parts you want to learn about manually.
I've had a 100% generated app with one user (me) on my phone's home screen since some time last year.
ottomanbob
Interesting, was actually planning on setting up a carshare for our cul-de-sac in Honolulu. This is a great reference, thanks for sharing.
koala-news
Honestly, this is kind of the sweet spot for LLM-built apps.
Small thing, used by a few people, solves one annoying problem, and nobody really cares if it’s not “proper software”.
utopiah
wow... so much yak shaving, including priceless bits like "sat with ChatGPT for a bit [...] we came up with OurCar" (I mean... how original is that, clearly powerful datacenters computing over a dump of the Internet was needed), I'm impressed.
All this to avoid doing one subtraction (km now minus km before) and one multiplication (the result times the average litres/km) in your head.
That's a LOT of effort to be lazy.
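For the record, the mental arithmetic this comment describes is a two-step calculation. A minimal sketch with made-up example numbers (the odometer readings and consumption figure are illustrative, not from the article):

```python
# Hypothetical odometer readings and average consumption.
km_before = 41_250
km_now = 41_430
avg_l_per_km = 0.065  # i.e. about 6.5 L / 100 km

# Step 1: subtraction gives the distance driven.
distance_km = km_now - km_before
# Step 2: multiplication gives the fuel used.
liters_used = distance_km * avg_l_per_km

print(round(liters_used, 1))  # 11.7
```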
fragmede
Integrate an OBD-II dongle over Bluetooth and have the app read the data from there.
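The dongle itself just relays standard OBD-II PID responses; the app's job is decoding them. A minimal sketch of that decoding step for PID 0x2F (fuel tank level input), using the standard SAE J1979 scaling `A * 100 / 255`; the hex response string is a made-up example, and real code would go through a Bluetooth serial library rather than a literal:

```python
def decode_fuel_level(response_hex: str) -> float:
    """Decode a mode-01 PID 0x2F response like '41 2F 6B'.

    '41' echoes the request mode (0x01 + 0x40), '2F' echoes the PID,
    and the final byte A is scaled to a percentage: A * 100 / 255.
    """
    parts = response_hex.split()
    if parts[0] != "41" or parts[1] != "2F":
        raise ValueError(f"unexpected response: {response_hex!r}")
    a = int(parts[2], 16)
    return a * 100 / 255

# 0x6B = 107, so the tank is at roughly 42 %.
print(round(decode_fuel_level("41 2F 6B"), 1))  # 42.0
```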
throwy98888
Cool, glad you had fun building it.
Notably, the only parts of this that could not have been done by a well-configured agent in a weekend with SOTA today are the futzing with app stores and the UX iterations.
gandutraveler
The best family app, to me, is Home Assistant.
It's so powerful, and you can build so many custom UIs on it.
I started with it for smart-home automations, but on a daily basis I use it more for managing tasks and scheduling reminders.
And with Claude Code remote, even my not-so-technical wife uses it to build her tiny utility apps.