Hacker News

I see this over and over again. I don't dispute your experience. My experience with ESP32 development has been unreasonably positive. My codebase is sitting around 600k LoC and is the product of several hundred Opus 4.x Plan -> Agent -> Debug loops. I review everything that goes through, but I'm reviewing the business logic and domain gotchas, not dumb crap like what you and so many others describe.

What is so strange to me is that surely there is more C# out there than ESP-IDF code? I don't have a good explanation beyond saying that my codebase is extensively tested and used; I would know very quickly if it suddenly started shitting the bed in the way you explain.




600k lines of code for anything on the ESP32 sounds like the absolute polar opposite of “good”

Tell us you've never built anything significant without telling us?

Tell us you know nothing about embedded without telling us

Okay Mr. account created 22 days ago...

https://news.ycombinator.com/item?id=47213963


Everyone knows internet points make someone more of an expert. Especially on websites that frequently have the most inane political discussions and have tanked in quality to only marginally better than Reddit.

Isn't it funny that you're literally the problem that you're describing?

> My experience with ESP32 development has been unreasonably positive. My codebase is sitting around 600k LoC and is the product of several hundred Opus 4.x Plan -> Agent -> Debug loops.

I feel like this is an example of people having different standards of what “good” code is and hence the differing opinions of how good these tools are. I’m not an embedded developer but 600K LOC seems like a lot in that context, doesn’t it? Again I could be way off base here but that sounds like there must be a lot of spaghetti and copy-paste all over the codebase for it to end up that large.


I don't think it's that large. Keep in mind embedded projects take few, if any, dependencies. The standard library in most languages is far bigger than 600k LoC.

I work with ESP32 devices and 600k lines of code is insane.

I'm curious: What does this device do?

It's wild to come back to this after a day away and find that the takeaway from my attempt to answer the question is punditry about the size of my codebase from people who don't have any idea what my device does.

Answering this question directly puts me in an awkward spot because I realized last fall that there was absolutely no way that I could talk about what I'm working on in a way that can be associated with my product because there's so much anti-AI activism right now. That sucks, because I'd like to be "loud and proud" but I have a family to feed. I strongly suspect that versions of my story are playing out for hundreds of entrepreneurs right now.

Here's what I can describe: it's an ESP32-P4 based consumer device with about 45 ESP-IDF components that all communicate over an event bus. Among other things, there's:

- a substantially modified LVGL front-end with a 3D rendering engine and SVG-like 2D animation, in front of a driver for a customized variant of the ST7789
- substantial custom code for both USB host and device functions across various modes of operation
- custom drivers for several sensors and for haptic feedback
- a very elaborate menu UI system, also backed by a BBS-style terminal configuration system for power users
- an assignable action system with about 40 actions that all have their own state machines and a lot of mutex locking
- a very involved and feature-dense trigger scheduling system
- a very flexible data stream routing matrix
- a full suite of command line scripts for most functions
- a self-hosted web app for configuration that also implements screen sharing via an HTML canvas object, so that I can record videos of what's happening on the device with OBS without having to point a DSLR at it from a gantry

Honestly, I could go on and on, but all of the people who think that 600k LoC is a lot [sight unseen] are following YouTube tutorials and can eat me.

I responded to you because you asked politely. I hope it was an interesting reply.


The more code is out there, the worse the average quality in the training dataset. There will be legacy approaches and APIs, poor design choices, popular use cases irrelevant to your context, etc., that increase the chances of the output not matching your expectations. In the Java world this is exactly how it works. I need 3-5 iterations with Claude to get things done the way I expect, sometimes jumping straight to manual refactoring and then returning the result to Claude for review and learning. My CLAUDE.md files (multiple of them) are growing big with all the patterns and anti-patterns identified this way. To overcome this problem the model needs specialized training, which I don't think the industry knows how to approach (it has to beat the effort put into the education system for humans).

> To overcome this problem the model needs specialized training, which I don't think the industry knows how to approach

We already have coding-tuned models, e.g. Codex. We should just have language/technology-specific models with a focus on recent/modern usage.

The problem with something like Java is that it's too old -- too many variants. Make a cutoff, like at least above Java 8 or 17.


> We should just have language / technology specific models with a focus on recent / modern usage.

The “just” part is a big assumption. It is far from easy, given that modern best practices are always underspecified. An effective coding model must have reasoning signals that are much stronger than its coding patterns, and that, I suspect, requires a very different architecture.


I also believe this must be true. Try asking Claude to program in Forth, I find the results to be unreasonably good. That's probably because most of the available Forth to train on is high quality.


