>I still feel like there's an unexplored space where you combine the benefits of both somehow though. Like a code-based CAD but it also has a GUI editor that stays in sync and avoids the need to type in coordinates by hand. That would be extremely difficult though.
I think you can do this if the data representation of operations and values is human-readable. The simplest implementation would restrict the language/data format to operations representable in the GUI.
Unlike trying to solve the "visual programming" problems, we don't need or desire Turing completeness.
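A minimal sketch of what I mean, assuming a C++-ish in-memory form (all the type names here are invented): the document is just a flat list of GUI-representable operations, and the text view and the GUI view are editing the same list.

    #include <string>
    #include <variant>
    #include <vector>

    // Hypothetical: each operation is something a GUI can render and edit,
    // so no Turing-complete constructs, just data.
    struct Sketch  { std::string plane; };
    struct Circle  { double cx, cy, r; };
    struct Extrude { double depth; };

    using Op = std::variant<Sketch, Circle, Extrude>;

    // Serialize this to a readable text format and parse it back, and a
    // text edit and a GUI drag become the same kind of edit.
    std::vector<Op> doc = {
        Sketch{"XY"},
        Circle{0.0, 0.0, 5.0},
        Extrude{10.0},
    };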
>Compare that to someone writing embedded firmware for device microcontrollers, who would understandably be underwhelmed by the same.
One datum for you: I recently asked Claude to make a jerk-limited and jerk-derivative-limited motion planner and to use the existing trapezoidal planner as a reference for fuzz-testing various moves (to ensure the total number of pulses sent was correct), and it totally worked. Only a few rounds of guidance to get it to where I wanted to commit it.
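For anyone curious what that fuzz check looks like, here's a minimal sketch with the planner internals stubbed out (every name and number below is invented): feed random moves to both planners and assert the total pulse counts agree.

    #include <cassert>
    #include <cmath>
    #include <cstdint>
    #include <random>

    constexpr double kStepsPerMM = 80.0;  // assumed steps/mm

    // Stand-ins: the real planners each integrate their own velocity
    // profile, but both must emit the same pulse total for a given move.
    int64_t trapezoid_pulses(double mm) { return std::llround(mm * kStepsPerMM); }
    int64_t scurve_pulses(double mm)    { return std::llround(mm * kStepsPerMM); }

    int main() {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> move(0.01, 500.0);
        for (int i = 0; i < 10000; ++i) {
            double mm = move(rng);
            assert(trapezoid_pulses(mm) == scurve_pulses(mm));  // totals must match
        }
    }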
My comment above I hope wasn't read to mean "LLMs are only good at web dev." Only that there are different capability magnitudes.
I often do experiments where I will clone one of our private repos, revert a commit, trash the .git path, and then see if any of the models/agents can re-apply the commit after N iterations. I record the pass@k score and compare between model generations over time.
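For reference, the pass@k number itself is usually computed with the unbiased estimator from the Codex paper; the repo-revert setup above is the interesting part, this is just the scoring:

    #include <cstdio>

    // n = attempts sampled, c = attempts that re-created the commit,
    // k = budget being scored. pass@k = 1 - C(n-c,k)/C(n,k), written in
    // the numerically stable product form.
    double pass_at_k(int n, int c, int k) {
        if (n - c < k) return 1.0;  // success guaranteed within k draws
        double p = 1.0;
        for (int i = n - c + 1; i <= n; ++i)
            p *= 1.0 - static_cast<double>(k) / i;
        return 1.0 - p;
    }

    int main() {
        std::printf("pass@5 with 3/20 successes: %.3f\n", pass_at_k(20, 3, 5));
    }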
In one of those recent experiments, I saw gpt-oss-120b add API support to swap TX and RX IQ for digital spectral inversion at higher frequencies on our wireless devices. This is for a proprietary IC running a Quantenna radio, the SDK of which is very likely not in-distribution. It was moderately impressive to me in part because just writing the IQ swap registers had a negative effect on performance, but the model found that swapping the order of the IQ imbalance coefficients fixed the performance degradation.
I wouldn't say this was the same level of "impressive" as what the hype demands, but I remain an enthusiastic user of AI tooling due to somewhat regular moments like that. Especially when it involves open weight models of a low-to-moderate param count. My original point though is that those moments are far more common in web dev than they are elsewhere currently.
EDIT: Forgot to add that the model also did some work that the original commit did not. It removed code paths that were clobbering the RX IQ swap register and instead changed it to explicitly initialize during baseband init so it would come up correct on boot.
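The underlying DSP trick, for anyone who hasn't run into it (a toy illustration, nothing to do with the proprietary SDK): swapping I and Q mirrors the spectrum, because Q + jI = j*conj(I + jQ), and conjugation flips the sign of every frequency.

    #include <cmath>
    #include <complex>
    #include <cstdio>

    int main() {
        const double PI = 3.14159265358979323846;
        const double f = 0.1;  // tone at +0.1 cycles/sample
        for (int n = 0; n < 4; ++n) {
            std::complex<double> s(std::cos(2 * PI * f * n),
                                   std::sin(2 * PI * f * n));
            std::complex<double> swapped(s.imag(), s.real());  // the IQ swap
            // swapped == j*conj(s): same tone, now at -f (spectral inversion)
            std::complex<double> check = std::complex<double>(0, 1) * std::conj(s);
            std::printf("n=%d swapped=(%+.3f,%+.3f) j*conj(s)=(%+.3f,%+.3f)\n",
                        n, swapped.real(), swapped.imag(),
                        check.real(), check.imag());
        }
    }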
Ah yes, the magic is more developed for commonly documented cases than for niche stuff. 100% agreed, and sorry I misinterpreted your post to mean they are not useful for embedded rather than less capable at embedded. Also, your stuff is way deeper than anything I am doing (motion planning is pretty well discussed in the online literature).
It's remarkable how these papers show a deep understanding of programming 50 years ago. Even with anemic hardware, the limit is always in the programmer's brain, as uncomfortable as that is to admit. Half a century of new tech (AI, the cloud, etc.) later, we still hit "terminal trauma" fairly quickly in the development cycle, almost like clockwork. All the tools and technical tricks don't seem to matter vs. our ability to hold the application in our heads.
Trying to get my Klipper-like thing for ClearCore up and running (currently called Cutter) so I can have a smaller, mostly unchanging C++ codebase on the MCU and do higher-level application control of motors, outputs, and sensors on Linux in Rust.
Should be the transitive dependencies, not just top-level (so the lock file or equivalent), or you just reward the "barely wrap it and give it a new name" JS crowd even more.
What’s the deal with 5V, 3.3V, and 24V “standards” for sensors? It seems like there are really three different markets, and it sucks because crossing “lanes” is really annoying. I like how you all made the Qwiic connectors “just work”, but now that I’m trying more industrial stuff I’m having a hard time figuring out how to get my 24V world to play nicely with the 3.3V world, and of course my 24V world only wants to do SPI over 5V.
Anyhoo, sorry we can’t just stick to the technical drama.
Those levels are based on the electronics themselves. Earlier circuits used TTL, which needed higher voltages to signify a "high". Newer CMOS-based electronics need less voltage.
Lower voltages help with power savings. Higher voltages can and do work better in high-power, high-noise environments though! 24V, as you see, is still very popular and useful in many applications.
> What’s the deal with 5v, 3.3v and 24v “standards” for sensors?
Historical garbage and different manufacturing technologies. Be happy if you can get away with only 5V and 3V3 rails in your project. 24V is usually to interface with industrial sensors. And sometimes you see 12V as well, for stuff that's RS232 based.
And on top of that you've got a fifth standard: 4..20 mA current loops. That one is used for long-range transmission of a single sensor's analog value per wire pair, with 4-20 mA mapping to the value (4 mA = 0%, 20 mA = 100%); anything less is read as a cable break, anything higher as a short circuit somewhere.
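In code the whole convention fits in a few lines (a sketch; the 3.8/20.5 mA fault thresholds here follow the common NAMUR NE43 convention, your transmitter may differ):

    #include <cstdio>
    #include <optional>

    // Map a 4..20 mA loop reading to percent of span, treating
    // out-of-range readings as wiring faults.
    std::optional<double> loop_to_percent(double mA) {
        if (mA < 3.8)  return std::nullopt;  // open circuit / cable break
        if (mA > 20.5) return std::nullopt;  // short or device fault code
        return (mA - 4.0) / 16.0 * 100.0;    // 4 mA -> 0 %, 20 mA -> 100 %
    }

    int main() {
        for (double mA : {0.0, 4.0, 12.0, 20.0, 22.0}) {
            auto pct = loop_to_percent(mA);
            if (pct) std::printf("%.1f mA -> %.1f %%\n", mA, *pct);
            else     std::printf("%.1f mA -> FAULT\n", mA);
        }
    }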
4 to 20mA signaling is only the start of a very specific rabbit hole. Someone had the brilliant idea to encode digital signals on top of the analog current loop. The result is the HART communication protocol, which is old, bloated, confusing, quirky - and it is really popular in industrial automation.
Those voltage standards are really meant for very local things. If you really get into industrial settings, look to the industrial standards that work over longer distances: things like RS-422, RS-485, and increasingly industrial versions of Ethernet that use differential signalling. One should also learn what a PLC is and understand that in an industrial context, implementing controls on an Arduino or RPi is probably reinventing the wheel to achieve less reliability than the industry standards.
4 to 20mA sensors are great. Invented in the 50s (!) to replace pneumatic controls, and to this day they work great. IIRC they are usually 24V these days. You missed an important detail: the first 4mA (96mW at 24V) powers the sensor/local microcontroller (no local power supply required), and the 4-20mA range gives a calibrated current output for the voltage/pressure/whatever you are measuring. If the output is less than 4mA or more than 20mA you know something is wrong (and many devices will output 20.1, 20.2, etc. as a kind of fault code).
Never used 2-10V, but I learned about 0-10V when someone approached me to design a device to take a 0-10V position signal as input and output two phase-shifted sinusoids, to retrofit a controller that only took resolver inputs. We shipped a couple dozen of them to repair broken machine tools. Fun project, but not going to get rich from it!
I'm guessing that the 2-10V is to detect line break conditions?
At least for the split between 3.3 and 5 volts, the levels depend on the fab process used to make the chip. Fab capacity is more widely available for the lower voltages, making it inconvenient to keep supplying higher-voltage chips. It has also gotten easier to do high-performance analog at lower voltages.
Yeah, it's a pain. Many of my boards have both 3.3 and 5 volt rails. There are quite a number of level-shifting logic buffers, for instance, that are powered from 3.3 V but accept 5 V inputs with no penalty.
For hobbyist type stuff, a 3.3 V CMOS chip will accept a 5 V logic signal if you feed it through a series resistor, since the built-in protection diodes of the CMOS chip will clamp the voltage. Don't let the engineers catch you doing it. ;-) But I often use a series resistor to provide a little bit of overload protection to a CMOS input.
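Back-of-envelope for why the trick is (usually) safe, with assumed values, since the real numbers vary per part and datasheet:

    #include <cstdio>

    int main() {
        const double v_in     = 5.0;    // 5 V logic high
        const double v_rail   = 3.3;    // CMOS supply
        const double v_diode  = 0.6;    // protection-diode drop (assumed)
        const double r_series = 10e3;   // 10 kohm series resistor (assumed)
        // Current pushed into the clamp diode once the input exceeds rail + diode:
        double i_clamp = (v_in - (v_rail + v_diode)) / r_series;
        std::printf("clamp current = %.0f uA\n", i_clamp * 1e6);
        // ~110 uA, well under the few-mA clamp ratings typical of CMOS inputs
    }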
Little or no logic ever operated at 24 V, other than relays. There's always some level translation needed there. The higher voltage follows the same rule as electric transmission lines: Correspondingly lower current allows for thinner wire, of importance if you're driving something like a solenoid valve.
So back in the day (ducks under thrown bottles and tomatoes) we had the standard 4000 series on 3V-18V and life was grand. The CMOS chips, less grand.
We had the 7400 series around then too, and boy howdy did that product line proliferate. The BJT TTL families got into so many things, but they demanded a Vcc of 5V ±5%, and I think it is safe to say anything expected to work with those chips standardized on 5V. Great, everyone is agre— ah.
74C is a CMOS line that happily uses 3V-15V. Ok, we can ignore that one, it's just a... nevermind, 74HC and friends do 2V-6V, neat. But they still work with 5V, so we're great.
So CMOS got better: we got gates down to 3V or less with competitive propagation timing, and it turns out that less power use is good for clocking. But we can't actually feed anything exactly 3V because of voltage drop, so it's 10% higher because... someone figured that was a nice round number or something?
As for why it keeps dropping... Thinner conductor channels, thinner insulation, you do the math. Yes, power consumption with more gates is a reason too, but I'm pinning the biggest blame badge on decreasing sizes in the process nodes.
Oh, you know all that already and want to know about just the sensor/interconnect split? My bad. Line drop, AFAIK. 3.3V signalling is fine on a PCB or when your wires are short. Want to move your sensors further away? Maybe 5V will be a tad more reliable. That sensor over 20m away? Sure, 24V sounds appealing. I don't know the exact lengths where voltage drop on the wiring is enough to cause issues, but I'm pretty sure that kind of reasoning is why it persists beyond simple logic compatibility.
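A rough worked example of that reasoning (all numbers assumed): the same cable drop that eats a big slice of a 3.3V budget barely dents 24V.

    #include <cstdio>

    int main() {
        const double ohms_per_m = 0.134;  // ~26 AWG copper (assumed)
        const double length_m   = 20.0;   // one-way cable run
        const double current_a  = 0.10;   // 100 mA sensor load (assumed)
        // Round trip: out on one conductor, back on the other.
        double drop = current_a * ohms_per_m * length_m * 2;
        std::printf("drop = %.2f V -> %.1f%% of 3.3 V, %.1f%% of 24 V\n",
                    drop, drop / 3.3 * 100, drop / 24.0 * 100);
    }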
As a secondary concern of the same form: with non-differential protocols, the signal voltages are something you know and control, but the various kinds of line noise are not. A data line offset by a volt is far more problematic at 3.3V than at 24V.
Oh, and if I'm driving a small industrial motor or actuator, or a 400V-rated relay, I definitely want to be doing so with more enthusiasm than 5V signifies. I also want less enthusiasm than electrocution because I touched a logic line. 24V will (just about) generally not give you a meaningful shock unless you try to lick it.
great question! so historically, microcontrollers (and sensors) were 5V 'CMOS' power and logic. this was way better than the up-to-12V for TTL logic, but over time the desire for higher clock speeds / faster IO / lower power means the voltage needs to drop (since power = current * voltage, lower voltage is lower power), so the next voltage standard became 3.3V. these days even 3.3V is a 'bit high' and we're seeing lots of devices that are 1.8V or 1.65V or even 1.2V max (yeek!). one thing we do for all of our sensor breakouts is add level shifting up/down as necessary so they work EITHER with older 5V boards (yay, no need to throw them out!) or with the newer 3.3V boards (woo, forward compatibility). level shifting and regulation also reduce the risk of damage from over/under-volting or plugging stuff in backwards. this is documented here: https://learn.adafruit.com/introducing-adafruit-stemma-qt/st...
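for the curious, the cheapest possible down-shift for a slow signal is just a resistor divider (values here are just an example; the breakouts use proper level-shifter chips for anything fast or bidirectional):

    #include <cstdio>

    int main() {
        // Simple divider to knock a 5 V output down for a 3.3 V input
        // (example values; too sluggish for fast buses like SPI at speed).
        const double r1 = 1000.0, r2 = 2000.0;  // top and bottom resistors
        double v_out = 5.0 * r2 / (r1 + r2);    // Vout = Vin * R2/(R1+R2)
        std::printf("divided level = %.2f V\n", v_out);  // ~3.33 V
    }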
maybe someone from SparkFun could post advice for you here too...
Ooh, thank you! I often forget that everything is a capacitor/resistor/inductor all at once, and I see how at higher frequencies that starts to matter! I think the 24V stuff is also more low-frequency signaling over longer distances, so rise/fall time is less of a worry but voltage drop / noise is perhaps more of one. Thanks!
Fwiw, I’m team adafruit on this. Hope it works out for y’all
Ah, yep. Did conflate coal with oil. I guess my nice analogy doesn't quite hold, but the point stands that plastic originally came from organic matter and is technically biodegradable.
https://wiki.c2.com/?PlanToThrowOneAway