• 0 Posts
  • 29 Comments
Joined 1 year ago
Cake day: July 10th, 2023

  • That’s a valid point, the dev cycle is compressed now and customer expectations are low.

    So instead of putting in the long term effort to deliver and support a quality product, something that should have been considered a beta is just shipped and called “good enough”.

    A good example I guess would be a long term embedded OSS project like Tasmota, compared to the barely functional firmware that comes stock on the devices that people buy to reflash to Tasmota.

    Still, there are few things that frustrate me like a Bluetooth device that really shouldn’t have been a Bluetooth device, one with non-deterministic behaviour due to a lack of initialization or some other trivial fault. Why did the tractor work lights turn on purple today? Nobody knows!


  • My type is a dying breed too: the guys who do their best to write robust code and actually try to consider edge cases, race conditions, properly sized variables and efficient use of cycles, all the things embedded guys have done as “embedded” evolved from the 6800 to PIC, Atmel and then the ESP platforms.

    Now people seem to have embraced “move fast and break things”, but that’s the exact opposite of how embedded is supposed to be done. Don’t get me wrong, there is some great ESP code out there, but there’s also a shitload of buggy and poorly documented libraries, and devices that require far too many power cycles to keep functioning.

    In my opinion, one power cycle is too many in the embedded world. Your code should not leak memory. We grew up with BYTES of RAM to use; memory leaks were unthinkable!
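
    To make that concrete, here’s the kind of pattern I mean, as a minimal sketch (the type and sizes are invented for illustration): allocate everything statically up front, so worst-case RAM use is known at build time and there is simply nothing to leak.

    ```cpp
    #include <cstdint>
    #include <cstddef>

    // Fixed-capacity ring buffer: no malloc/new anywhere, so there is
    // nothing to leak and RAM use is fixed at link time.
    template <typename T, std::size_t N>
    class RingBuffer {
        T buf_[N];
        std::size_t head_ = 0, tail_ = 0, count_ = 0;
    public:
        bool push(const T& v) {
            if (count_ == N) return false;   // full: caller decides what to drop
            buf_[head_] = v;
            head_ = (head_ + 1) % N;
            ++count_;
            return true;
        }
        bool pop(T& out) {
            if (count_ == 0) return false;   // empty
            out = buf_[tail_];
            tail_ = (tail_ + 1) % N;
            --count_;
            return true;
        }
    };

    // One statically allocated instance, sized at compile time.
    static RingBuffer<std::uint16_t, 32> adc_samples;

    int main() {
        adc_samples.push(123);
        std::uint16_t v;
        return adc_samples.pop(v) ? 0 : 1;
    }
    ```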

    And don’t get me started on the appalling mess that modern engineers can make with function blocks inside a PLC, or their seeming lack of knowledge of industrial control standards that existed before the PLC did.







  • These microplastics are digestible by your immune system, though, which makes them ultimately harmless. PLA is used for drug delivery for this reason.

    Being concerned about incomplete PLA degradation is like being concerned about a piece of wood breaking down into micro-woods. Even if you get a dangerous shard of micro-wood embedded in your skin, your body can deal with that cellulose polymer just fine.

    Ultimately it will break down completely someday and in the meantime, nothing will be harmed.


  • I love the term “write-only code”, it’s perfect. I used to love Perl as it felt like it flowed straight from my brain into the keyboard. What a free and magical language.

    So it turned out I had ADHD. Took meds, went back to C/++ with renewed appreciation, and haven’t touched Perl since, as it horrifies me to look at it. What a nightmare of dangling references and questionable typing. Any language that allows you to cast a string to a function and call it really needs to sit down and think about what it’s doing.


  • If you don’t want memory-safe buffer overruns, don’t write C/C++.

    Fixed further?

    It’s perfectly possible to write C++ code that won’t fall prey to buffer overruns. C is a lot harder. That said, yes, it’s far from memory-safe: you can still do stupid things with pointers and freed memory if you want to.
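
    A trivial sketch of what I mean (nothing project-specific): the standard containers will bounds-check for you if you ask, where a raw array would happily overrun.

    ```cpp
    #include <array>
    #include <cstdio>
    #include <stdexcept>

    int main() {
        std::array<int, 4> a{1, 2, 3, 4};

        // a[10] would be undefined behaviour, exactly like a raw C array.
        // at() checks the index and throws instead of overrunning.
        try {
            std::printf("%d\n", a.at(10));
        } catch (const std::out_of_range&) {
            std::puts("caught out-of-range access");
        }
        return 0;
    }
    ```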

    I’ll admit that, having grown up with C, I still have a love for some of its oh-so-simple features, like structs. For embedded work, give me a packed struct over a complex serialization library any day.
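
    As a sketch of what I mean (field names invented, and the usual caveats apply: packed attributes are compiler-specific, and you still have to agree on endianness between hosts):

    ```cpp
    #include <cstdint>
    #include <cstring>
    #include <cstdio>

    // One wire message == one struct. The packed attribute (GCC/Clang)
    // removes padding, so the in-memory layout IS the wire layout.
    struct __attribute__((packed)) SensorMsg {
        std::uint8_t  id;
        std::uint16_t temp_centi_c;   // temperature in 0.01 degC steps
        std::uint32_t uptime_s;
    };
    static_assert(sizeof(SensorMsg) == 7, "packing removed all padding");

    int main() {
        SensorMsg m{1, 2534, 86400};          // 25.34 degC, one day of uptime
        std::uint8_t wire[sizeof m];
        std::memcpy(wire, &m, sizeof m);      // "serialize": just copy it out
        SensorMsg back;
        std::memcpy(&back, wire, sizeof m);   // "deserialize": copy it back
        std::printf("%u %u %lu\n", (unsigned)back.id,
                    (unsigned)back.temp_centi_c, (unsigned long)back.uptime_s);
        return 0;
    }
    ```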

    I tend to write a hybrid of the two languages for my own projects, and I’ll be honest I’ve forgotten where exactly the line lies between them.


  • A million tiny decisions can be just as damaging. In my limited experience with several different local and cloud models, you have to review basically all of the output, as they can confidently introduce small errors. Often the code will compile and run, but it has small errors that cause the output to drift, or the aforementioned long-run overflow-type errors.

    Those are the errors that junior or lazy coders will never notice and walk away from, causing hard-to-diagnose failures down the road. And because the code “looks fine”, reviewers would need to go over it with a fine-toothed comb, which only happens in critical industries.
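
    A concrete (hypothetical) example of the kind of thing I mean: a timestamp comparison that passes every test run and then breaks weeks later, the first time the tick counter wraps.

    ```cpp
    #include <cstdint>
    #include <cstdio>

    // Looks fine in review and passes every short test run:
    bool due_buggy(std::uint32_t now, std::uint32_t last, std::uint32_t interval) {
        return now > last + interval;     // wrong once last + interval wraps 2^32
    }

    // Correct for a free-running counter: unsigned subtraction also wraps,
    // so the elapsed time stays right across the rollover.
    bool due_ok(std::uint32_t now, std::uint32_t last, std::uint32_t interval) {
        return now - last >= interval;
    }

    int main() {
        std::uint32_t last = 0xFFFFFF00u;   // just before the counter wraps
        std::uint32_t now  = 0x00000050u;   // just after: 336 ticks later
        std::printf("buggy: %d  ok: %d\n",
                    due_buggy(now, last, 100), due_ok(now, last, 100));
        return 0;
    }
    ```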

    I will only use AI to write comments and documentation blocks, and to get jumping-off points for algorithms I don’t keep in my head (“write a function to sort this array”). It’s better than Stack Exchange for that, IMO.


  • I tried using AI tools to do some cleanup and refactoring of some legacy embedded C code and was curious if it could do any optimization or knew any clever algorithms.

    It’s pretty good at figuring out what the code does and adding comments, and it did some decent refactoring of a few sections to make them more readable.

    It has no clue how to work in a resource-constrained environment, or about the main concepts that separate embedded from everything else: namely that the code has to run “forever”, operate in real time on a constant flow of sensor data, and do it all while nobody else takes care of your memory management.

    It even explained to me that we could do input filtering with big averaging arrays on a device with only 1 kB of RAM, or use a long long for a never-reset accumulator without worrying about what happens when it wraps, because “it will be years before it overflows”.
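
    For contrast, the textbook answer on a part like that is an exponential moving average: a few bytes of bounded state instead of a sample array, and nothing that can overflow no matter how long it runs. A sketch (the smoothing constant is just an example):

    ```cpp
    #include <cstdint>

    // EMA filter in Q8 fixed point: avg += (sample - avg) / 2^k.
    // Four bytes of bounded state, no sample buffer, and the state is
    // bounded by the input range, so it can run forever.
    class Ema {
        std::int32_t acc_ = 0;        // running average, Q8 fixed point
        static constexpr int k = 3;   // smoothing: 1/8 of the error per sample
    public:
        void update(std::int16_t sample) {
            std::int32_t s = static_cast<std::int32_t>(sample) * 256;  // to Q8
            acc_ += (s - acc_) >> k;
        }
        std::int16_t value() const {
            return static_cast<std::int16_t>((acc_ + 128) / 256);      // rounded
        }
    };

    int main() {
        Ema filter;
        for (int i = 0; i < 100; ++i)
            filter.update(1000);                 // steady "sensor" reading
        return filter.value() == 1000 ? 0 : 1;   // converges to the input
    }
    ```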

    AI buddy, some of these units have run for decades without a power cycle. If lazy coders start dumping AI output into embedded systems the whole world is going to get a lot more glitchy.



  • We’re talking about replacing lost content here, though. As such, you can use the streaming services as a “backup” by re-ripping your whole collection if you lose it.

    I’m actually doing this now as part of a library cleanup. Zotify + beets are a great combo to pull down vast quantities of music and properly sort and tag it.

    Then I stream it to my phone in my truck using Ampache and Ultrasonic, which does have a local buffering option.

    However, if you have some exotics, rips of rare discs, demos or pre-releases, live recordings with sentimental value, etc., I would suggest keeping those properly backed up. I don’t have many of these, but the ones I do have are backed up both to the cloud and offsite.



  • I really don’t see how building a docker container afterward makes it easier

    What it’s supposed to make easier is both sandboxing and reuse / deployment. For example, Docker + Traefik makes some tasks so incredibly easy and secure compared to running them on bare metal. Or if you need to spin up multiple instances, they can be created and destroyed in seconds. Without the container, this just isn’t feasible.

    The Dockerfile uses MySQL because it works. If you want to know whether the core service works with PostgreSQL, that’s not really on the guy who wrote the Dockerfile; that’s on the application maintainer. Read the docs, do some testing, and create your own container with its own PostgreSQL, or connect to an external database if that suits your needs better.

    Once again, the flexibility of bind mounts means you could often drop that external database right on top of the one in the container. That’s the real beauty of Docker, IMO: being able to slot containers into your system seamlessly thanks to the mount system.

    adapting can be a pita when the package is built around a really specific environment

    That’s the great thing about Docker: it lets you bring that really specific environment anywhere, and in an incredibly lightweight manner compared to the old days of heavyweight VMs. I’ve even got Docker containers running on a Raspberry Pi B+ that is otherwise so old it would be nearly impossible to install the libraries required to run modern software.


  • The image generation can be cheap, but I was imagining this sort of watermark wouldn’t be so much a visible part of the image as an embedded signature that hashes the image.

    Require enough proof of work (PoW) to generate the signature, and this would at least cut down the volume of images created, and possibly limit them to groups or businesses with clusters that could be monitored, without clamping down on image generation in general.
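
    Something along these lines, purely as a sketch: find a nonce such that the hash of the image plus the nonce has N leading zero bits. Verification is a single hash; generation costs around 2^N hashes. (FNV-1a here is a stand-in for a real cryptographic hash like SHA-256, and the difficulty constant is arbitrary.)

    ```cpp
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // FNV-1a over the image bytes plus a nonce. A stand-in only: a real
    // scheme would use a proper cryptographic hash (SHA-256 etc.).
    std::uint64_t hash_with_nonce(const std::vector<std::uint8_t>& data,
                                  std::uint64_t nonce) {
        std::uint64_t h = 14695981039346656037ull;
        auto mix = [&](std::uint8_t b) { h ^= b; h *= 1099511628211ull; };
        for (auto b : data) mix(b);
        for (int i = 0; i < 8; ++i)
            mix(static_cast<std::uint8_t>(nonce >> (8 * i)));
        return h;
    }

    // Proof of work: search for a nonce giving `zero_bits` leading zero bits.
    // Expected cost ~2^zero_bits hashes, which is the throttle on bulk output.
    std::uint64_t stamp(const std::vector<std::uint8_t>& image, int zero_bits) {
        for (std::uint64_t nonce = 0;; ++nonce)
            if (hash_with_nonce(image, nonce) >> (64 - zero_bits) == 0)
                return nonce;
    }

    int main() {
        std::vector<std::uint8_t> image{0xDE, 0xAD, 0xBE, 0xEF};  // placeholder pixels
        std::uint64_t nonce = stamp(image, 20);                   // ~1M hash attempts
        std::printf("nonce %llu -> hash %016llx\n",
                    static_cast<unsigned long long>(nonce),
                    static_cast<unsigned long long>(hash_with_nonce(image, nonce)));
        return 0;
    }
    ```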

    A modified version of what you mentioned could work too, but where just these specific images have to be vetted and signed by a central authority using a private key. Image generation software wouldn’t be restricted for general purposes, but no signature on suspicious content and it’s off to jail.


  • In this specific scenario, you wouldn’t want to remove the watermark.

    The watermark would be the only thing that defines the content as “harmless” AI-generated content, which for the sake of discussion is being presented as legal. Remove the watermark, and as far as the law knows, you’re in possession of real CSAM and you’re on the way to prison.

    The real concern would be adding the watermark to the real thing, to let it slip through the cracks. However, not only would this be computationally expensive if it was properly implemented, but I would assume the goal in marketing the real thing could only be to sell it to the worst of the worst, people who get off on the fact that children were abused to create it. And in that case, if AI is indistinguishable from the real thing, how do you sell criminal content if everyone thinks it’s fake?

    Anyways, I agree with other commenters that this entire can of worms should be left tightly shut. We don’t need to encourage pedophilia in any way. “Regular” porn has experienced selection pressure to the point where taboo is now mainstream. We don’t need to create a new market for bored porn viewers looking for something shocking.


  • Thanks for your input. C# is a language I never really considered, but it does sound like a good middle ground, and possibly a good successor to Python for her: very popular, powerful, and a better approach to a “true OOP” language than Java, IMO. Though, as you say, modern Java has come a long way from its origins.

    overusing global/shared variables

    I see you’ve been reviewing my Python code, lol. The structure of the language does lend itself to using globals as a shortcut where they shouldn’t be… And as a primarily embedded dev, I’ll admit I’m already a heavier user of globals than most. But I agree, being able to declare global variables inside a function is pretty gross, as is the scoping/declaration issue where you can easily end up with global and local variables of the same name without so much as a warning.


  • if you are trying to learn software engineering it is not a good language to start out with

    Curious what options you would suggest instead? I’m an old C/++ embedded diehard, but I do use Python and have been considering it as the next step for my 9yo daughter after Scratch.

    Python feels like the modern replacement for the BASIC I grew up with as a kid: interpreted, garbage-collected, good library support, sane typing, and not too wordy or confusing. Lots of options for doing fun things with it, from games to robots.

    IMO, for a young beginner the C-likes are too strict and segfault-y, Perl is too permissive and could breed sloppy habits, BASIC is obsolete, the web languages are all way too application-specific, I haven’t had a chance to get into Rust yet, and fuck Java as a matter of principle lol.