• 0 Posts
  • 33 Comments
Joined 9 months ago
Cake day: April 7th, 2025


  • vivendi@programming.dev to Programmer Humor@programming.dev · “average c++ dev”
    5 months ago

    Yeah, and those are just the ones identified so far (btw, that issue isn’t completely fixed), because Rust never was sound, and never advertised itself as sound. Meaning you have to be careful when writing Rust code too. Not as much as with C++, but it’s not the magical shield against memory problems that people have been shilling it as.
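    That caution is easiest to see at the `unsafe` boundary: the borrow checker only vouches for safe Rust, and an `unsafe` block opts out of its checks entirely. A minimal sketch (function names are mine, purely illustrative):

```rust
/// Safe Rust: bounds and lifetimes are verified by the compiler.
fn sum_safe(v: &[i32]) -> i32 {
    v.iter().sum()
}

/// The same computation through raw pointers. This compiles and works,
/// but the compiler no longer proves the indexing in bounds: change
/// `v.len()` to `v.len() + 1` and you get undefined behavior, not a panic.
fn sum_unsafe(v: &[i32]) -> i32 {
    let mut total = 0;
    let p = v.as_ptr();
    for i in 0..v.len() {
        total += unsafe { *p.add(i) };
    }
    total
}

fn main() {
    assert_eq!(sum_safe(&[1, 2, 3]), sum_unsafe(&[1, 2, 3]));
    println!("both sums agree: {}", sum_safe(&[1, 2, 3]));
}
```

    Soundness bugs in Rust itself are the rarer, nastier case: *safe* code that manages to reach this kind of unchecked behavior anyway.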

  • vivendi@programming.dev to Programmer Humor@programming.dev · “average c++ dev”
    edited · 5 months ago

    No, there is not. Borrow checking and RAII existed in C++ too, and there is no formal axiomatic proof of their safety in the general case, only within a very clearly defined scope.

    In fact, someone found memory bugs in Rust, again, because it is NOT soundly memory-safe.

    Dart is soundly null-safe: the compiler can guarantee it never compiles null-unsafe code unless you explicitly opt out. Kotlin is merely null-safe, meaning it can still run into bullshit null conditions at runtime.

    The same goes for Rust: don’t let it lull you into a sense of security that doesn’t exist.
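    For the null case specifically, Rust’s `Option` works the way Dart’s sound null safety works: the absent case is a compile-time obligation, not a runtime surprise. A tiny sketch (the function is hypothetical):

```rust
/// `Option<&str>` is Rust's "nullable string". The compiler rejects any
/// code path that uses the value without first handling `None`.
fn len_or_zero(s: Option<&str>) -> usize {
    match s {
        Some(text) => text.len(),
        None => 0, // omitting this arm is a compile error, not an NPE
    }
}

fn main() {
    assert_eq!(len_or_zero(Some("hi")), 2);
    assert_eq!(len_or_zero(None), 0);
    println!("null case handled at compile time");
}
```

    Note that this guarantee covers null specifically; it is not a soundness proof for the language as a whole.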


  • Ignoring warnings is really not a good way to deal with it. If a compiler is bitching about something, there is a reason for it.

    A lot of the time the devs are too overworked, or a little underloaded in the supply of fucks to give, so they ignore them.

    In some really high-quality codebases, they turn on “treat warnings as errors” to enforce better code.
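    In Rust, that policy is a single crate-level lint attribute (or, commonly in CI only, `RUSTFLAGS="-D warnings"`). A sketch of what it looks like:

```rust
// Crate-level lint: promote every warning to a hard error, so silently
// ignoring them stops being an option. Many teams enable this only in CI
// so that local iteration stays fast.
#![deny(warnings)]

fn main() {
    // let unused = 1; // uncommented, this would now FAIL the build
    println!("clean build");
}
```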

  • Every image has a few color channels/layers. In a natural photograph, the noise patterns in these layers differ; in AI diffusion output, those layers will be uniform.
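    A crude way to see that intuition (this is my toy estimator, not what real detectors use): score each channel’s noise as the variance of adjacent-pixel differences, then compare the scores across channels:

```rust
/// Toy noise estimate for one color channel: variance of adjacent-pixel
/// differences. Flat regions score ~0; noisy ones score higher.
fn channel_noise(channel: &[f64]) -> f64 {
    let diffs: Vec<f64> = channel.windows(2).map(|w| w[1] - w[0]).collect();
    let mean = diffs.iter().sum::<f64>() / diffs.len() as f64;
    diffs.iter().map(|d| (d - mean).powi(2)).sum::<f64>() / diffs.len() as f64
}

fn main() {
    let red = [0.50, 0.52, 0.49, 0.51];  // slightly noisy channel
    let blue = [0.50, 0.50, 0.50, 0.50]; // perfectly flat channel
    // Differing per-channel scores would be consistent with sensor noise;
    // near-identical scores across all channels would not be.
    println!("red: {:.5}, blue: {:.5}", channel_noise(&red), channel_noise(&blue));
}
```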

    One thing you can do is overlay noise that resembles features that don’t exist (generated with e.g. Stable Diffusion) inside the color channels of a picture. This will make AI see features that don’t exist.

    Nightshade layers a form of feature noise on top of an image as an alpha-inlaid pattern, which makes the image quality look ASS, and it’s also defeated if a model is specifically trained to remove Nightshade.

    Ultimately this kind of stupid arms-race shit is futile. We need to adopt completely new paradigms for completely new situations.