• 1 Post
  • 136 Comments
Joined 2 years ago
Cake day: July 7th, 2023



  • Yup. For minor issues, first aid is all that is needed; you don’t need to see a doctor for a minor cut, as long as the first aid ensures it’s not infected. But for larger things, secondary aid is what provides more long-term recovery.

    If someone dislocates a shoulder, first aid is putting it in a sling and bracing it against the body, so it doesn’t get worse (for instance, the tendons and ligaments in the shoulder joint can tear) before they can get to a hospital.

    If someone is massively bleeding, first aid is stopping the bleeding to keep them alive until they can get rescued.


  • It can be, yes. One of the biggest complaints about Docker is that you often end up running the same dependencies a dozen times, because each of your dozen containers uses them. But the trade-off is that you can run a dozen different versions of those dependencies, because each image ships with the specific version it needs.

    Of course, the big issue with running a dozen different versions of dependencies is that it makes security a nightmare. You’re not just tracking exploits for the most recent version of what you have installed. Many images end up shipping with out-of-date dependencies, which can absolutely be a security risk under certain circumstances. In most cases the risk is mitigated by the fact that the services are isolated and don’t really interact with the rest of the computer. But it’s at least something to keep in mind.
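
    As a rough sketch of what that looks like in practice (two hypothetical services, A and B, pinning different versions of the same Python library; the names and version numbers here are just for illustration, not from any real project):

    ```
    # service-a/Dockerfile
    FROM python:3.12-slim
    RUN pip install "requests==2.31.0"   # newer pin this service was tested against
    CMD ["python", "-c", "import requests; print('service A uses requests', requests.__version__)"]

    # service-b/Dockerfile
    FROM python:3.12-slim
    RUN pip install "requests==2.25.1"   # older pin; isolated in its own container, but still worth tracking for exploits
    CMD ["python", "-c", "import requests; print('service B uses requests', requests.__version__)"]
    ```

    Both containers run side by side without conflicting, but that older pin in service B is exactly the kind of out-of-date dependency you have to keep an eye on.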




  • Yeah, toxins are often the bigger risk when dealing with bacterial or fungal issues.

    For instance, botulism is caused by the toxin produced by botulinum bacteria. The toxin is a paralytic. The bacteria itself can typically be dealt with by the immune system, but the toxin wreaks havoc on the nervous system.

    That’s also why you should never feed honey to babies; botulinum spores are commonly found in honey. Babies’ immune systems aren’t equipped to deal with the botulinum bacteria, which allows it to bloom and start producing the toxin after they ingest it. This causes something called Floppy Baby Syndrome, from the baby being paralyzed by botulism toxin.


  • PM_Your_Nudes_Please@lemmy.world to memes@lemmy.world · Name them

    I heard a very similar story, except it was one Italian grandma with a bunch of dudes in suits. She proceeded to serve him the single largest, most elaborate, and most delicious Italian dinner he had ever had. Apparently he could see into the kitchen, and she was making everything from scratch. He was there for like two hours, and she just kept bringing more plates out even though he hadn’t actually ordered anything. All because she was so excited to finally have someone to cook for. She even sat with him to chat, and was clearly happy to just have someone other than the angry-looking dudes in suits to talk to. IIRC the suits didn’t even take payment before he was ushered out the door.

    He tried to go back like a week later, but the place was totally deserted.








  • Yeah, this can be an unpopular opinion on Lemmy, because there’s a giant Linux circlejerk. But the unfortunate reality is that switching to Linux does have some major stumbling blocks. The “switching is so easy, just do it” crowd totally glosses over them, but that kind of rhetoric doesn’t help long-term adoption. Because if a new user has only heard “switching is so easy” and immediately runs into issues, they’ll be more likely to go “well, if it’s super easy and I can’t figure it out, I guess it’s just not for me” and abandon things.

    There’s also a very vocal (and toxic) part of the Linux community that basically just screams “RTFM” at every newbie question. New users shouldn’t be expected to dig into a 350-page technical document just to learn the basics of their new OS.





  • The fact that you don’t need to actually supply any real CSAM to the training material is the reasoning being offered for supporting AI CSAM. It’s gross, but it’s also hard to argue with.

    Yeah, this is basically the crux of the issue. When you get into the weeds and start looking past the surface-level “but it needs CSAM to make CSAM” misconception, arguments against it basically boil down to “but it’s icky.” Which… Yeah. It is. But should something being icky automatically make it illegal, even if there are no victims?

    I hate to make the comparison (for a variety of reasons) but until fairly recently homosexuality was psychologically classed as a form of destructive/dangerous kink. Largely because straight people had the same “but it’s icky” response whenever it got brought up. And we have tried to move away from that as time has passed, because we have recognized that being gay is not just a kink, it’s not just a choice, and it’s not inherently dangerous or harmful.

    To contrast that, pedophilia has remained stigmatized. Because even if it passed the first two “it’s not just a kink/choice” tests, it still failed the “it’s not harmful” test. Consuming CSAM was inherently harmful, and always had a victim. There was no ethical way to view CSAM. But now with AI, it can actually begin passing that third test as well.

    I don’t know how I feel about it, myself. The idea of “ethically-sourced” CSAM doesn’t exactly sit right with me, but if it’s possible to make it in a truly victimless manner, then I find it hard to argue outright banning something just because I don’t like it.

    This is really the biggest hurdle. To be clear, I’m not arguing that being an active pedo should be decriminalized. But it is worth examining whether we’re basing criminality purely off of the instinctual “but it’s icky” response that the public has when it gets discussed. And is that response enough of a justification for making/keeping it illegal? And if your answer to that was “yes”, what if it could help pedos avoid consuming real CSAM, and therefore reduce the number of future victims? If it could legitimately help reduce the number of victims but you still want to criminalize it, then you are not actually focused on reducing harm; you’re focused on feeling righteous instead.

    The biggest issue right now is that harm reduction is very hard to study, because it is such a taboo topic. Even finding subjects to self-report is difficult or impossible. So we’ll have no idea what kinds of impacts on CSAM consumption (positive or negative) AI will realistically have until after it is widely available.