• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: June 25th, 2023

  • Not insane. This is true. Iirc, there are hormonal changes in the urine that cause either the wheat or the barley to sprout first, depending on the sex of the fetus.

    The accuracy of this method is often overstated, though. Iirc, when it was actually tested, it turned out to be something like 75% accurate. For what it’s worth, that’s pretty accurate for the ancient world


  • Yes, and people do do it. It’s just incredibly difficult even for relatively simple programs, and the more complex the program, the harder the reverse engineering becomes, roughly exponentially so.

    The problem is not necessarily turning the binary back into code, since many decompilers already do that for you nowadays. The issue is understanding what in the world the code is supposed to do. Normally, open source code is commented and documented, so it’s easy to edit or build on. Decompiled code comes with no documentation or comments, and all the variable names are virtually illegible (see the sketch below).

    It’s sometimes easier to build something new than to fix what’s broken, and this would be one of those cases where it’s true
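
    To make that concrete, here’s a hypothetical before/after sketch in C. The second half is entirely made up, but it mimics the auto-generated naming style of tools like Ghidra; real output varies by decompiler, compiler, and optimization settings.

    ```c
    /* What the original programmer might have written: */
    int average(const int *scores, int count) {
        int sum = 0;
        for (int i = 0; i < count; i++) {
            sum += scores[i];   /* accumulate the scores */
        }
        return sum / count;     /* integer average (assumes count > 0) */
    }

    /* Roughly how a decompiler might render the same compiled function:
       the comments are gone, the types are guesses, and every name is
       machine-generated from a memory address. */
    long FUN_00401020(long param_1, int param_2) {
        int iVar1;
        int iVar2;
        iVar1 = 0;
        for (iVar2 = 0; iVar2 < param_2; iVar2 = iVar2 + 1) {
            iVar1 = iVar1 + *(int *)(param_1 + (long)iVar2 * 4);
        }
        return (long)(iVar1 / param_2);
    }
    ```

    The logic is identical, but imagine reconstructing intent from thousands of functions that all look like the second one. That’s where the reverse engineering time actually goes.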



  • Not gaslighting, and from what you describe, it doesn’t appear to be manipulative either. She just seems to be angry. Not to say that you can’t be both angry and manipulative, but I don’t see clear intent on her part to guilt trip or gaslight you.

    Gaslighting would be if she lied and said that she sent you a message when in fact she didn’t. i.e., lying with the intent to make you question your judgment and perception

    Guilt tripping would be if she pressured you into giving her a gift as compensation for ignoring her message. i.e., taking advantage of someone’s feelings of guilt to get them to do something for you.

    I don’t see any lie, and I don’t see her trying to extract anything out of you. Worst case interpretation, she’s being a bit petty. Best case interpretation, she’s scared of being alone outside.

    I noticed your final paragraph, and I’d be cautious in general about calling someone manipulative just because they’re trying to convince you that their anger is justified. That’s kind of just how anger works: people think their anger is justified, otherwise they wouldn’t be angry. Manipulation is when you start to feel like you’re being used to serve their own motives.

    Either way, you should probably talk to her about it. It seems like she sees the issue as more serious than you do, and that’s something worth discussing with her



  • Not a paleontologist, but I think it’s a mix of wrong information being spread back then and new info being discovered since.

    I’m pretty sure it had been known for a while that birds were dinosaurs, but people just liked the idea of dinosaurs as monstrous lizards. Giant monsters capture the imagination in a way that giant birds can’t.

    And then paleontologists started finding fossils with imprints of feathers still on the body, and it became really hard to ignore that dinosaurs were a lot more bird-like than people wanted to believe.

    My impression has generally been that once dinosaurs started to be viewed as bird-like, people started to see them as animals rather than as monsters, and that just kinda snowballed into dinosaurs becoming more and more bird-like




  • Asking ChatGPT for advice about anything is generally a bad idea, even though it might feel like a good idea at the time. ChatGPT responds with what it thinks you want to hear, just phrased in a way that sounds like actual advice. And since ChatGPT only knows as much as you’re willing to tell it, its input is often biased too. It’s like an r/relationshipadvice or r/AITA thread, but on steroids.

    You think it’s good advice because it’s what you wanted to do to begin with, and it’s phrased in a way that makes your decision seem like the wise choice. Really, though, sometimes you just need to hear the ugly truth that you’re making a bad choice, and that’s not something ChatGPT will tell you.

    Anyways, I’m not saying that bosses are good at giving advice, but I think ChatGPT is definitely not better at it than they are.