• kromem@lemmy.world

    Artists in 2023: “There should be labels on AI modified art!!”

    Artists in 2024: “Wait, not like that…”

  • Hawke@lemmy.world

    Better title: “Photographers complain when their use of AI is identified as such”

    • Valmond@lemmy.world

      “It was just such a little itsy bitsy teeny weeny AI edit!!”

      Please don’t flag it as AI, please!

  • hperrin@lemmy.world

    The label is accurate. Quit using AI if you don’t want your images labeled as such.

    • BigPotato@lemmy.world

      Right? I thought I was going crazy when I got to “I just used Generative Fill!” Like, he didn’t just auto-adjust the exposure and black levels! C’mon!

  • pyre@lemmy.world

    Or… don’t use generative fill. If all you did was remove something, regular methods do more than enough. With generative fill you can just select a part and say “now add a polar bear.” There’s no way of knowing how much has changed.

    • thedirtyknapkin@lemmy.world

      There’s a lot more than generative fill.

      AI denoise, AI masking, AI image recognition and sorting.

      Hell, every phone applies some kind of “AI-enhanced” noise reduction by default these days. These are just better versions of existing tools that have been used for decades.

  • A_Very_Big_Fan@lemmy.world

    We’ve been able to do this for years, way before the fill tool utilized AI. I don’t see why it should be slapped with a label that makes it sound like the whole image was generated by AI.

  • IIII@lemmy.world

    Can’t wait for people to deliberately add the metadata to their images as a meme, so that a legit photograph made without any AI gets the unremovable “Made with AI” tag.
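
    For context on the mechanism this joke relies on: the label is reportedly driven by provenance metadata embedded in the file (C2PA credentials and the IPTC digital source type), so stamping an untouched photo would be trivial. A minimal sketch, assuming exiftool is installed and assuming the IPTC field below is one the platform actually keys on (an assumption, not confirmed):

    ```python
    # Hypothetical sketch: stamp a plain photograph with the IPTC "digital source
    # type" value that generative tools write. Assumes exiftool is on PATH and
    # assumes the platform keys its "Made with AI" label on this field.
    import subprocess

    AI_SOURCE_TYPE = (
        "http://cv.iptc.org/newscodes/digitalsourcetype/"
        "compositeWithTrainedAlgorithmicMedia"
    )

    def stamp_as_ai(path: str) -> None:
        """Write the digital source type into the image's XMP metadata."""
        subprocess.run(
            ["exiftool", f"-XMP-iptcExt:DigitalSourceType={AI_SOURCE_TYPE}", path],
            check=True,
        )

    stamp_as_ai("untouched_photo.jpg")  # hypothetical file name
    ```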

  • harrys_balzac@lemmy.dbzer0.com

    Why many word when few good?

    Seriously though, “AI” itself is misleading, but if they want to be ignorant and whiny about it, then they should be labeled just as they are.

    What they really seem to want is an automatic metadata tag that is more along the lines of “a human took this picture and then used ‘AI’ tools to modify it.”

    That may not work, because Adobe products overwrite the original metadata, so Thotagram doesn’t know that a photographer took the original.

    A photographer could actually just type a little explanation (“I took this picture and then used Gen Fill only”) in a plain text document, save it to their desktop, and copy & paste it in.

    But then everyone would know that the image had been modified - which is what they’re trying to avoid. They want everyone to believe that the picture they’re posting is 100% their work.

  • conciselyverbose@sh.itjust.works

    This isn’t really Facebook. This is Adobe not drawing a distinction between smart pattern recognition for backgrounds/textures and real image generation of primary content.

  • glimse@lemmy.world

    This would be more suited for asklemmy; this community isn’t for opinion discussions.

  • Uncaged_Jay@lemmy.world

    I saw this coming from a mile away. We will now have to set standards for what counts as “made by AI” versus “made with AI.”

  • Pika@sh.itjust.works

    I’m not sure I understand the complaint. Is the tag not accurate? If you use AI to make something, are you not making it with AI? If I use strawberries to make a cake, would the tag “made with strawberries” be inaccurate?

    I fail to see the argument: if you don’t want an accurate label, don’t use the tool; otherwise, deal with it.

    • efstajas@lemmy.world

      I do think it’s a problem when 100% of people seeing “made with AI” will assume the entire thing is AI-generated, even if all you did was use AI for a minor touch-up. If it’s really that trigger-happy right now, I think it’d make sense for it to be dialled down a bit.

    • Solemn@lemmy.dbzer0.com

      The biggest use of AI in my editing flow is masking. I can spend half an hour selecting all the edges of a person as well as I can, or I can click the button to select people. Either way I do the rest of my edits as normal.

    • Bertuccio@lemmy.world

      The complaint the photographer is making is that it’s an actual photograph where a small portion is made or changed with AI.

      They list expanding the edges of the image to change the aspect ratio, and removing flaws or unwanted objects, etc.

      Removing flaws and objects, at least, is a task that predates modern computers (people retouched the actual negatives), and the tools for it have improved so much that a computer can basically do it all for you.

      I think people should just say how they modified the image, AI or not, since airbrushed skin, artificial slimming, and the like were common complaints long before AI manipulation, and AI just makes those same problematic edits easier.

  • PhlubbaDubba@lemm.ee

    I agree pretty heartily with this metadata-signing approach to sussing out AI content:

    Create a cert org that verifies that a given piece of creative software properly signs work made with its tools, get eyeballs on the cert so consumers know to look for it, and watch and laugh while everyone who can’t get the cert starts claiming they’re being censored because nobody trusts any of their shit anymore.

    Bonus points if you can get the largest social media companies to accept only signed content and to flag it when the signature indicates photoshopping, AI work, or removal of another artist’s watermark.
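
    A minimal sketch of the cert idea, assuming Ed25519 keys and a toy registry that maps certified tools to their public keys; the real-world equivalent (C2PA / Content Credentials) is considerably more elaborate, and all names here are illustrative:

    ```python
    # Toy version of the cert-org flow: the org issues a key to a certified editor,
    # publishes the public key, the editor signs its exports, and platforms verify.
    # Illustrative names only; real provenance schemes (C2PA) are far more involved.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    tool_key = Ed25519PrivateKey.generate()                  # held by the editing tool
    REGISTRY = {"ExampleEditor 1.0": tool_key.public_key()}  # published by the cert org

    def sign_export(image_bytes: bytes, edits_note: str) -> bytes:
        """Editing tool signs the exported pixels plus a note describing the edits."""
        return tool_key.sign(image_bytes + edits_note.encode())

    def platform_accepts(image_bytes: bytes, edits_note: str,
                         tool_name: str, signature: bytes) -> bool:
        """Platform checks the signature against the tool's registered public key."""
        pub = REGISTRY.get(tool_name)
        if pub is None:
            return False  # uncertified tool: reject or label as untrusted
        try:
            pub.verify(signature, image_bytes + edits_note.encode())
            return True
        except InvalidSignature:
            return False

    sig = sign_export(b"...jpeg bytes...", "generative fill: background sky")
    assert platform_accepts(b"...jpeg bytes...", "generative fill: background sky",
                            "ExampleEditor 1.0", sig)
    ```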

    • Schmeckinger@lemmy.world

      That simply won’t work, since you could just use a tool to recreate an AI image 1:1, or extract the signing code and sign whatever you want.

      • PhlubbaDubba@lemm.ee

        There are ways to make signatures hard to recreate, not to mention that the signature can be unique to every piece of media made, meaning a fake can’t be created reliably.

        • Schmeckinger@lemmy.world

          How are you gonna prevent someone from recreating an AI image pixel by pixel, or just importing an AI image or taking a photo of one?

          • PhlubbaDubba@lemm.ee

            Importing and screen-capture software can also carry the certificate software and sign the copy with the metadata of the original file being copied. Taking a picture of the screen with a separate device, or recreating the image pixel by pixel, could in theory get around it, but in practice people would at best see a camera image being presented as a Photoshop or painted piece, and at worst some loser pointing their phone at their laptop to pass something off dishonestly.

            As for pixel-by-pixel recreations: again, the software can be given the metadata stamp, and if sites refuse to accept unstamped content, going pixel by pixel in unvetted software will just leave you with a neat PNG file for your trouble. Doing it manually, well, if someone is hand-placing squares just to slip a single deepfake through, that person is a state actor, and that’s a whole other can of worms.

            ETA: the software could also sign a pixel-art creation as pixel art, based on it being built from squares, so the signature notes on a post would tip people off.

      • Feathercrown@lemmy.world

        The opposite way could work, though. A label that guarantees the image isn’t [created with AI / digitally edited in specific areas / overall digitally adjusted / edited at all]. I wonder if that’s cryptographically viable? Of course it would have to start at the camera itself to work properly.
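
        A rough sketch of the start-at-the-camera idea, assuming the sensor holds an Ed25519 device key and signs a digest of each capture, so any later change to the bytes fails verification. Purely illustrative, not how any shipping camera actually does it:

        ```python
        # Sketch: camera signs a SHA-256 digest of the capture with a device key, so a
        # verifier can later prove the bytes are exactly what the sensor produced.
        # Assumes an Ed25519 key in the camera's secure element; purely illustrative.
        import hashlib
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        device_key = Ed25519PrivateKey.generate()  # would live in the camera hardware
        device_pub = device_key.public_key()       # would be published by the vendor

        def capture(pixels: bytes) -> tuple[bytes, bytes]:
            """Camera returns the image bytes plus a signature over their digest."""
            return pixels, device_key.sign(hashlib.sha256(pixels).digest())

        def is_unedited(pixels: bytes, signature: bytes) -> bool:
            """Verification fails if even a single byte has changed since capture."""
            try:
                device_pub.verify(signature, hashlib.sha256(pixels).digest())
                return True
            except InvalidSignature:
                return False

        image, sig = capture(b"\x10\x20\x30" * 1000)
        assert is_unedited(image, sig)
        assert not is_unedited(image[:-1] + b"\x00", sig)  # one changed byte breaks it
        ```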

        • Trainguyrom@reddthat.com

          Signing the photo on the camera would achieve this, but that’s just rehashing the debate from back when this Photoshop thing was new. History shows us that some will fight it, but ultimately new artistic tools create new artistic styles and niches.