• PhlubbaDubba@lemm.ee · 5 months ago

    I agree pretty heartily with this metadata-signing approach to sussing out AI content:

    Create a cert org that verifies that a given piece of creative software properly signs work made with its tools, get eyeballs on the cert so consumers know to look for it, then watch and laugh while everyone who can’t get the cert starts claiming they’re being censored because nobody trusts any of their shit anymore.

    Bonus points if you can get the largest social media companies to only accept content that has the signing, and have it flag content whose signature indicates photoshopping, AI work, or removal of another artist’s watermark.
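The signing scheme described above could look roughly like this. A minimal sketch: the key, the metadata fields, and the use of HMAC are all assumptions for illustration; a real cert scheme would use asymmetric signatures (e.g., Ed25519) so tools can sign without being able to forge each other's output.

```python
import hashlib
import hmac
import json

# Hypothetical key a cert org might issue to a vetted tool.
TOOL_KEY = b"demo-key-issued-by-cert-org"

def sign_work(image_bytes: bytes, metadata: dict) -> str:
    # Bind the signature to both the pixels and the metadata,
    # so neither can be swapped out without invalidating it.
    payload = (hashlib.sha256(image_bytes).digest()
               + json.dumps(metadata, sort_keys=True).encode())
    return hmac.new(TOOL_KEY, payload, hashlib.sha256).hexdigest()

def verify_work(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    # Constant-time comparison of the recomputed signature.
    return hmac.compare_digest(sign_work(image_bytes, metadata), signature)
```

A platform checking uploads would recompute the signature from the received pixels and metadata; editing either one, or stripping an "ai": true flag, makes verification fail.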

    • Schmeckinger@lemmy.world · 5 months ago

      That simply won’t work, since you could just use a tool to recreate an AI image 1:1, or extract the signing code and sign whatever you want.

      • PhlubbaDubba@lemm.ee · 5 months ago

        There are ways to make signatures difficult to recreate, and the signature can be unique to every piece of media made, meaning a fake can’t be created reliably.
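The "unique to every piece" point can be sketched concretely: if the signature covers a hash of the content itself, a signature lifted from one file simply won't verify against another. (The key name and the HMAC stand-in below are assumptions for illustration.)

```python
import hashlib
import hmac

KEY = b"hypothetical-tool-key"  # stands in for a cert-issued signing key

def sign(content: bytes) -> str:
    # The digest covers the content, so every distinct file
    # yields a distinct signature.
    return hmac.new(KEY, hashlib.sha256(content).digest(), hashlib.sha256).hexdigest()

def verify(content: bytes, signature: str) -> bool:
    return hmac.compare_digest(sign(content), signature)
```

So "extract the signing code and sign whatever you want" only works if the attacker obtains the key itself, not just a signature from an existing file.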

        • Schmeckinger@lemmy.world · 5 months ago

          How are you gonna prevent recreating an AI image pixel by pixel, or just importing an AI image / taking a photo of one?

          • PhlubbaDubba@lemm.ee · 5 months ago

            Importing and screen-capture software can also carry the certificate software and sign the copy with the metadata of the original file. Taking a picture of the screen with a separate device, or recreating the image pixel by pixel, could in theory get around it. In practice, though, people will see at best a camera image being presented as a photoshopped or paint-made image, and at worst some loser pointing their phone at their laptop to try to pass something off dishonestly. As for pixel-by-pixel recreations: again, vetted software can apply the metadata stamp, and if sites refuse to accept unstamped content, going pixel by pixel in unvetted software just leaves you with a neat PNG file for your trouble. And if someone is hand-placing squares just to slip a single deepfake through, that person’s a state actor, and that’s a whole other can of worms.

            ETA: a pixel-art recreation can also be signed as pixel art, since it’s a creation of squares, so that would tip people off in the signature notes of a post.

      • Feathercrown@lemmy.world · 5 months ago

        The opposite way could work, though. A label that guarantees the image isn’t [created with AI / digitally edited in specific areas / overall digitally adjusted / edited at all]. I wonder if that’s cryptographically viable? Of course it would have to start at the camera itself to work properly.