• General_Effort@lemmy.world · 12 days ago

    Make no mistake. The US is heading in the same direction. Look at the proposed anti-deepfake laws. That guy could be prosecuted extremely harshly under those.

    • SirEDCaLot@lemmy.today · 12 days ago

      It will be interesting to see that tested in court. I don’t think anyone would complain about, for example, a pencil sketch of a naked celebrity; that would be considered free speech and fair use even if it is a sketch of a scene from a movie.

      So where does the line go? If the pencil sketch is legal, what if you do a digital sketch with Adobe Illustrator and a graphics tablet? What if you use Adobe’s AI features to help clean up the image? What if you take screen grabs of a publicity shot of the actor’s face and a nude image of someone else, and use them together to trace the image you end up painting? What if you then use AI to help select colors and do the shading? What if you keep each of those steps but have AI perform every one of them? That is not functionally very different from giving an AI a publicity shot and telling it to generate a nude image.

      As I see it, the only difference between the AI deepfake and the fake produced by a skilled artist is the amount of time and effort required. And while that definitely makes it easy to turn out an awful lot of fakes, it’s bad policy to ban one and not the other based simply on the process by which the image was created.

      • jacksilver@lemmy.world · 12 days ago

        It’s messy legislation all around. When does it become porn vs. art vs. merely erotic or satirical? How do you prove something was a deepfake and not a lookalike? If I use a porn actress to make a deepfake, is that also illegal, or does it depend on how the original source content was intended to be used/consumed?

        I’m not saying that we should just ignore these issues, but I don’t think any of this will be handled well by any government.

        • General_Effort@lemmy.world · 12 days ago

          That’s easy. The movie studios know what post-production went into the scenes and have the documents to prove it. They can easily prove that such clips fall under deepfake laws.

          Y’all need to be more cynical. These lobby groups do not make arguments because they believe in them, but because it gets them what they want.

          • jacksilver@lemmy.world · 12 days ago

            I was responding to a comment above. The guy who was arrested in the OP’s article was posting clips from movies (so not deepfakes).

            That being said, for deepfakes you’d need the original video to prove the clip was faked. You’d then probably also need to prove they used a real person to make it. Nowadays it’s easy to make “fake” people using AI, and it’s not clear where the law sits on creating deepfakes of fake people who resemble real people.

            • General_Effort@lemmy.world · 12 days ago

              I didn’t make the point clear. The original scenes themselves, as released by the studio, may qualify as “deepfakes”: a little digital post-processing can be enough to bring them under the proposed bills. Then sharing them becomes criminal, fair use be damned.

        • SirEDCaLot@lemmy.today · 12 days ago

          Actually I was thinking about this some more and I think there is a much deeper issue.

          With the advent of generative AI, photographs can no longer be relied upon as documentary evidence.

              There’s the old saying, ‘pics or it didn’t happen’, which, flipped around, means that sharing pics means it did happen.

              But if anyone can generate a photorealistic image from a few lines of text, then pictures don’t actually prove anything unless you have some bulletproof way to tell which pictures are real and which are generated by AI.

              And that’s the real point of a lot of these laws: to try and shove the genie back in the bottle. You can ban deepfake porn and order anyone who makes it to be drawn and quartered, you can make an AI watermark its output, but at the end of the day the genie is out of the bottle, because someone somewhere will write an AI that ignores the watermark and pass the photos off as real.

          I’m open to any possible solution, but I’m not sure there is one. I think this genie may be out of the bottle for good, or at least I’m not seeing any way that it isn’t. And if that’s the case, perhaps the only response that doesn’t shred civil liberties is to preemptively declare defeat, acknowledge that photographs are no longer proof of anything, and deal with that as a society.

          • jacksilver@lemmy.world · 12 days ago

            One solution that’s been proposed is to cryptographically sign content. That way someone can prove they “made” the content. It doesn’t prove the content is real, but it means you can verify the originator.
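
            A minimal sketch of that idea in Python, using the pyca/cryptography library; the key handling and placeholder content below are illustrative assumptions, not any specific scheme:

                # Content signing with Ed25519 (pip install cryptography).
                # This proves who holds the signing key, not that an image is real.
                from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
                from cryptography.exceptions import InvalidSignature

                # The creator generates a keypair and publishes the public half.
                creator_key = Ed25519PrivateKey.generate()
                public_key = creator_key.public_key()

                content = b"...raw image bytes..."  # illustrative placeholder
                signature = creator_key.sign(content)

                # Anyone holding the public key can check that these exact bytes
                # were signed by whoever controls the private key.
                try:
                    public_key.verify(signature, content)
                    print("Signature valid: content unchanged since signing.")
                except InvalidSignature:
                    print("Signature invalid: altered content or a different key.")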

            However, at the end of the day, you’re still stuck with needing to decide who you trust.

            • SirEDCaLot@lemmy.today · 11 days ago

              Probably the best idea yet. It’s definitely not foolproof, though. The best you could do is put a security chip in the camera that digitally signs the pictures, but that is imperfect, because eventually someone will extract the key or figure out how to get the camera to sign pictures of their choosing that weren’t taken by it.

              A creator-level key is more likely, so you choose who you trust.

              But most of the pictures that would be taken as proof of anything probably won’t be signed by one of those.
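
              A hedged sketch of that creator-key idea, again in Python with pyca/cryptography; the registry and helper function here are made up for illustration:

                  # Verify content against a personal registry of trusted
                  # creator public keys (Ed25519, pip install cryptography).
                  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
                  from cryptography.exceptions import InvalidSignature

                  # Hypothetical registry: creator name -> public key you trust.
                  trusted_keys: dict[str, Ed25519PublicKey] = {}

                  def attribute(content: bytes, signature: bytes) -> str | None:
                      """Return which trusted creator signed this content, or None."""
                      for name, key in trusted_keys.items():
                          try:
                              key.verify(signature, content)
                              return name
                          except InvalidSignature:
                              continue
                      # No match proves nothing either way: the picture may be
                      # real but unsigned, which is exactly the gap noted above.
                      return None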

        • Rekorse@sh.itjust.works · edited · 12 days ago

          Same as anything else: if it causes someone harm (in America, financial harm counts), it gets regulated.

          There are exceptions that allow people to disregard laws as well. It’s legal to execute a death row prisoner.

      • Rekorse@sh.itjust.works · 12 days ago

        One is banned because it can affect someone’s earnings and is theft; the other is not banned because no one is harming another party by making a pencil drawing of a celebrity or scene.

        • SirEDCaLot@lemmy.today · 11 days ago

          I’m not talking about the copyright violation of sharing parts of a copyrighted movie. That is obviously infringement. I am talking about generated nude images.

          If the pencil drawing is not harming anybody, is the photorealistic but completely hand-done painting somehow more harmful? Does it become even more harmful if you use AI to help with the painting?

          If the pencil drawing is legal, and the AI generated deep fake is illegal, I am asking where exactly the line is. Because there is a whole spectrum between the two, so at what point does it become illegal?

          • Rekorse@sh.itjust.works · 10 days ago

            It becomes harmful once you start selling it for profit based on its similarity to a real person.

            Just so you know, that does happen quite a lot on a small scale. Copyright law tends to be applied once a business pattern is established around the problematic content.

            Some people get away with selling it at craft fairs and such, and no one ever hears about it.