• Zelaf@sopuli.xyz · 5 months ago

    As a photographer I’m a bit torn on this one.

    I believe AI art should definitely be labeled to minimize people being misled about the source of the art. But at the same time, the OP of the Adobe forums post did say they used it like any other tool for touching up and fixing inconsistencies.

    If, for example, I were to arrange a photoshoot with a model and they happened to have a zit on their forehead that day, of course I’m gonna edit that out. Or if an assistant got into the shot but I don’t want to crop in and make the background and feel of the photo tighter, I’d gladly remove them too. Sure, Adobe already has the patch, clone and even magic eraser tools (which also use AI and might or might not mark photos) to do these fix-ups, but if I can use AI that I hope is trained on data they’re actually allowed to train on, I think I’d prefer that. If I’m gonna spend 10 to 30 minutes fixing blemishes, zits and whatnot, I’d much prefer to use the AI tools to get the job done quicker.

    If the tools were, however, used to substantially change, modify and edit the scene and subject, then for sure, it might be best to add the label.

    Wouldn’t it be better not to discourage the use of editing tools when those tools are used in a way that just makes one’s job quicker? If I were to use Lightroom’s quick subject selection, should the label be slapped on then? Or if I were to use an editing preset created with AI that automatically adjusts the basic settings of an image and then continue my editing from there, should the label be applied then? Or if I have a flat white background with some tapestry pattern and don’t want to spend hours getting the alignment of the pattern just right while fixing a minor aspect ratio issue or getting a bit more breathing room around the subject, and I use the AI tool mentioned in the OP, should it apply then too?

    The things the OP mentioned in their post and the scenarios I described are all things you can do without AI anyway; it just takes a lot longer sometimes. There’s no cheating in using the right tool for the right job, IMO. I don’t think it’s too far off from someone who sculpts in clay using an ice cream scoop with ridges to create texture, or a Dremel to touch up and fix corners. Or a painter using different brushes and scrapers to finish their painting.

    Perhaps a better idea, if we want to make the labels “fair”, would be to also have a label indicating that the photo has been manipulated by a program in general, or maybe a percentage indicator showing how much of it was edited specifically with AI. Slapping an “AI” label on someone because they decided to get the same results by using another tool to do normal touch-ups to a photo could be damaging to one’s career and credibility when it doesn’t say how much of it was AI or to what extent, because now there’s the chance that someone looking for their next wedding photographer might be discouraged by the bad rep around AI.

    • parody@lemmings.worldOP · 5 months ago

      trained on data they’re actually allowed to train on

      That’s the ticket. For touchups, certainly, that’s the key: did theft help, or not?

      • Zelaf@sopuli.xyz · 5 months ago

        Indeed, if the AI was trained on theft, it’s neither right on their part nor ethical on mine.

        I did some searching but sadly don’t have time to look into it more. There were some concerning articles suggesting they have either used shady practices to get their training data or required users to manually check an opt-out box in the app settings.

        I can’t form an opinion on it right now before looking into it more, but I still believe my core argument about using AI in this manner somewhat holds, even if it were your own AI trained on your own data using allowed resources.