Krita Looking More At GPU Acceleration & AI In 2024


  • #41
    Originally posted by Weasel View Post
    you are NOT allowed to write the same book yourself afterwards from your own memory
    That works because a human can understand what it is allowed to do and what it is not. It can read the code and the license and decide whether it is allowed to write the same book. Note that I am allowed to write the same book if the license allows me to do so, but I have to license it correctly (under the GPL, for example). Some books, whether for entertainment or education, grant specific rights about what you can do with, share, or mix from the content. Not all humans do this correctly, but we are talking about what is done correctly, not about the fact that some humans get it wrong. The point is that humans can do it correctly.

    Originally posted by Weasel View Post
    The user of the tool does.
    That's just playing with words. When I say the tool (AI) does not have the rights to do what I explained earlier, of course I mean the humans who build it and those who use it later on, because of the problems I described. A user of an AI tool that was trained on data it was not allowed to use should not have the same rights as a user who does not use such a tool (unless the data has been declared safe with respect to licensing).

    An AI does not do this correctly across libraries and repositories of millions and billions of items. The AI doesn't care about the licenses, and a human who uses the AI's output cannot know which license covers the parts being used and does not know their source. That's the issue here, and that's why I think the Krita team is doing it correctly if they use responsibly sourced AI training data, where no such concern exists.

    I think the discussion has drifted from its original topic (even further thanks to our analogies and examples). It simply comes down to the fact that we don't agree with each other, and your salty comments don't help either. People will of course try to use AI for content creation, because it makes some things easier. This can be built and used correctly, but most of it is not. That's why I applaud the Krita devs for being sensible here.



    • #42
      Originally posted by byteabit View Post
      That's just playing with words.
      Bear in mind that, as with compiler optimizers, there are inherent "the law doesn't see things the way humans do, but this is the best we have for making an internally consistent set of rules" situations. That's also why the Creative Commons NonCommercial license variants tend to get shunned: there is no bright line that delineates "commercial use". (E.g., are you suddenly guilty of infringement if YouTube decides to start running ads on a non-monetized video you uploaded and you don't respond by making the video private? It's unclear.)

      I think it'll wind up depending on how the Abstraction-Filtration-Comparison test or some variation of it applies to artwork.
      Last edited by ssokolow; 17 March 2024, 06:47 PM.



      • #43
        Originally posted by ssokolow View Post
        So basically a similarity check. That is rather vague, though, and it only applies to the resulting product; it does not address the data the model was trained on. So I do not think this is the solution to the problems I was talking about.

        For example, on the gaming platform Steam, Valve does not allow selling games developed with AI tools where the origin of the AI-generated content is unknown (such as ChatGPT or generative image models). In other words, games with AI-generated content are only allowed if the developers can prove the AI was trained on data they own or have the rights to use for training.



        • #44
          Originally posted by byteabit View Post

          So basically a similarity check. That is rather vague, though, and it only applies to the resulting product; it does not address the data the model was trained on. So I do not think this is the solution to the problems I was talking about.

          For example, on the gaming platform Steam, Valve does not allow selling games developed with AI tools where the origin of the AI-generated content is unknown (such as ChatGPT or generative image models). In other words, games with AI-generated content are only allowed if the developers can prove the AI was trained on data they own or have the rights to use for training.
          However, it is treated as the standard by most courts, so it's likely what the law will turn to, regardless of our opinions of it.



          • #45
            Originally posted by ssokolow View Post
            However, it is treated as the standard by most courts, so it's likely what the law will turn to
            I see, and I understand that reality sometimes does not function perfectly. But courts rely on it at the moment because there is no law or better way of handling it; like an emergency measure until a good solution is found, it is better than doing nothing. It's also US-centric, and I am not sure most other countries around the world care about it. It was developed in 1992, without today's AI problems and knowledge. And it mostly concerns computer programs, if I read it correctly, so it does not apply to art or to most of what an AI can generate.

            So yes, we need a "new" modern solution for modern problems. Laws can't keep up with how fast tech innovates and changes.



            • #46
              Originally posted by byteabit View Post
              That works because a human can understand what it is allowed to do and what it is not. It can read the code and the license and decide whether it is allowed to write the same book. Note that I am allowed to write the same book if the license allows me to do so, but I have to license it correctly (under the GPL, for example). Some books, whether for entertainment or education, grant specific rights about what you can do with, share, or mix from the content. Not all humans do this correctly, but we are talking about what is done correctly, not about the fact that some humans get it wrong. The point is that humans can do it correctly.
              Yeah. And if an AI infringes on it, then you sue the owner for damages, the same as with animals or other tools that can't take responsibility.

              Originally posted by byteabit View Post
              That's just playing with words. When I say the tool (AI) does not have the rights to do what I explained earlier, of course I mean the humans who build it and those who use it later on, because of the problems I described. A user of an AI tool that was trained on data it was not allowed to use should not have the same rights as a user who does not use such a tool (unless the data has been declared safe with respect to licensing).

              An AI does not do this correctly across libraries and repositories of millions and billions of items. The AI doesn't care about the licenses, and a human who uses the AI's output cannot know which license covers the parts being used and does not know their source. That's the issue here, and that's why I think the Krita team is doing it correctly if they use responsibly sourced AI training data, where no such concern exists.

              I think the discussion has drifted from its original topic (even further thanks to our analogies and examples). It simply comes down to the fact that we don't agree with each other, and your salty comments don't help either. People will of course try to use AI for content creation, because it makes some things easier. This can be built and used correctly, but most of it is not. That's why I applaud the Krita devs for being sensible here.
              Nah, I actually agree that it shouldn't be allowed if it truly did infringe. But it's the same with artists; that's my point. If artists are restricted, then of course AI should be too.

              However, there's a big problem here, and neither you nor the Krita devs understand it.

              If you have difficulty proving that an AI was inspired by an artwork it doesn't have a license to redistribute (or whatever), what makes you think you can prove a human did it?

              So ban all artists on the off chance they did?

              See how short-sighted that is? Why are you so strict with AI but not with artists?

              I'm NOT saying I disagree with you about it being illegal. In an ideal world, where we had such perfect proof, I'd agree 100%. I'm saying that it's unfortunately hard to prove, for both AIs and human artists, whether they did it illegally. So why ban only AIs, and why be so strict only with AIs and not with artists?

              You're literally assuming "guilty until proven innocent" for AIs but not for artists; that was my point and what I had a problem with.



              • #47
                I see, we agree more than I thought. The thing is that humans doing it manually (let's not complicate the language; you know what I mean, even if automation is involved) have the tools and the ability to provide source code or attribution. With an AI tool you don't have that. With an AI it's impossible to do it right, because we cannot know where the source came from (from the perspective of the creator using the AI tool to generate things). It is always a liability (Edit: I am actually not entirely sure "liability" is the correct word here; English is my third language...) and it makes people care less, because there is no way to cite, credit, or otherwise link to sources. It encourages getting it wrong and makes it basically impossible to do it right.

                You're literally assuming "guilty until proven innocent" for AIs but not artists
                I am not, especially not on the internet. "Natural" artists have the ability to do it right; with an AI, there is no way to know. That is besides my other point, that an AI should not be allowed to train on data it does not have a license for... but that's walking in circles again. I personally think we need laws and special copyright licenses that address how, on which data, and whether an AI is allowed to train. AI is so important to the future of mankind, and such a special case of tooling, that we NEED special rules to deal with it and make things clear for everyone.

                Yes, I'm referring to your previous statements, and I think an AI should be handled differently. There are really two things being discussed here: the case where a user uses an AI tool (which you were talking about previously), but I am also talking about the rights of an AI tool itself. Something similar to robots.txt for search engines, but made law.
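                For illustration, the closest existing mechanism is robots.txt itself: OpenAI's GPTBot crawler, for example, can be opted out this way, though compliance is entirely voluntary. The `AI-Training` directives below are a hypothetical sketch of what a binding, law-backed version might declare, not an existing standard:

                ```
                # robots.txt — the current, purely voluntary opt-out mechanism.
                # GPTBot is OpenAI's web crawler; honoring this file is up to the crawler.
                User-agent: GPTBot
                Disallow: /

                # Hypothetical extension (not a real standard): machine-readable
                # training permissions that a law could make enforceable.
                # AI-Training: disallow
                # AI-Training-License: CC-BY-NC-4.0
                ```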
                Last edited by byteabit; 18 March 2024, 06:35 PM.

