MunchkinWorks Posted March 24, 2023
Everyone is jumping on the AI-generated-images bandwagon, including Adobe. It's all fun, except for the artists who didn't consent to having their artwork used to train datasets; from what I have read so far, Adobe Stock doesn't even have an opt-out feature yet. What I would love to see is my art software helping to protect my artwork against AI theft. Experimental tools like Glaze are already being developed. Maybe something in the export settings where I can select a type of protection: noise like Glaze uses, a heavy watermark pattern, etc. You could create these manually, but I think it's more convenient to have ready-made options in the export panel where you can change the level of protection. Would the team behind Affinity be interested in providing something like this?
kokoro Posted March 28, 2023
Bumping; I'm also interested to know whether a feature to protect artist-created content will be coming.
PaulEC Posted March 28, 2023
Apparently, although there is software intended to prevent AI from stealing your artwork, some AI companies are already trying to find ways to circumvent it, as they consider it their right to take any images they want, even if the creator/copyright holder of those images explicitly takes steps to prevent them from doing so. Nice people!
Acer XC-895 : Core i5-10400 Hexa-core 2.90 GHz : 32GB RAM : Intel UHD Graphics 630 – Windows 11 Home - Affinity Publisher, Photo & Designer, v2 (As I am a Windows user, any answers/comments I contribute may not apply to Mac or iPad.)
StrixCZ Posted March 29, 2023
AFAIK there is no way to do this that wouldn't ruin the image for everyone (like the heavy watermark you mention). Any image publicly visible on the internet can be scraped and used for training by AI models; expecting some EXIF tag or an opt-out in the TOS of a public gallery to prevent it is just naive and unrealistic. Sorry to break this to you (I'm not very happy about it either), but that's just the way things are...
Bobby Henderson Posted March 29, 2023
13 hours ago, PaulEC said: Apparently, although there is software intended to prevent AI from stealing your artwork, some AI companies are already trying to find ways to circumvent it, as they consider it their right to take any images they want, even if the creator/copyright holder of those images explicitly takes steps to prevent them from doing so. Nice people!
I wonder how those same nice people would feel if someone stole the source code for their "AI" software. The hypocrites would get really big on upholding copyright law if that happened.
Mr. Doodlezz Posted March 29, 2023
There's already this script available for Windows users; maybe someone with coding knowledge could come up with an add-on? I've also seen a kind of label that you can put on your work to make a more obvious statement, similar to the CC symbols, but I can't seem to find them again. 😟
4 hours ago, StrixCZ said: AFAIK there is no way to do this that wouldn't ruin the image for everyone
Not entirely true, though: there are ways to add »invisible« watermarks to your images. They are invisible to the human eye, at least, and consist of patterns that change the hue/brightness so subtly that machine learning cannot properly analyse the images. 😉
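To make the »invisible watermark« idea concrete, here is a minimal pure-Python sketch. It is not how Glaze works (Glaze applies adversarial perturbations), and the function names are invented for illustration: it nudges each brightness value by an amount far too small to see, along a pattern derived from a secret key, and recovers the mark by correlating against the same pattern. Note this is a non-blind scheme, i.e. detection needs the unmarked original.

```python
import random

def embed_mark(pixels, key, strength=2):
    """Shift each 0-255 brightness value by +/- `strength` according to
    a pseudo-random pattern derived from `key` -- far too small a change
    for the eye to notice."""
    rng = random.Random(key)
    return [min(255, max(0, p + strength * rng.choice((-1, 1))))
            for p in pixels]

def detect_mark(pixels, original, key, strength=2):
    """Correlate the pixel differences against the keyed pattern.
    Returns a score near 1.0 if the mark is present, near 0.0 if not."""
    rng = random.Random(key)
    pattern = [rng.choice((-1, 1)) for _ in pixels]
    diffs = [p - o for p, o in zip(pixels, original)]
    return sum(d * s for d, s in zip(diffs, pattern)) / (strength * max(1, len(pixels)))
```

The catch, as discussed below, is that such a mark proves presence to whoever holds the key; it does not by itself stop a scraper from training on the pixels.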
Pšenda Posted March 29, 2023
11 minutes ago, Mr. Doodlezz said: Not entirely true, though: there are ways to add »invisible« watermarks to your images. They are invisible to the human eye, at least, and consist of patterns that change the hue/brightness so subtly that machine learning cannot properly analyse the images. 😉
But these watermarks only confirm the author of the work; they do nothing to prevent AI technology from exploiting the content of the work for its own (unauthorized) purposes.
Affinity Store (MSI/EXE): Affinity Suite (ADe, APh, APu) 2.5.7.2948 (Retail) Dell OptiPlex 7060, i5-8500 3.00 GHz, 16 GB, Intel UHD Graphics 630, Dell P2417H 1920 x 1080, Windows 11 Pro, Version 24H2, Build 26100.2605. Dell Latitude E5570, i5-6440HQ 2.60 GHz, 8 GB, Intel HD Graphics 530, 1920 x 1080, Windows 11 Pro, Version 24H2, Build 26100.2605. Intel NUC5PGYH, Pentium N3700 2.40 GHz, 8 GB, Intel HD Graphics, EIZO EV2456 1920 x 1200, Windows 10 Pro, Version 21H1, Build 19043.2130.
Mr. Doodlezz Posted March 29, 2023
20 minutes ago, Pšenda said: But these watermarks only confirm the author of the work; they do nothing to prevent AI technology from exploiting the content of the work for its own (unauthorized) purposes.
Did you actually read the article? Exactly the opposite is the case here! See the script linked in my comment above.
Pšenda Posted March 29, 2023 Posted March 29, 2023 (edited) 1 hour ago, Mr. Doodlezz said: Did you actually read the article? Because exactly the opposite is the case here! See the linked script above in my comment. Yes, I've read both - but neither provides any "evidence" as to why such marked content could not be usable/exploitable for AI learning. Simply, if the watermark is "By “invisible”, I mean real invisible — invisible to the human eye.", then there is no reason for it to be invisible even to AI. Of course, the learning algorithm will be influenced to a certain extent by the images marked in this way, but at the level of something that is not even visible to the human eye. P.S. The only mention of AI from said article is "The purpose of this watermark is probably for filtering out AI-generated images in the future to avoid them being used in training new AI models." Does this sentence really instill in you absolute/unshakable confidence in the security and unbreakability of this anti AI "security"? Edited March 29, 2023 by Pšenda Quote Affinity Store (MSI/EXE): Affinity Suite (ADe, APh, APu) 2.5.7.2948 (Retail) Dell OptiPlex 7060, i5-8500 3.00 GHz, 16 GB, Intel UHD Graphics 630, Dell P2417H 1920 x 1080, Windows 11 Pro, Version 24H2, Build 26100.2605. Dell Latitude E5570, i5-6440HQ 2.60 GHz, 8 GB, Intel HD Graphics 530, 1920 x 1080, Windows 11 Pro, Version 24H2, Build 26100.2605. Intel NUC5PGYH, Pentium N3700 2.40 GHz, 8 GB, Intel HD Graphics, EIZO EV2456 1920 x 1200, Windows 10 Pro, Version 21H1, Build 19043.2130.
Mr. Doodlezz Posted March 29, 2023
You are right, it is not well explained. Here is my understanding after reading some other articles. Basically, it's supposed to work something like reverse psychology applied to AI, or better: lying to the AI. Let's take a step back for a mini-excursion:
1. AI trains only with human-generated images.
2. How does the AI distinguish these? It adds a (nearly) invisible marker/pattern, i.e. a watermark, to the images it generates: a kind of pattern that extends over the entire generated image.
3. AI does not train with self-generated images. How does it recognise those images and filter them out of the training material? See point 2: during analysis it looks for possible self-generated watermarks and ignores such images as training material.
Back to the script: by preemptively inserting these marks into our own images, you trick the AI, because it takes them for its own generated output and skips them.
Now there's still at least one issue: different AIs apply different patterns. DALL•E, Midjourney, Stable Diffusion probably all use slightly different patterns. I'm not sure it is possible to create one watermark that tricks all of them.
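The filtering logic in the three points above can be sketched in a few lines of Python. The `carries_generator_mark` detector here is a stand-in for whatever watermark decoder a given pipeline actually runs (for illustration it just reads a flag), and the function names are invented:

```python
def carries_generator_mark(image: dict) -> bool:
    # Stand-in detector: a real pipeline would run a watermark decoder
    # over the pixels; here we just read a flag for illustration.
    return image.get("generator_mark", False)

def build_training_set(candidates):
    """Keep only images that do NOT look machine-generated (point 3)."""
    return [img for img in candidates if not carries_generator_mark(img)]

def preemptively_mark(image: dict) -> dict:
    # The trick described above: apply the generator's own mark to
    # human-made art so the pipeline discards it as "AI output".
    return {**image, "generator_mark": True}
```

As the post notes, this only works against pipelines that honestly apply such a filter, and only if your mark matches the pattern that particular pipeline looks for.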
Mr. Doodlezz Posted March 29, 2023 (edited)
I found these visual labels that you can add to your creations, but they are not the ones I had in mind, and without an explanation they are not very self-explanatory to me. Maybe it's also because I don't like the chosen colours, but of course that's very subjective. 😅 However, there are currently no official labels like CC; the labels are all self-initiated, even the ones I can't find at the moment.
Edit: Found the other labels. Looking meh as well (why the smiley?). Maybe we can come up with some proposals …
Edited March 30, 2023 by Mr. Doodlezz Added the other set of labels.
Pšenda Posted March 29, 2023
However, all of the procedures mentioned rely on the AI taking a fair and correct approach and rejecting images that their authors did not authorize for its learning. And the topic is really something else; see the OP: "how my art software can help protect my artwork against AI theft". Unfortunately, the answer is as StrixCZ said.
MunchkinWorks Posted March 29, 2023 Author
I know you can't 100% protect your art against AI theft at the moment, be it automated or purposeful, but if you can add a little something extra to make theft a bit harder, it still helps. It's the difference between stealing an object that's just sitting there and having to pick the lock on a safe to get at the object inside: theft may still happen, it will just take longer and be harder to achieve. I don't want to hear the whole sad "if you don't want your art stolen, don't post it online" argument. More often than not those thieves are lucky to live in different countries and stay mostly anonymous online, while hiring a lawyer is expensive and out of reach for many artists (who are often the main victims, sometimes even minors). For many artists, posting their work in public is a way to attract commissions and jobs, and they should not have to worry about it being stolen and used in crappy mass-printed notebooks, NFTs or AI training without permission. So it's not 100%, but we can still make it harder. Invasive watermarks aren't pretty, but they still help, so a watermark tool in Export mode that automatically creates a pattern out of saved assets, or automatically watermarks every artboard in the same position, size and transparency, would be neat: you wouldn't need to keep turning a watermark layer on and off whenever you want to export a clean version to show a client. Noise filters like the ones Glaze uses could also help make the images useless for AI training. Are there ways around this? Yes, but they take extra steps. Also, in my personal opinion, if Affinity were to deliver something like this, even if it's only a small help against theft and may not do much in the end, it would show that they care about the artists using their software, especially those who left other software for Affinity due to reasons.
While other software and even social platforms like DeviantArt and ArtStation were quick to jump on the AI bandwagon, it would show that Affinity puts its customers first by giving them something that helps them feel less anxious about posting their work online. Even if they later adopt some AI assisting tools (keyword: assisting, not text-to-image), I imagine that might end up being inevitable in order to compete with Adobe and others.
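The export-panel idea above could be sketched like this: map a "protection level" to tile spacing and opacity, then compute where each copy of the mark goes. A hypothetical pure-Python sketch (all parameter values and the function name are invented for illustration, not anything Affinity provides):

```python
def watermark_layout(img_w, img_h, mark_w, mark_h, level):
    """Return ((x, y) positions, opacity) for tiling a watermark over
    an image. `level` 1-3: higher means denser tiles, more opaque mark."""
    spacing = {1: 3.0, 2: 2.0, 3: 1.25}[level]   # gap in mark-sizes
    opacity = {1: 0.15, 2: 0.30, 3: 0.50}[level]
    step_x = max(1, int(mark_w * spacing))
    step_y = max(1, int(mark_h * spacing))
    positions = [(x, y)
                 for y in range(0, img_h, step_y)
                 for x in range(0, img_w, step_x)]
    return positions, opacity
```

Because the layout is computed at export time, the working file stays clean and the same document can be exported with or without the mark, which is exactly the "no toggling a watermark layer" convenience described above.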
v_kyr Posted March 29, 2023
28 minutes ago, MunchkinWorks said: I know you can't 100% protect your art against AI theft at the moment, be it automated or purposeful, but if you can add a little something extra to make theft a bit harder, it still helps. ...
You can't even protect your art 20% from that. You can't really differentiate between an AI crawler and a common search-engine indexer at all. Further, most AI services, if they have taken some of your images, create new data out of them; they don't place your images byte-for-byte into their imaging results. So most of the techniques described above (watermarking) won't work as protection here either. Also, providing a robots.txt or explicit licensing requirements for a website doesn't actually prevent anyone from crawling it, though the latter may be something you or your lawyer could sue them over, as far as you can prove that your images were used for AI results. ... and so on ... A bunch of other companies' services offer and make use of ChatGPT by integrating plugins with it via its API (the full list of third parties is not known) ... ChatGPT API Hack: More than 80 other plugins are available. There are also open letters against AI models such as ChatGPT ... Open letter: Musk, Wozniak and Co. are demanding a forced break for models like GPT-4 ... and problematic uses for generating fake images and the like ... The AI Pope in the down coat should be a warning. So all in all, there is far too much hype about the (ab)use of AI tools here!
☛ Affinity Designer 1.10.8 ◆ Affinity Photo 1.10.8 ◆ Affinity Publisher 1.10.8 ◆ OSX El Capitan ☛ Affinity V2.3 apps ◆ MacOS Sonoma 14.2 ◆ iPad OS 17.2
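For completeness, the robots.txt approach mentioned above looks like this. As noted, compliance is entirely voluntary, so this is a polite request rather than protection; CCBot is the published token of Common Crawl's crawler, whose corpus is a major source of training data:

```
# robots.txt -- a request, not an enforcement mechanism:
# compliant crawlers skip these paths, others can simply ignore it.

# Ask Common Crawl's bot to stay out of the whole site.
User-agent: CCBot
Disallow: /

# Everyone else (e.g. ordinary search-engine indexers) may crawl normally.
User-agent: *
Allow: /
```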
StrixCZ Posted March 30, 2023
17 hours ago, Pšenda said: However, all of the procedures mentioned rely on the AI taking a fair and correct approach and rejecting images that their authors did not authorize for its learning. And the topic is really something else; see the OP: "how my art software can help protect my artwork against AI theft". Unfortunately, the answer is as StrixCZ said.
Thanks, exactly my point. At best, we can hope that AI models will ignore images containing certain "copyright protected" / "AI generated" symbols or patterns. However, if their creators decide to ignore or circumvent these measures, it will be rather easy for them to do so, and it's safe to assume that at least some AI models will do exactly that, especially if a lot of people start to "protect" their artwork. It's unfortunate, but the question artists need to be asking today is really "how can I profit from my art despite AI being able to imitate my style?". The only real way to make sure* AI won't use your images for training is never publishing any of your artwork online, which is next to impossible for most people trying to make money from their art.
* And even that won't be a 100% bulletproof solution, as the AI can still hypothetically use a photo of your artwork that someone else publishes, even though it would be as close to keeping your art protected as possible.
Pšenda Posted March 30, 2023
16 hours ago, v_kyr said: You can't even protect your art 20% from that.
I even think the partial "at least something is better than nothing" solutions proposed here are ultimately more harmful, because they give authors a false sense of "protection". In my opinion it is better, on the contrary, to count on this and know that there is simply no protection yet. Watermarking a work to confirm its authenticity/author is, of course, something else entirely; that has been requested here many times, and of course Affinity should be able to do it.
v_kyr Posted March 30, 2023
5 hours ago, Pšenda said: ...and know that there is simply no protection yet
Neither offers any protection (wishful thinking). Watermarking is at best a gimmick, nothing else: as easily as you can apply watermarks, you can also remove them.
MunchkinWorks Posted March 31, 2023 Author
21 hours ago, v_kyr said: Neither offers any protection (wishful thinking). Watermarking is at best a gimmick, nothing else: as easily as you can apply watermarks, you can also remove them.
Watermarks are still useful for preventing some types of theft, like artwork being used on t-shirts without permission or in scam portfolios. It depends on how they're applied. If it's a little thing in the corner, of course it's easy to erase or crop out. But a more invasive watermark over the illustration itself is harder to remove, and it makes the image "worthless" to people trying to make a quick buck off it behind your back. That's what I have started doing with the illustrations I post on social media, and I then have my portfolio PDF, without those watermarks, to show potential clients or hirers. I found out early on that some of my old art was used to train Stable Diffusion. So far there is nothing I can do about it, and it's annoying. But I also don't like the idea of not posting my art online at all. Aside from work (having my concept art on ArtStation is literally how I got my most recent job), I like sharing silly things in my corner and cheering someone up or making them laugh. Sure, I could also do that in a private Discord group, but what if I want to reach others online? There are artists who like to share their art with the world; it's just too bad that this is the current situation, and that there are people who only see artists' work as "training data".
v_kyr Posted April 1, 2023
On 3/31/2023 at 4:58 PM, MunchkinWorks said: Watermarks are still useful for preventing some types of theft, like artwork being used on t-shirts without permission or in scam portfolios. ...
In my experience such things don't really prevent image theft. Services like Shutterstock place watermarks all over the images they try to sell, and it would take you two minutes in APh or PS to remove all that. It may scare off inexperienced users or amateurs, but anyone who knows even a little about image editing will get around it in no time. So it would be better to additionally embed some metadata in unusual, non-visible and less accessible places in your images. Even that won't give complete security, since one true rule in IT is that where there is a will, there is a way to undo or hack (reverse-engineer) something; call it one of Murphy's laws.
☛ Affinity Designer 1.10.8 ◆ Affinity Photo 1.10.8 ◆ Affinity Publisher 1.10.8 ◆ OSX El Capitan ☛ Affinity V2.3 apps ◆ MacOS Sonoma 14.2 ◆ iPad OS 17.2
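As a sketch of the "metadata in the file itself" idea: a copyright notice can be written straight into a PNG as a tEXt chunk using only the Python standard library. The keyword and text below are placeholders, and, as the post says, this is not protection either: any scraper can simply strip the chunk.

```python
import struct
import zlib

def make_chunk(ctype: bytes, data: bytes) -> bytes:
    # A PNG chunk: 4-byte big-endian length, 4-byte type, data,
    # then a CRC-32 computed over type + data.
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def add_text_chunk(png: bytes, keyword: str, text: str) -> bytes:
    """Insert a tEXt chunk right after IHDR (always the first chunk,
    13 bytes of data, i.e. 25 bytes including framing)."""
    assert png[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    ihdr_end = 8 + 8 + 13 + 4          # signature + IHDR framing
    chunk = make_chunk(b"tEXt",
                       keyword.encode("latin-1") + b"\x00"
                       + text.encode("latin-1"))
    return png[:ihdr_end] + chunk + png[ihdr_end:]
```

Unlike a visible watermark, this survives unnoticed by viewers but is trivially removable, so it mostly serves as evidence of authorship rather than a deterrent.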
StrixCZ Posted April 3, 2023
On 4/1/2023 at 9:22 PM, v_kyr said: Services like Shutterstock place watermarks all over the images they try to sell, and it would take you two minutes in APh or PS to remove all that
Expert PS user / professional graphic designer with 15+ years of experience here, and I can tell you this is simply not true. At best, you can remove watermarks from some images (simple illustrations with white backgrounds are the easiest to deal with), but even that won't usually take two minutes. Removing a watermark from a complex photo/illustration? Well, good luck with that 😅 Not to mention that watermarked images are generally low-res (meant as previews / for mock-ups), so you typically wouldn't get a decent print from them even without the watermark. Anyway, watermarks are actually quite effective at protecting images from being used without permission; otherwise nobody would use them anymore. The problem is that while this solution is suitable for stock images, it's far less suitable for artists trying to display their work, as big ugly watermarks basically prevent everyone from fully appreciating the artwork...
v_kyr Posted April 3, 2023
12 minutes ago, StrixCZ said: Not to mention that watermarked images are generally low-res (meant as previews / for mock-ups), so you typically wouldn't get a decent print from them even without the watermark.
As far as I understood it, that's the way the OP uses them too: just as a preview, with unwatermarked versions reserved for genuinely interested customers.
17 minutes ago, StrixCZ said: At best, you can remove watermarks from some images (simple illustrations with white backgrounds are the easiest to deal with), but even that won't usually take two minutes.
Then you are doing something wrong! 😀 There are nowadays also a bunch of AI-based apps to remove that stuff, like ...
https://www.watermarkremover.io/
https://pixcut.wondershare.com/watermark-remover.html
https://www.avaide.com/watermark-remover/
https://www.anymp4.com/watermark-remover-online/
https://www.aiseesoft.com/watermark-remover-online/
https://www.topmediai.com/remove-watermark/
... and so on ...
StrixCZ Posted April 3, 2023
19 minutes ago, v_kyr said: https://www.watermarkremover.io/
Fair enough, I just tried that one, and I have to admit it produced a far better result than I thought it would (not perfect, so the original would still be clearly distinguishable under close inspection, but pretty good anyway). Frankly, trying to steal someone's work was never my main focus 😃 so I had no need to look for such tools (and removing a watermark manually really would be a chore for most images)...
PaulEC Posted April 3, 2023
It is interesting how many AI tools seem to be designed to help steal other people's work, but, apparently, that's the world we live in and we should all embrace it! 😒
v_kyr Posted April 3, 2023
2 hours ago, StrixCZ said: Frankly, trying to steal someone's work was never my main focus 😃 so I had no need to look for such tools (and removing a watermark manually really would be a chore for most images)...
Well, I also wouldn't take other people's images and work, since I like and prefer to do my own programming, photography and specific vector drawings. If anything, I would only reuse and refer to royalty-free material. However, as I've already said and shown above, there is also a bunch of AI stuff available nowadays that operates in a quasi-grey area. And that, IMO, is also a result of today's artificially generated hype around AI-based systems: in the end, every player in the field wants to play along commercially and get a piece of the pie.
StrixCZ Posted April 4, 2023
18 hours ago, PaulEC said: It is interesting how many AI tools seem to be designed to help steal other people's work, but, apparently, that's the world we live in and we should all embrace it! 😒
Well, maybe not embrace it, but we surely need to accept the way things are and learn to adapt, since this can of worms can't really be closed now that it's open. Trying to fight it is as futile as playing tug of war with a speeding train...