
Adobe's updated terms of service for content



12 hours ago, NotMyFault said:

This is mandated by law in the US and many other countries for every online storage service, which is part of Adobe's offering.

If Affinity offered online storage (or cloud-based image editing including upload, processing and storage of images) in the US, it would probably be required to include similar terms in its TOS.

This is not Adobe vs. Serif. It is any company offering online services (which include storage of user-provided documents) vs. companies not offering them.

I've never heard of such a law in the US. Do you have a link to it so I can read it?

 

Thank you.


I remember that the CommunityPlus forum that Serif used to run for its Plus range of products required wide-ranging licences, but only for material uploaded to that forum, and I mused in a thread as to whether, having acquired all those licences, Serif should be under an obligation to use them.

Just imagine if that were the case here.

One could write a play, then sit back and wait for its public performance.

New theatres could be built to house those performances, actors would get more work, there could be new television channels devoted to public performances.

One could write a screenplay and wait for the movie to be produced at no cost to oneself and wonder if it will win an Oscar.

And suppose someone designs a starship ...

William

 

 


4 hours ago, Gorgons Grimoire said:

I've never heard of such a law in the US. Do you have a link to it so I can read it?

 

Thank you.

I can't find the actual laws, but Dropbox (and OneDrive, Google, etc.), for example, take hashes of all uploaded files, allowing them to detect copyrighted or illegal data.
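To illustrate what that kind of matching could look like, here is a minimal sketch of my own (not Dropbox's or anyone else's actual implementation): hash each uploaded file and check it against a blocklist of known digests. The blocklist value below is a placeholder, and real services also use perceptual hashes (e.g. PhotoDNA) so that trivially edited copies still match.

```python
# Hypothetical sketch of hash-based matching against a blocklist of known files.
import hashlib
from pathlib import Path

# Placeholder blocklist of SHA-256 digests of known infringing/abusive files.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large uploads never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


def is_flagged(path: Path) -> bool:
    """True if this exact file (byte-identical) matches a known-bad hash."""
    return file_sha256(path) in KNOWN_BAD_HASHES


if __name__ == "__main__":
    # "upload.mp4" is a hypothetical example file name.
    print(is_flagged(Path("upload.mp4")))
```

Note that a plain cryptographic hash like this only catches byte-identical copies, which is exactly why providers layer perceptual hashing and AI-based scanning on top.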

There is a reason that copyrighted movies, apps, and even images have almost vanished from regular search engines and cloud storage.

Of course you will find search / storage for that content in the dark areas of the web.

Google uses AI for more invasive scans.

 

https://www.extremetech.com/internet/179495-how-dropbox-knows-youre-a-dirty-pirate-and-why-you-shouldnt-use-cloud-storage-to-share-copyrighted-files

https://blog.google/technology/safety-security/how-we-detect-remove-and-report-child-sexual-abuse-material/


2 hours ago, NotMyFault said:

I can't find the actual laws, but Dropbox (and OneDrive, Google, etc.), for example, take hashes of all uploaded files, allowing them to detect copyrighted or illegal data.

That is considerably different from, and far more limited than, what Adobe's section 4.2 covers! It says Adobe can "use, reproduce, publicly display, distribute, modify, create derivative works based on, publicly perform, and translate the Content ...", and section 4.1 defines "Content" so broadly that it covers anything anyone creates using Adobe's services or software.

Even if you consider the 'to improve or operate' proviso a limitation, it has nothing to do with detecting copyright infringement, pornography, etc.


3 hours ago, NotMyFault said:

I can't find the actual laws, but Dropbox (and OneDrive, Google, etc.), for example, take hashes of all uploaded files, allowing them to detect copyrighted or illegal data.

There is a reason that copyrighted movies, apps, and even images have almost vanished from regular search engines and cloud storage.

Of course you will find search / storage for that content in the dark areas of the web.

Google uses AI for more invasive scans.

 

https://www.extremetech.com/internet/179495-how-dropbox-knows-youre-a-dirty-pirate-and-why-you-shouldnt-use-cloud-storage-to-share-copyrighted-files

https://blog.google/technology/safety-security/how-we-detect-remove-and-report-child-sexual-abuse-material/

So there is no actual law and this is all done by the companies to ensure compliance with the DMCA.

 

You can still find copyrighted material online using Google or another search engine. At least in the United States.


1 minute ago, Gorgons Grimoire said:

So there is no actual law and this is all done by the companies to ensure compliance with the DMCA.

Except that has nothing to do with Adobe requiring users of its software and/or its services to grant Adobe a non-revocable license to use, publicly display, etc., any and all content those users create, modify, or embed using said software or services. Not only that, they can allow their sublicensees to do the same!

All they have to do is claim that in some unspecified way or other this could improve their products or facilitate operating them, whatever that is supposed to mean.

BTW, some reading this might remember Apple's ill-fated plan to scan photos uploaded to its cloud service to detect child pornography, and the backlash over privacy rights that convinced Apple not to implement it.


2 minutes ago, R C-R said:

Except that has nothing to do with Adobe requiring users of its software and/or its services to grant Adobe a non-revocable license to use, publicly display, etc., any and all content those users create, modify, or embed using said software or services. Not only that, they can allow their sublicensees to do the same!

All they have to do is claim that in some unspecified way or other this could improve their products or facilitate operating them, whatever that is supposed to mean.

BTW, some reading this might remember Apple's ill-fated plan to scan photos uploaded to its cloud service to detect child pornography, and the backlash over privacy rights that convinced Apple not to implement it.

Yup, I was replying to the claim that there was a US law requiring companies to scan information. 


Of course there is no law directly dictating the exact phrases of the TOS.

There are laws that lead companies to include specific terms in their TOS to be on the safe side. And once one big player has started and got it through, all the others follow sooner or later.
Of course, companies try to word their TOS so that the maximum benefit falls on their side, not the customer's.

Of course you can still find copyrighted material. But if you use big-tech cloud storage, all the vendors have implemented very similar APIs to automatically remove such content, demonetize content creators, or shift earnings to those who claim to hold the copyright. And the TOS of all those vendors allow them to act in this way, possibly block your account, and ban you personally for life.

My major argument is:

  • Serif is not inherently better than other vendors; it simply avoids the whole topic by not offering such services, which would otherwise put Serif in the position of either facing extreme legal risk or including very similar terms in its own TOS.


1 minute ago, NotMyFault said:

Of course there is no law directly dictating the exact phrases of the TOS.

There are laws that lead companies to include specific terms in their TOS to be on the safe side. And once one big player has started and got it through, all the others follow sooner or later.
Of course, companies try to word their TOS so that the maximum benefit falls on their side, not the customer's.

Of course you can still find copyrighted material. But if you use big-tech cloud storage, all the vendors have implemented very similar APIs to automatically remove such content, demonetize content creators, or shift earnings to those who claim to hold the copyright. And the TOS of all those vendors allow them to act in this way, possibly block your account, and ban you personally for life.

My major argument is:

  • Serif is not inherently better than other vendors; it simply avoids the whole topic by not offering such services, which would otherwise put Serif in the position of either facing extreme legal risk or including very similar terms in its own TOS.

Your argument fails because the current Serif TOS for the Affinity suite do not allow them to take your intellectual property and use it for free.

 

I'm not here to argue with anyone. I posted this to help Serif capitalize on the mistake Adobe made. Whether or not you agree with it is immaterial to what Adobe's new TOS actually says.

 

Have a wonderful day.


17 minutes ago, Gorgons Grimoire said:

You can still find copyrighted material online using Google or another search engine. At least in the United States.

Not necessarily a problem.

My first novel is copyright, my copyright. I am in England.

The novel is on the web, free to read, no registration requested or required.

(Produced by using Serif PagePlus, as well as my writing)

William

 


Adobe provided clarification about the updated Terms of Use on their blog yesterday (06/06/2024). I’m not going to post a link to an Adobe site in an Affinity forum, but the title of the blog post is: A clarification on Adobe Terms of Use.

Now hopefully we can all get back to talking about Affinity rather than the ‘A’ word. 🙂


23 minutes ago, NotMyFault said:

But if you use big-tech cloud storage, all the vendors have implemented very similar APIs to automatically remove such content, demonetize content creators, or shift earnings to those who claim to hold the copyright. And the TOS of all those vendors allow them to act in this way, possibly block your account, and ban you personally for life.

First, that is not true for all cloud services -- please refer to Apple's aborted plan to scan photos uploaded to iCloud.

But more to the point, Adobe is not merely reserving the right to remove content, block accounts, or ban users; it is saying it may use that content in many different ways, including, but not limited to, publicly displaying it or sharing it with its sublicensees.


Yes, that's the point. Section 4.1 still says content is anything, and 4.2 still allows them to do more or less anything with that content.

And the new wording in the blog post has a lot of "for example".

It would be easy to write "we only do exactly a, b and c". But what they write amounts to: we do, for example, a, b and c, but we don't tell you that we also do d, e and f.

And wording like "we don't train Firefly with customer content" can easily be bypassed via the "we can sublicense your work" clause in 4.2. If they don't train the models themselves, no problem: you have given them the right to let others do it for them. They can give your data to Stability AI and get the trained models back.

There was also a post, I think on X, where they wrote something like "we don't have the technical possibility to scan your local content". I just need to add a "yet". It would have been better to write: we are not scanning your local content, even if we had the technical possibility.

To me it feels like they want to wriggle out of the current outcry without really changing what the TOS allows them to do.


22 minutes ago, Gorgons Grimoire said:

If you read it then you see they didn't actually touch section 4.2. Thanks for the new information.

Yes, it still reiterates the "access content solely for the purpose of operating or improving the services and software" verbiage, while adding the "and to enforce our terms and comply with law, such as to protect against abusive content" bit, which is more in keeping with legal requirements.

It also says "Adobe does not train Firefly Gen AI models on customer content", but that is not mentioned anywhere I can find in the actual TOS. There is also a link to a page that discusses how Adobe uses machine learning and how users can opt out of having their content analyzed for that, but a careful read of the "When does content analysis opt-out not apply?" section shows that even this may not be opted out of if, for any reason, Adobe decides it will improve a feature a user uses.


6 hours ago, R C-R said:

Yes, it still reiterates the "access content solely for the purpose of operating or improving the services and software" verbiage, while adding the "and to enforce our terms and comply with law, such as to protect against abusive content" bit, which is more in keeping with legal requirements.

It also says "Adobe does not train Firefly Gen AI models on customer content", but that is not mentioned anywhere I can find in the actual TOS. There is also a link to a page that discusses how Adobe uses machine learning and how users can opt out of having their content analyzed for that, but a careful read of the "When does content analysis opt-out not apply?" section shows that even this may not be opted out of if, for any reason, Adobe decides it will improve a feature a user uses.

If you read that, you can't actually opt out of anything, or the software won't launch. Personally, I'm not invested in this topic beyond alerting Serif that they have a chance to capture more market share due to Adobe's stupidity and they can claim your work without paying you a dime. I have a problem with that when you are creating for profit. It affects me directly because I want to produce TTRPGs and sell them.


29 minutes ago, Gorgons Grimoire said:

due to Adobe's stupidity and they can claim your work without paying you a dime.

They are not exactly claiming your work (you still own it), but you have to give Adobe a license to use it as they see fit, as long as they claim it is to operate or improve their products. It is such a vague and open-ended stipulation that it could mean anything they want it to mean, and as far as I can tell they do not even have to inform anyone about how this would or could do either one.


12 minutes ago, William Overington said:

I am still wondering about

publicly perform

I think it refers to videos, animated GIFs, audio tracks and such.


The guy in that video is committing a fallacy of his own: he claims that Adobe is lying about not using content for "generative AI" because the contract says "machine learning", after he himself acknowledged that generative AI is one form of machine learning. He completely ignores the possibility that Adobe may be planning to use that content to train forms of machine learning other than generative AI.

This is not to say that what Adobe is doing is in any way appropriate or acceptable - he is right to be upset over this - but committing a fallacy of his own is not helpful in making his case.


From what I have read, and I could be wrong, they can only access files in the cloud, so local files are not accessible to them. Out of curiosity, are a lot of people using the cloud for their work? I can't imagine working on jobs that are all stored in the cloud and dependent on a network connection.


On 6/12/2024 at 6:41 PM, wonderings said:

From what I have read, and I could be wrong, they can only access files in the cloud, so local files are not accessible to them. Out of curiosity, are a lot of people using the cloud for their work? I can't imagine working on jobs that are all stored in the cloud and dependent on a network connection.

I keep all data out of cloud storage and we use local NAS servers with remote access handled via VPN. However, this is no longer sufficient to keep data private.

Some factors to consider:

  • Some programs like Photoshop, Lightroom and Camera Raw process some commands on Adobe servers without explicitly notifying the user when it happens. This includes some masking tools, neural filters, generative fill, and some healing and fixing tools. CC Libraries also automatically send all data to Adobe servers.
  • Some Adobe programs, particularly for iPad, automatically upload all your documents to Adobe servers without your consent and with no way of opting out. If you disconnect network access, they will sync as soon as the device gets internet access again.
  • Lightroom Classic and Lightroom Mobile can only work together if you sync via the cloud. You can't work off your own server (neither local, nor via VPN, nor a third-party hosted server).
  • None of Adobe's sync features use end-to-end encryption, only transport encryption, so your data is always accessible to Adobe and to whomever they choose to give it (see the sketch after this list for the difference).
  • All Adobe software is infested with analytics (for example, connections to omtrdc.net). Any attempt to opt out of this, using any combination of settings, does not seem to stop it completely. This could, at any time, be expanded to include (partial) documents, or existing data analyzed via "AI".
  • Adobe support has in the past asked for my permission to control my computer to do things for me (which I always declined). This indicates they have some sort of remote access trojan installed with the software. (We also don't know the extent of its capabilities and who has access to these features on the support side in outsourcing countries, but that's a different issue.)
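On the end-to-end encryption point: with transport encryption (TLS) only the connection is protected, and the provider can still read the stored file. Here is a minimal sketch (my own illustration, using Python and the third-party cryptography package, not how any vendor's sync actually works) of client-side encryption, where only ciphertext ever reaches the server:

```python
# Minimal sketch of client-side ("end-to-end" style) encryption before upload:
# the provider stores only ciphertext; the key stays on your own devices.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet


def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt locally; only this ciphertext would be sent to cloud storage."""
    return Fernet(key).encrypt(plaintext)


def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Decrypt locally after download; the provider never sees the key."""
    return Fernet(key).decrypt(ciphertext)


if __name__ == "__main__":
    key = Fernet.generate_key()  # generated and kept only on your own machines
    blob = encrypt_for_upload(b"client work, not for the provider", key)
    assert decrypt_after_download(blob, key) == b"client work, not for the provider"
```

If sync worked like this, Adobe could store and transfer your files but could not read, analyze or sublicense their contents, which is exactly what transport-only encryption does not guarantee.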

Furthermore, the new terms are concerning for other reasons:

  • Adobe is effectively holding customer data hostage through proprietary file formats and forcing users to accept new terms at their leisure to be able to access their data, with no way to opt out.
  • It is not necessarily about what Adobe are doing, but about what the new terms give them permission to do. This includes analyzing local data never put in cloud storage, uploading your local files, or analyzing your work or behaviour inside the applications.
  • Their damage-control PR statements use language that is deliberately deceptive. "We never trained AI using your data" is not the same as "We have never and will never use your data to train AI, and we have amended our TOS in response to your comments to reflect this".
  • If the content you edit with your software does not comply with their restrictions, which can be determined in ways that are not transparent (it could just be analytics in your local app on your local data), you can be locked out of your software and your work, which can potentially have disastrous effects on your work, business or reputation.
  • Given their use of dark patterns (re-enabling "Automatically install Updates" every time there is an update in the hope you forget to switch it off at some point, making things opt-out instead of opt-in, settings you switched off being on again after an update, etc.), it can be difficult to prevent your content from ending up on Adobe servers against your will.
  • Contracts, as the law currently conceives them, were meant to be negotiated. However, this kind of "accept this or don't use our products" behaviour in big tech, combined with a monopoly position, creates a power imbalance, especially when clauses are included that are not necessary for normal operations. People who are not willing to give up their data, information, work, skill or privacy are excluded from certain kinds of work (you still have to be able to use Adobe software and open Adobe files if you want to work professionally in the media industry), or even from parts of public life when essential apps are only available in the Apple and Google app stores, so that you have to agree to their TOS to, for example, use public transport.

In other words, it is as if housing in your city were 90% in the hands of a single company and you had to rent from them, but one day you find a lock on the door of your apartment with your stuff inside, and a notice that they will only give you the key if you agree that they get the right to install hidden CCTV cameras in every corner of your apartment without notifying you, and that they get to use and rent out all the personal items you put in the attic, whether you put them there in the future or in the past. And you know that whenever they do maintenance on the building, they will secretly come into your apartment while you're not home and randomly move items into said attic. When you confront them and say you told them not to do this last time, they say your objection only applied to the last time they were there. More and more, things you put in certain rooms of your apartment are automatically moved to the attic on principle, whether you want it or not. More and more random corners of your apartment are also turning into attic transport hubs, but there is no way for you to know for sure in advance which ones. When your fellow renters complain about this, the housing company says, "Don't worry, we haven't yet rented out your things in the attic, at least not in the past, and so far we haven't given anyone else access to the CCTV feed from your bathroom except that company in India we use to handle our customer support." And they have changed the measurements of the doors so that if you want to move to an apartment not owned by them, you cannot move your furniture out.

Your statement is like responding to that situation with "Oh, but I don't put anything in the attic, so it's probably fine" and signing the paper. The issue here is that this is proof that they are fundamentally untrustworthy as a company, and that there is no longer any way to use their products (which they would rather have us call "services") without giving up control over your data and that of the people you interact with.


I apologize if this is somewhat off the main topic of this thread, but it's also related, as it discusses subscriptions, Adobe policies, tactics, etc. I read this morning that...

The U.S. government is suing Adobe, accusing the software maker of steering customers toward a pricey subscription plan while concealing how much it costs to cancel the service.

Here's a link to the article for anyone interested in reading it. 

https://www.cbsnews.com/news/adobe-ftc-federal-lawsuit-cancel-subscription/?utm_source=join1440&utm_medium=email&utm_placement=newsletter

