Photoshop maker Adobe has run afoul of its user base with changes to its terms of service that, among other things, give it the right to look at your files and existing projects in the name of content moderation. 

In Adobe’s words, the changes clarify that the company “may access your content through both automated and manual methods, such as for content review.” 

Adobe’s reasoning for giving itself the right to comb through user content is the detection and removal of illegal content, such as child sexual abuse material, as well as abusive content or behavior, including spam and phishing.

The company also cited the advent of generative AI, a breakthrough technology that makes it much easier to create realistic images and human-sounding text and audio.

Adobe isn’t the only company that does this kind of scanning. Google uses a mix of automated scanning and manual review to detect CSAM, and Microsoft does something similar.

What are Adobe’s changes?

There were four changes in total to the terms of service, with the first two appearing in sections 2.2 and 4.1. In section 4.1, Adobe says “we reserve the right (but do not have the obligation) to remove Content or restrict access to Content, Services, and Software if any of your Content is found to be in violation of the Terms.”

In addition, section 14.1 shortens the window for filing a formal dispute from 60 days to 30, while section 5.3 states that Adobe now reserves the right to delete content from inactive accounts. Adobe says it will attempt to notify inactive accounts to help them avoid deletion, but it does not say how long an account has to be inactive before that deletion takes place.

The first two changes have the creative community up in arms over what amounts to the classic trade-off between privacy and security.

The verbiage seems to be directed only at files that are uploaded to Adobe Cloud as part of the Adobe services.

Why are people angry about Adobe’s new terms of service?

The answer is simple enough. Adobe’s loose, wide-reaching language could give the company carte blanche to scan, look at and review any content that passes through an Adobe app or Adobe Cloud servers. That has irked creators, many of whom use Adobe products for sensitive professional work, and they took to Reddit to complain about the changes.

One such example is NDA work: content protected by a nondisclosure agreement. The creator signs such an agreement to get access to files with the expectation that they keep those files away from prying eyes until the agreement expires. It’s understandable that people in that position don’t want Adobe looking at something the creator doesn’t have permission to show.

In addition, creatives who have tried to do something about it have been met with resistance from Adobe. Earlier this week, conceptual artist Sam Santala posted on X about his experience of not being able to talk to an Adobe customer service representative, cancel his subscription to Adobe’s services or even uninstall Photoshop without first agreeing to the new terms of service. 

Adobe responds with clarification

The terms of service update happened on Feb. 17; Adobe’s terms of service webpage lists that date as both the last-updated date and the effective date. When users were notified about the change, or first noticed it, is harder to say, except that complaints about it have surged in recent days.

Even so, The Register points out that Adobe has been using similar language for years, so while the verbiage may be more explicit and unsettling, it’s not intrinsically different from what it was before.

In addition, Adobe says that it only scans files on its cloud service and not on users’ PCs. According to the software giant, “Adobe performs content analysis only on content processed or stored on Adobe’s servers; we don’t analyze content processed or stored locally on your device.” That verbiage has not changed. 

We reached out for comment and Adobe directed us to a June 6 blog post where it further clarifies its new stance. 

“The focus of this update was to be clearer about the improvements to our moderation processes that we have in place,” Adobe says. “Given the explosion of generative AI and our commitment to responsible innovation, we have added more human moderation to our content submissions review processes.”

The blog post also reiterates that Adobe does not train its Firefly AI using files stashed on Adobe Cloud. 

