AI Can Take Off Your Clothes, But Can Copyright Put Them Back On?

The rapid expansion of AI capabilities has left many excited and impatient to see what the future of AI holds. Others, however, have warned about the dangers of such rapid advancement without adequate protections. These warnings are becoming more real than ever: AI-powered "clothes remover" bots can now strip the clothes off the subject of a photograph, even without that person's consent. This gives rise to substantial ethical and legal concerns, as it infringes upon the core right to privacy and can culminate in exploitation, harassment, and the distribution of explicit content, otherwise known as revenge porn.

In the search for a solution to this problem, it has been suggested that copyright could be the answer to getting your clothes back on. Section 5 of the Copyright Act provides that copyright subsists in "every original literary, dramatic, musical and artistic work". Photographs are protected just like other artistic works, such as sculptures, paintings, drawings and engravings, provided they are fixed and original. To be original, a work requires "at least some sort of intellectual effort", a threshold ordinarily satisfied by the choices involved in taking a photograph. While this does not mean every photo is protected by copyright, it shows why in many cases one would be.

But is the damage not already done? Once the AI-altered image has been published, it is out there for the world to see, and as the saying goes, 'once on the internet, always on the internet'. With this in mind, is copyright the best deterrent against this conduct occurring without the subject's consent? And is a copyright infringement claim a sufficient punishment for those who commit such acts?

It is no wonder that people have drawn analogies between this conduct and the offence of non-consensual distribution of intimate images, an act that is criminalised and, if prosecuted by indictment, carries a maximum sentence of five years' imprisonment. If the effects of distributing an AI-generated image and a hypothetical real intimate image are the same, should the two acts be treated and criminalised in the same way?

In light of the many ethical and legal concerns with this technology, it has been argued that "developers should consider implementing safeguards and mechanisms to prevent misuse of the technology, such as adding watermarks or digital signatures that indicate tampering." With a watermark, it would be easy to establish that an image has been tampered with and cannot be trusted. Yet here AI steps in again, with free and easy-to-use watermark removal tools. If a perpetrator is willing to remove your clothes with AI, do you think they would draw the line at removing watermarks?

As one can easily see, this technology also poses serious problems for individuals' right to privacy, since it can entail the unwarranted modification and distribution of personal images. In many jurisdictions, such as British Columbia, individuals found to have infringed another person's right to privacy may be subject to legal action. In British Columbia, however, a breach of one's right to privacy is a tort, giving rise to civil liability rather than a criminal offence. Again, is this a sufficient deterrent?

There are, as has been shown, many legal and ethical implications of the use of this technology, which brings us back to the question: what is the best way to stop its wrongful use? Copyright and privacy claims, while not criminal actions, can work together to deter individuals and provide compensation to victims. Creating a more serious deterrent and punishment by aligning the act with the offence of non-consensual distribution of intimate images, however, would likely provide greater protection. This raises the question: can a photo altered by a clothes-removing AI bot be considered an "intimate image"?

An intimate image, as defined in the Criminal Code, includes a photo, film, or video recording. The image must be intimate in nature, meaning that the person depicted is nude, is exposing their genital organs, anal region or breasts, or is engaged in explicit sexual activity. Could this include an image altered by AI to make it appear as though the person is naked? Should it? And what about deepfakes?

AI capabilities are progressing, and will continue to do so, at an incredibly fast rate. It is therefore necessary that adequate protections be mandated in jurisdictions that want to shield individuals from potential harm. In this case, copyright law and privacy rights can work to protect individuals from AI clothes remover bots. For more comprehensive and effective protection and punishment, however, it may prove better to approach this issue through the criminal sphere, either by creating new offences or by aligning the conduct with different but analogous offences, as argued above.

Please see below some of the articles used in my research for this post:

[1] https://www.wired.com/story/uk-ai-summit-declaration/

[2] https://linguix.com/blog/unveiling-the-cloth-remover-ai/#:~:text=By%20utilizing%20this%20technology%2C%20anyone,%2C%20harassment%2C%20and%20revenge%20pornography

[3] https://aitechtonic.com/ai-clothes-remover-telegram-bot/ (para 2.2)

[4] https://www.ppoc.ca/news_articles/article_32.php?thread=20#:~:text=Under%20the%20Canadian%20Copyright%20Act,protected%20under%20Canadian%20copyright%20law

[5] https://www.weisberg.ca/intimate-images/#:~:text=The%20offence%20of%20non%2Dconsensual,sentence%20of%20five%20years%20imprisonment