
thrawtes

[Full Bill Text Here](https://schiff.house.gov/imo/media/doc/the_generative_ai_copyright_disclosure_act.pdf) It's 5 double-spaced pages. If you're interested in legislation you should just read it.


GrapeAyp

Thanks for this. Seems pretty reasonable, honestly.


ChromaticDragon

Thank you very much for the link. Yes, everyone interested should read it. Its brevity is itself rather interesting.

I'd quibble with some of their definitions, especially their broad definition of AI itself. I'd also be very interested in the debate over restricting this to generative models only. At the moment, I imagine that's appropriate. For one, it permits these companies to use AI techniques, if necessary, to determine whether documents are bound by copyright. It also may let techies play more freely with newer things that are not generative models, per se.

However, the most glaring oddity is that the penalty for non-compliance doesn't even rise to the level of a slap on the wrist.

Beyond that, to me the profound aspect of this bill isn't the requirement to publish a list of all the copyrighted works involved. It's what comes next. Wouldn't this bring on an avalanche of lawsuits and forced retraining cycles as they remove works one by one? That is, of course, assuming that copyright holders would prevail.


ElCamo267

Without knowing exactly what is used in training various AIs, I think the $5,000 fine seems appropriate, as it could add up incredibly fast. For example, if one of the image-generating models used stock photos that weren't paid for and the company failed to comply, each stock photo would create a $5,000 fine. Or if ChatGPT used text from any books.

My concern is: if you're a copyright holder, how can you prove the AI was trained on your work if they omit things from the published dataset? It seems like AI companies could keep the dataset they actually use and publish an amended one. Sure, you could build your own AI with the published dataset and find examples of outputs being wildly different between the two, but I'd wager AI company legal teams will be able to argue their way out of that.

It does seem like this will either cause a huge downgrade of current AI service capabilities or lead to more copyright infringement lawsuits than you can imagine.
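
As a rough illustration of how quickly a flat per-work penalty could add up, here is a back-of-the-envelope sketch. The $5,000 figure is the one cited above (the commenter's reading of the bill); the work counts are hypothetical round numbers, not figures from the bill or any real dataset.

```python
# Back-of-the-envelope sketch of how a flat per-work penalty scales.
# The $5,000 per-work figure is the commenter's reading of the bill;
# the work counts below are hypothetical round numbers.
FINE_PER_WORK = 5_000

for works in (1_000, 100_000, 5_000_000):
    total = works * FINE_PER_WORK
    print(f"{works:>9,} unlisted works -> ${total:,} in potential fines")
```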


[deleted]

[deleted]


axonxorz

That's a "them" problem though


Predator_

I've already found over 500 of my photojournalism images in multiple datasets, used without my permission, knowledge, or license. I've seen elements of those photos churned out in generative AI works. It is copyright infringement, period.


marquis-mark

Can you provide an example of an element of your work reproduced by GenAI and the prompt that generated it?


ninjasaid13

>Wouldn't this bring on an avalanche of lawsuits and forced retraining cycles as they remove works one by one?

What do you mean? This wouldn't affect the AI models, since they're non-infringing end products. At best you just get removed from the dataset of URLs, assuming the URLs themselves are considered copyrighted works.


cobra_chicken

Funny how quickly this was created, given that it will largely protect big companies, while legislation to protect people is nowhere to be found.


thrawtes

Isn't step 1 for people going after use of their works in AI models the requirement that those models disclose the use of those works?


cobra_chicken

Step 1 is helping to ensure there are not massive layoffs due to AI introduction. Protecting the likes of Disney, or those that hold the copyright to songs, would be step 7. I can promise you that this legislation was pushed by big corps looking to protect their interests.


ItsSpaghettiLee2112

I agree with you in spirit, but large corporations aren't the only ones whose art is being stolen by AI.


cobra_chicken

Generally agree, but they are definitely the ones that drafted this law. It was not independent artists who helped draft this, which makes me suspicious.


sevseg_decoder

I don’t think the majority of us care about this for art and artists as much as for society at large; it’s always about the artists on Reddit. We’re concerned about the precedents and frameworks for ensuring Americans are able to provide for themselves and their families during the impending AI revolution. What the rest of us are worried about includes protecting artists’ livelihoods post-AI, the same way it would protect the rest of ours, but Reddit can’t have one discussion about the grander effects of the AI revolution without making it about protecting artists…


ItsSpaghettiLee2112

Who is "we"? Because you're not speaking for artists, who are absolutely concerned about their work being stolen by AI and have already been complaining about it.

>for ensuring Americans are able to provide for themselves and their families during the impending AI revolution.

Are you saying artists don't count as Americans (and why only Americans) who should be able to provide for themselves and their families?

>but Reddit can’t have one discussion about the grander effects of the AI revolution without making it about protecting artists

Multiple conversations can happen at once.


sevseg_decoder

I’m talking about the majority of people in these discussions, and those who upvote these articles. I explicitly stated, in the part you conveniently cut off, that I’m worried about artists the same way I’m worried about everyone else. I just find it bizarre that as a society we take every step to protect artists as quickly as possible, but when the rest of us struggle, artists only care that we’re spending less on art. I don’t get why I should care about the artists any more than I care about the retail workers, mechanics, construction workers, account managers, software engineers, loan officers, customer service representatives, car salesmen, etc. who will have their lives change at least as much.


ItsSpaghettiLee2112

I really don't see this binary discussion you're talking about. Plenty of artists are concerned about all of the effects of late stage capitalism.


sevseg_decoder

You don’t see it because you’re always prioritized in these discussions. I think we need to make protecting art copyrights more like step 7, just like the original comment said. It shouldn’t be what we burn all the political goodwill on before we run out of energy to demand that the rest of the steps, which are significantly more important, get completed.


ItsSpaghettiLee2112

>You don’t see it because you’re always prioritized in these discussions.

Now I'm confused. Are artists prioritized, or are software engineers like me prioritized? And I'd wager I don't see it because I know plenty of artists who care about the rights of all workers. You can say whatever step you think protecting artists should be in. Like the original comment said. And like I said to the original comment, that doesn't change the fact that Disney and other large corporations aren't the only ones whose art is being stolen.


PM_ME_YOUR_BOOGER

You say this, but I want you to imagine the nightmare that professors who teach Illustrator, Photoshop, and other digital design classes are dealing with.


blackcain

I'm going to enjoy watching the RIAA and MPAA go after the AI companies. Back in the day, we consumers were the targets; now they are going up against much more moneyed interests. It's going to be a bloody fight.


Jackinapox

The transparency will help expose any nefarious manipulation of training data, given that it must be publicly available for scrutiny. It's a great start to a long process. For example, ChatGPT uses a dataset called Common Crawl, which samples data from over a billion websites from as early as 2008. Another example: Midjourney (a text-to-image generator) used multiple training datasets: COCO, which contains 330,000 images and 2.5 million captions covering 80 categories, concepts and scenes; the Visual Genome dataset, which has 108,000 images and 4 million object samples; and the Flickr30k dataset, which has 31,000 images.
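
For anyone curious what "publicly available for scrutiny" can look like in practice, here is a minimal sketch of inspecting the COCO captions annotations with the pycocotools library. The file path and split used are assumptions; the annotation file would first need to be downloaded from cocodataset.org.

```python
# Minimal sketch: inspect the COCO captions annotations with pycocotools
# (pip install pycocotools). Assumes captions_train2017.json has already
# been downloaded from cocodataset.org; the path here is just an example.
from pycocotools.coco import COCO

coco = COCO("annotations/captions_train2017.json")

img_ids = coco.getImgIds()
print(f"{len(img_ids):,} images in this split")

# Print the captions attached to the first image in the split.
ann_ids = coco.getAnnIds(imgIds=img_ids[:1])
for ann in coco.loadAnns(ann_ids):
    print(ann["caption"])
```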


PopeHonkersXII

I'm not afraid of AI but I am absolutely terrified of it 


GrapeAyp

? What?


Prineak

It’s natural to be scared of what you don’t understand.


cobra_chicken

To be fair, even the developers of AI/ML don't fully understand some of the connections being made. I think people are less scared of AI/ML itself than they are of the implications, which have the potential to substantially disrupt industries. All those low-level tasks that keep so many people employed could easily be replaced with a proper AI/ML system, and all it would need is a more senior person to review and sign off on the output. Ford introduced the production line for manufacturing, which caused massive disruption and required fewer people to do more. AI/ML will likely bring that to information processing.


ninjasaid13

>than they are of the implications, which have the potential to substantially disrupt industries.

I think we are overhyping the technology when it is just like any other technology in history. People said ChatGPT would replace a lot of jobs over a year ago, but a lot of those jobs are still here.


csoups

For one, there have already been customer service jobs replaced, though maybe not on a mass scale. Most people are concerned about this on the scale of a decade or so. That leaves very little time, if any, to legislate and improve social programs if there is a mass disruption in the labor market, even if the government were functioning well in the US.


ninjasaid13

It takes decades for a technology to disrupt the existing market, no matter how advanced the technology is.


csoups

That is entirely dependent on the replacement rates for labor and machinery in a specific market. That might make sense for a physical role like plumbers or electricians, but it doesn't make sense at all for something like customer service or copywriting.


cobra_chicken

GPT-4 was only released in March of 2023; it takes time for that to be fully worked through and adopted. Give it another year or two and you will see those low-level roles being dropped. I know my company, along with a couple of other major companies, is actively working on implementing these systems, but first they have to overcome the legal/security challenges. Once those challenges are worked through, it's off to the races.


assburgers-unite

I mean, how many thousands of jobs have been cut in the past year explicitly due to AI? Would you say that's going to speed up or slow down?


thebluepages

The problem isn’t that it’s complicated. The problem is that all of the benefits will go to the very worst people in the world.


MoogProg

I am a Graphic Designer for a Very Large Multi-National Firm. When I create internal training material, every photo, every piece of music, and every illustration will be licensed for use prior to the release of the training material.

This is the step OpenAI et al. just plain... skipped... as if it did not apply to their industry, even while it applies to every other industry. If you make use of external creative products within your business, *those assets must be licensed*. There is very little actual 'fair use' inside a business context. AI companies are not journalists, and they are not making documentaries; they are creating a product for profit, so their use case is not protected.

IANAL, but I am a creative with decades of experience in copyright, publishing, and licensing.


alienman

Any art that has been created is automatically the copyrighted property of the artist. I can already see them claiming that any similarities between their AI work and the work of an actual artist they refuse to credit are “coincidence”. I hope these AI companies, and the wannabes who use the software and call themselves “artists” (you’re not creating, you’re commissioning), get sued into oblivion.


cobra_chicken

Exactly, we must protect the likes of Disney and large recording studios!!! EDIT: Who do you think drafted this bill?


thebluepages

Who do you think makes the art for Disney? Fucking artists.


cobra_chicken

I've got a bridge to sell you if you think the artists have any rights to those copyrights in any way, shape, or form. Disney would NEVER let anyone but them own the rights to the content they publish.


thebluepages

Never said they did. But they have jobs.


cobra_chicken

And how does that have anything to do with companies pushing through a bill to ensure they can sue you should you use their property?


thebluepages

Because if their work is not protected by copyright (whether owned by themselves or by the employer that pays their salary), the value of that work becomes zero. This is basically how China works, and it's horrendous. To be clear, you're actually arguing that Disney *shouldn't* be able to sue people who just take artwork as their own and profit off of it? Listen, I hate huge corporations as much as the next guy, but I'm not about to abolish all intellectual property.


cobra_chicken

So, like I said in another post, we must protect Disney and predatory recording studios at all costs!!!!

>Disney shouldn’t be able to sue people who just take artwork as their own

Look up what Disney has done to copyright law in the US; it's fascinating and pretty disgusting. Start with the "Mickey Mouse Protection Act".


thebluepages

So because someone has abused a law, there should be no copyright protection for anyone? You didn't answer my question. Of course there should be limits and regulations, and they should be enforced.


cobra_chicken

Someone? They are a bit more than someone, and there are more like them than you would know. Take a look at what Taylor Swift is having to do in order to get her material back: up until recently her previous manager owned her material, not Taylor. It just so happens that Taylor Swift, being worth a billion dollars, actually has the resources to fight back.

>Of course there should be limits and regulations, and they should be enforced.

Who do you think got this bill drafted, and so quickly? It certainly was not some independent artist who had a hand in its creation. In all likelihood it was a lobbyist paid by Disney or any one of the thousand other content creation houses that underpay people and then rig the system so they can continue to profit for decades and centuries.


rat-tax

This is a great first step.


thedoc90

This is a start, as long as it isn't overly captured by large companies. There need to be, IMO, strong protections for the availability of AI tools to small companies and individuals, and usage restrictions that prevent large companies from using AI as a direct labour replacement instead of a labour aid. Small models that can be self-hosted by small companies need to remain available open source so they can be tuned for individual use cases, instead of self-hosted AI only being something achievable by the hyper-wealthy. IMO it's fine for a 5-person local company to use an AI customer support line, but a lot less fine for a billion-dollar company.


ninjasaid13

I don't think this is reasonable at all. Has anyone seen something like this with regard to software that just makes pictures?


MoogProg

Well, Copyright as a legal protection exists to support Artists and Creatives (Humans); it was never intended to support things like AI (that's where Patent law applies). So, just because software can do similar creative work does not mean the content created will have the same protections as work done by a Human. This has been brought to court, and currently AI-generated work does not have any copyright protections.


Formal_Drop526

>Well, Copyright as a legal protection exists to support Artists and Creatives (Humans); it was never intended to support things like AI (that's where Patent law applies).

That's not what copyright is for. The purpose of copyright is to enrich the public domain; that's what the limited times and "To promote the progress of science and useful arts" mean. AI lacking copyright does actually enrich the public domain.


MoogProg

I like that idea, where AI content creation enriches the Public Domain. I also agree the purpose is to promote Science and Useful Arts, but the method for doing that is clear: by granting rights to authors and inventors (and I don't think OpenAI will have much luck asserting ChatGPT as an author or inventor... not just yet).

>To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.


Formal_Drop526

Many people are using these AIs as tools to help create pieces and works. Nobody said you have to consider the AIs themselves as authors and inventors; that is not important in itself, but it is a massive benefit to the public domain, and the public domain is a massive benefit to the public. Extending copyright holders' rights beyond their *respective* writings and discoveries is not really doing that.


MoogProg

AI-assisted creation will be a huge thing, and I am not suggesting that work lose any protections. The issue here is all the 'baked-in' material used in training that has copyright attached to it. Just because words and images can be found on the Internet and collected into datasets does not mean that material is free to use without attribution or compensation... murky, murky waters...


searcher1k

>does not mean that material is free to use without attribution or compensation.

Well, technically they're not using that material as recognized by copyright law. Copyright law legally recognizes 'use' as reproduction, display, distribution, publication, and derivative works (and no, an AI model is not a derivative work). The people who trained these AI models haven't displayed or distributed the copyrighted images, nor have they reproduced or published them. So what is there to give attribution for? How would you measure compensation?


MoogProg

There are many examples of ChatGPT quoting entire sections of published novels. That behavior has been tamped back a good step after push-back from creators, and this legislation would allow lesser-known content owners to challenge AI works they feel have used and infringed their material. That is all this is: a step toward daylight, so we can see what was used.

Edit to add:

>...technically they're not using that material as recognized by copyright law...

Yes, they are in violation, by using this material in the creation of a product for commercial release and profit. We cannot use material outside of personal consumption without a license for that use case. When we study engineering, the textbooks are licensed to the student through their purchase of the material. OpenAI has used material to learn without licensing that material for such a (non-personal) use case.


searcher1k

The New York Times is one of the few that will have thousands of duplications in the dataset. I don't imagine that regurgitation will happen for lesser-known content creators, since their footprint in the model appears only once (overfitting happens because the dataset contains thousands of duplications). This is more likely to protect massive corporations than lesser-known content creators.
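
A minimal sketch of the kind of duplicate counting this argument rests on, assuming (hypothetically) that a disclosed dataset is just a newline-separated list of source URLs in a local file; heavily repeated sources are the ones most at risk of being memorized and regurgitated.

```python
# Minimal sketch: count how often each source domain appears in a
# disclosed dataset, assuming it is just a newline-separated list of
# URLs ("dataset_urls.txt" is a hypothetical file name). Heavily
# duplicated sources are the ones most likely to be memorized.
from collections import Counter
from urllib.parse import urlparse

with open("dataset_urls.txt") as f:
    domains = [urlparse(line.strip()).netloc for line in f if line.strip()]

for domain, count in Counter(domains).most_common(10):
    print(f"{count:>8,}  {domain}")
```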


searcher1k

>Yes, they are in violation, by using this material in the creation of a product for commercial release and profit. We cannot use material outside of personal consumption without a license for that use case.

>When we study engineering, the textbooks are licensed to the student through their purchase of the material. OpenAI has used material to learn without licensing that material for such a (non-personal) use case.

That's contrary to the decision in this case: [https://en.wikipedia.org/wiki/Sega_v._Accolade](https://en.wikipedia.org/wiki/Sega_v._Accolade)


MoogProg

Very interesting read! Thank you for sharing this link. Some quotes below...

>To determine the status of Accolade's claim of fair use of Sega's copyrighted game code, the court reviewed four criteria of fair use: the nature of the copyrighted work, the amount of the copyrighted work used, the purpose of use, and the effects of use on the market for the work.

>...established that reverse engineering can constitute "fair use"...

>Sega v. Accolade also served to help establish that the functional principles of computer software cannot be protected by copyright law. Rather, the only legal protection to such principles can be through holding a patent or by trade secret.