This bill reads like it was written by Adobe.
This provenance labelling scheme already exists, and Adobe was a major force behind it (see: https://en.wikipedia.org/wiki/Content_Authenticity_Initiative ). This bill would make further development tax-funded through organizations like DARPA.
Of course, they are also against fair use. They pay license fees for AI training, so for them this means more cash flow.
It’s rather more than that. At the very least, it is a DRM system meant to curtail fair use. We’re not just talking about AI training; the AutoTLDR bot here would also be affected. Manually copy/pasting articles while removing the metadata would become illegal. Platforms have a legal duty to stop copyright infringement, so in practice they would probably have to use the metadata labels to block reposts and re-uploads of images and articles.
This bill was obviously written by lobbyists for major corpos like Adobe. It wants to make the C2PA standard legally binding, which they have been working toward for the last couple of years. OpenAI already uses it.
At the very least, this bill will entrench the monopolies of the corporations behind it, at the expense of the rights of ordinary people.
I don’t think it’ll stop there. Look at the age-verification laws in various red states and around the world. Once a system like this is in place, the obvious next step is to demand mandatory content warnings in the metadata. And we’re not just talking about erotic images, but also about articles on LGBTQ matters.
More control over the flow of information is the direction we are heading anyway. From age verification to copyright enforcement, it’s all about making sure that only the right people can access certain information. Copyright used to be about which businesses could print a book; now it’s about what you can do at home with your own computer. We’re moving in this dystopian direction anyway, and this bill is a big step.
The bill talks about “provenance”. The ambition is literally a system to track where information comes from and how it is processed. If this were merely DRM, that would be bad enough. But this is an intentionally dystopian overreach.
E.g., you have cameras that automatically add the tracking data to all photos, and then Photoshop adds data about all post-processing. Obviously, this can’t be secure. (NB: This is real, not hypothetical.)
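The “can’t be secure” point is easy to demonstrate. Provenance labels of this kind ride along in the file as metadata (in PNG, for instance, metadata lives in ancillary chunks next to the pixel data), so any re-encode that keeps only the pixels silently drops them. A minimal stdlib-only sketch (function and file layout are illustrative, not from the bill or the C2PA spec):

```python
import struct

# Chunk types a PNG decoder actually needs; everything else
# (tEXt, eXIf, provenance manifests, ...) is ancillary metadata.
CRITICAL = {b"IHDR", b"PLTE", b"IDAT", b"IEND"}
PNG_SIG = b"\x89PNG\r\n\x1a\n"

def strip_png_metadata(data: bytes) -> bytes:
    """Return the PNG with every non-critical chunk removed."""
    assert data[:8] == PNG_SIG, "not a PNG"
    out = [PNG_SIG]
    pos = 8
    while pos < len(data):
        # Each chunk: 4-byte length, 4-byte type, payload, 4-byte CRC.
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        chunk = data[pos:pos + 12 + length]
        if ctype in CRITICAL:
            out.append(chunk)
        pos += 12 + length
    return b"".join(out)
```

The image still displays identically afterwards; only the label is gone. That is the whole problem with metadata-based provenance: it survives only as long as nobody re-saves the file.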
The thing is, a door lock isn’t secure either. It takes seconds to break down a door, or to break a window instead. The secret ingredient is surveillance and punishment: someone hears or sees something and calls the police. To make this ambition work, you’d need something at the hardware level in every device that can process and store data, plus a lot of surveillance to crack down on people who deal in illegal hardware.
I’m afraid this is not as crazy as it sounds. You may have heard about the recent “Chat Control” debate in the EU. That is a proposal, with a lot of support, that would let police scan the files on a phone to look for “child porn” (mind that this includes sexy selfies that 17-year-olds exchange with their friends). Mandatory watermarking that lets the government trace a photo back to the camera and its owner is mild by comparison.
The bill wants government agencies like DARPA to help develop better tracking systems. Nice for the corpos that they get some of that tax money, but it also creates a dynamic within the government that makes it much more likely we continue down this dystopian path: agency funding will be on the line, plus there are egos. Meanwhile, the content industry is still lobbying for more control over its intellectual “property”.