> You can't spit out a black box that no one knows what it does and how it was trained?
What if I want to? Who on earth has the ethical authority to claim the right to control my ability to release a set of weights? Only what is done with those weights should be legislated, and extremely conservatively.
Maybe a lot of software developers got into this career for the money, or because they like solving problems. But for me, politics are inseparable from software engineering. Politics are why I devoted myself to the craft. This is the exact kind of situation in which my knowledge and skill become tools of protest.
I don't think this law is meant to stop these things at all. It's meant to make sure that companies and governments who do this and use models as an excuse go free and have someone to blame if they get caught.
> Do you also complain about so many other things that "unethically curb your abilities"?
You curiously omitted the last part of the sentence: "...to release a set of weights". Please don't pretend that I was speaking about anything else, and don't overgeneralize my statements; that becomes a straw man argument. Believe it or not, some laws are unethical. I am happy to provide examples.
It has to be judged on a case-by-case basis, by evaluating the overall impact on human rights for all parties involved.
The ideal scenario is one where all rights are preserved under good faith, and publishing/owning models is treated no differently from any other software project, while the actual use of such software continues to be subject to existing laws. In this case, we can strengthen consumer rights without weakening developer rights.
> Ah yes, let's regulate this black box with no insight into what it does, and only guess at its possible outcome. Whatever can go wrong?
I specifically said we should not be legislating weights, so I'm confused about which point you are trying to make. Weights are the black box. Company policy, employee behavior, and business logic are not, and are accessible for scrutiny by the courts if needed. So no, let's not regulate the existence of software, which sets an incredibly dark precedent for digital sovereignty.
> while the actual use of such software continues to be subject to existing laws. In this case, can strengthen consumer rights without weakening developer rights.
Emphasis mine
--- start quote ---
ANNEX IV
TECHNICAL DOCUMENTATION referred to in Article 11(1)
...
2. A detailed description of the elements of the AI system and of the process for its development, including:
...
where relevant, the data requirements in terms of datasheets describing the training methodologies and techniques and the training data sets used, including information about the provenance of those data sets, their scope and main characteristics; how the data was obtained and selected; labelling procedures (e.g. for supervised learning), data cleaning methodologies (e.g. outliers detection);
--- end quote ---
So let's see Article 11(1)
--- start quote ---
The technical documentation of a high-risk AI system shall be drawn up before that system is placed on the market or put into service and shall be kept up-to date.
--- end quote ---
I've made no claims as to the nature of the law, I've only asked questions about particulars and responded to the answers. If anyone is spreading FUD, it isn't me.