The people writing these regulations don't know what the problems of the future are. The problems of the present are tricky because every problem has a constituency (if it didn't, it wouldn't be a problem).
The EU is trying to get around this by acting quickly, before constituencies can develop. This will make no one happy. It's like my wife telling me that a car is going to hit me if I walk out the front door, so she's locking me inside. All I'll know is that I can't go for a walk. I might hear that the neighbor got hit by a car, but I'll also hear that they get to go for walks - so to the degree the strategy succeeds, it's also the degree to which it fails.
> ...remote biometric identification systems in publicly accessible spaces, biometric categorisation systems (e.g. categorizing by gender, race, ethnicity, citizenship status, religion, political orientation) and the use of AI for predictive policing.
> AI systems which can influence voters in political campaigns and by use of suggestion systems on very large platforms...
> New transparency and risk assessment requirements for providers of (generative) foundation models like GPT.
> Clarified exemptions for research.
Putting these kinds of restrictions in place is absolutely a good thing. While they might not get everything right, this is a step in the right direction. Our laws and our understanding as a society have been lagging behind technological development for decades now. That has enabled a large amount of exploitation to take place, which (in the last decade especially) has had a large hand in massively undermining our democracies.
> New transparency and risk assessment requirements for providers of (generative) foundation models like GPT
This is absurd. For a relatively small sum in the grand scheme of things, I could rent a few A100s, download a free dataset, and train a model like LLaMA 30B, which is comparable to GPT-3 (and indeed such efforts are already popping up). Such a law could potentially make it illegal to upload such a thing if you live in the EU without going through a potentially expensive and bureaucratic process. It will completely stifle AI development, the same way that requiring people to go through a pile of paperwork to upload a new library would stifle web development.
Did you actually read the proposed text of the AI act? I certainly want more oversight over a medical startup or airplane manufacturer doing AI in the essential system components than otherwise. I think adding a high-risk category is a brilliant move.
The problem with this type of regulation is that you need to add a huge amount of vagueness to the law to make it future-proof, which leads to huge amounts of uncertainty for companies, unnecessary red tape, and higher legal fees.
As well as giving a judge the power to punish companies if public sentiment turns negative.
Laws can be iterated on. Uncertainty, red tape, and legal fees all seem like worthwhile brakes on a potentially dangerous industry. Better safe than sorry.
For Americans, this concept seems almost alien, given the (at least from a European POV) more or less constant gridlock between the House, the Senate, the Presidency, and whatever the 50 states make of that when it comes to enforcement.
Or, to put it differently, they prefer the Wild West and barely self-regulated markets because they have completely lost any trust in government's ability to create and modernize laws - a viewpoint that does make sense given the ridiculous age of key players in Congress. Feinstein is 90 years old, both likely Presidential candidates are over 75, and the Senators' median age is 65. How can anyone expect these people to even understand modern issues?!
I think this is also why so many American companies have failed or had massive difficulties entering the European market. They simply cannot conceive that other countries have governments that actually govern, and regulatory agencies that don't take it well when foreign companies try to buy their way out of trouble.
Eventually though it leads to a situation where no one has any trust left in government, which is extremely dangerous from a democracy perspective - it breeds resentment, splintering/secession and people taking the law into their own hands (or to put it bluntly, shooting at everything they deem a threat - including children playing hide-and-seek [1]).
That's also the reason why there are so many doomsday preppers in the USA vs. everywhere else on the planet that isn't an active warzone. These people simply don't trust the government to keep them alive in a time of crisis.
> extremely dangerous from a democracy perspective
How is it extremely dangerous? Over a hundred million people died in the 20th century from trusting their government too much, nothing compares to that.
Simple: if enough people do not trust the government and do not vote, the government loses its democratic legitimacy - and fringe extremists gain ever more power. What happened to the Republican Party should ring all alarm bells: the Bush era was bad enough, but look at just how far the moderates have been pushed out of the party since then. The fact that the current front-runner for the 2024 GOP Presidential nomination is a man who was twice impeached and (for now at least) found liable for sexual abuse, or that a complete fraud (George Santos, if that even is his legal name?!) could gain a seat in Congress, is worrying. Where have all the people gone who would have said, "no, we want someone who can at least behave in a somewhat decent manner worthy of the office"?
The alternative can be seen in France: many have voted Macron purely because he was (and is) better than le Pen and the other parties have all but eroded - and now the country is embroiled in riots because, surprise, the population didn't vote for this shit of a pension reform: they voted to simply not have a fascist in office.
> Over a hundred million people died in the 20th century from trusting their government too much, nothing compares to that.
Hitler's rise to power was precisely the other way around - driven mainly by the exploding inflation after WW1, an economy hampered by reparations, and the subsequent loss of trust in democracy and the government. The people flocked to Hitler because he ran on a platform of scapegoating: blame the "rich Jewish elites" and claim that their elimination would save the people.
The most troubling thing for me is just how many parallels the rise of Hitler has with our current economic situation. Rampant inflation and explosion of costs of living, government budgets strained by the combined cost of massive economic crises (2008ff financial crisis, euro crisis, migration crisis, COVID, Russian invasion), external enemies to rally the people behind (China), a loss of trust in democracy accompanied by a world-wide rise of charismatic strongmen (Trump, Putin, Erdogan, Bolsonaro, Xi, Salvini/Meloni), lies and propaganda running unchecked, open violence in the streets... history is repeating itself, right as the last survivors of the 1933-1945 era have died - and those few that are still alive have kept sounding the alarm for years now without being heard.
The US has more of a culture of self-sufficiency. This isn't necessarily a bad thing. It also has higher incomes that allow for such frivolous spending.
> It also has higher incomes that allow for such types of frivolous spending
We have the same net income as the Americans in Europe (ludicrous tech salaries aside); we simply pay collectively, through our taxes, for things that Americans have to pay for on their own - first and foremost healthcare and retirement.
Yeah changing and updating laws sucks everywhere. I'm sure some places are worse or better. But most democracies are set up on purpose to frustrate the process.
It's hard to find the line with "better safe than sorry".
On the surface level you are right.
But look at the cost of medicine development: it has risen exponentially, to billions, over the last decades, stifling innovation. And although it may have gotten a bit safer, it didn't get exponentially safer.
TÜV certification is the red tape I want - just as people in North America respect appliances with the UL marking.
But regarding the uncertainty, I agree with you. And I guess, it's inevitable given the pace of change and innovation. I am expecting more ISO standards to be created and updated in response to the AI Act, which will limit the uncertainty to some extent.
Laws can change, but if hard damage is done, there is usually nothing you can do about it.
AI at the moment is moving fast and recklessly, and we already have the first waves of victims. Putting a lid on it and slowing it down seems reasonable, even if it won't be a perfect solution. The interesting part is that this is pretty similar to the situation in 2020, when the pandemic started. Nobody knew exactly what was coming or how to navigate it, but everyone tried their best to survive what everyone saw unfolding on a global scale.
"Self-driving" cars have already killed several people, and nobody can deny that they are sold as "modern AI systems" by the companies that produce them.
I guess you meant to say something like "our first wave" rather than "no first wave"?
> every problem [in the present] has a constituency
The internet as a means of distribution is not exactly new; plenty of legislation at various levels has been dealing with that fact for a while - for instance the GDPR.
"The people writing these regulations" indeed cannot predict the future, but that is obvious. It should not serve as an excuse for doing nothing, though.