I don't think number of parallel agents is the right productivity metric, or at least you need to account for agent efficiency.
Imagine a superhuman agent that does not need to run in endless loops. It could generate a 100k-line codebase in a few minutes, or ship smaller features in seconds.
In a way, the inefficiency is what drives people to parallelism. There is only room for it because the agents are slow; perhaps the more inefficient and slower the individual agents are, the more parallel we can be.
Yeah, I don't disagree with your assessment at all. I think the H2A ratio is still a good metric for the AI adoption rate of an organization. At a higher H2A ratio, you will also start to hear people measure things in token volumes, which I think is a similar metric (since most models nowadays run at a relatively fixed tokens/second).
None of this is a direct signal of a productivity boost. I think at higher volumes you will need to start accounting for the "yield" rate of those token volumes: what fraction of the tokens generated makes it to the final production deployment? At which stage is the yield constrained? Is it the models, the harness, or something else (code review, CI/CD, security scans, etc.)? It then becomes an optimization problem: reduce the cost of goods sold while improving or maintaining revenue. "Productivity" then dissolves into multiple separate but more tangible metrics.
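As a toy illustration of that yield idea, you can model the pipeline as a chain of pass rates and multiply them. All stage names and numbers below are made up for the sake of the sketch, not real data:

```python
# Toy model of token "yield": what fraction of generated tokens
# survives each stage of the pipeline to reach production.
# Stage names and pass rates are hypothetical.
stages = [
    ("model output",     1.00),  # everything the agents generate
    ("harness/retries",  0.60),  # fraction surviving loops and retries
    ("code review",      0.80),  # fraction surviving human review
    ("CI/CD + security", 0.90),  # fraction passing automated gates
]

def token_yield(stages):
    """Multiply per-stage pass rates into an end-to-end yield."""
    y = 1.0
    for _, rate in stages:
        y *= rate
    return y

total = token_yield(stages)
print(f"end-to-end yield: {total:.0%}")  # 1.0 * 0.6 * 0.8 * 0.9 = 43%
```

The point of framing it this way is that the stage with the lowest pass rate is the one worth optimizing first, exactly like yield analysis in manufacturing.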
> Parents can’t help but worry about how a lack of AI preparedness will affect their kids’ future career prospects.
The parents I've interacted with who have this worry tend to be the least tech/AI literate. It reminds me of the pressure some parents in my non-English-speaking country place on kids and English teachers. It's usually the parents who don't understand English well themselves and are full of anxiety about it.
Realistically, it's been just over 3 years since ChatGPT arrived, and the paradigms shift every year. All of us had to learn on the go as adults, in a short time, and we did. So why worry so much about kids missing out?
UMA's security model assumes the cost to corrupt the oracle exceeds the profit from corruption. It is quite interesting because it doesn't consider the Polymarket side at all in the calculation.
Doesn't this whole model break down when the Polymarket market far exceeds UMA's market cap?
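That concern can be stated as a simple break-even inequality. A rough sketch (the function and all numbers are hypothetical illustrations, not UMA's actual mechanism or real market figures):

```python
# Toy break-even check for an optimistic oracle's security assumption:
# corruption is unprofitable only while cost_to_corrupt > attacker_profit.
def oracle_secure(cost_to_corrupt: float, attacker_profit: float) -> bool:
    """True if corrupting the oracle costs more than it pays."""
    return cost_to_corrupt > attacker_profit

# If the attacker's profit scales with the size of the markets the oracle
# settles, the assumption can flip as those markets grow past the cost
# of corruption (here fixed at a made-up 100 units):
cost = 100.0
for market_size in (50.0, 100.0, 500.0):
    print(market_size, oracle_secure(cost, market_size))
```

The sketch just makes the failure mode explicit: a fixed corruption cost against an unbounded external market size means the inequality eventually fails.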
In the last 90 days France's CO2 footprint is at 78% of Iceland's.
Also, what lessons from Iceland, Norway or Albania should we apply in central Europe? We don't have their geothermal and hydro potential (none of your examples run primarily on solar+wind; they are primarily hydro).
Right at this moment, Germany's electricity mix has a carbon intensity of 364 gCO2eq/kWh; France is at 21. That is because 37% of Germany's production comes from gas and coal.
Even from an environmental standpoint, France is doing much better than Germany and that is thanks to nuclear.
Also, by closing operating power plants, Germany weakened European energy production at the time when we geopolitically need it the most.
Still not Habeck's legacy. This was decided under Merkel.
Besides, CO2eq is often measured wrongly for nuclear energy, ignoring construction emissions and the effect on water temperature in rivers (every summer more and more plants need to shut down because of this), etc. Even if this were measured correctly, there are again and again longer periods where Germany exports energy to France because their reactors are often in maintenance. In the end, I would say it is always a bad idea to rely on only one technology to a large degree. Only a well-done mix makes you resilient.
I agree on this point; my not being anti-nuclear doesn't mean I am anti-wind or anti-solar. Every country has different circumstances: I live in a landlocked country with mild mountains, a temperate climate and modest rivers. In our case, nuclear energy seems like the most reliable and scalable option. For countries with a long coastline, offshore wind absolutely makes sense, and similarly with solar.
> Besides, CO2eq are often wrongly measured with nuclear energy, ignoring building emissions.
I think this point is overestimated. Based on a brief search, studies put nuclear carbon intensity at around 6-12 gCO2eq/kWh, with construction accounting for only about 13% of total lifetime emissions [1].
> there are again and again longer periods where Germany exports energy to France because their reactors are often in maintenance.
Valid point, but the 2022 French nuclear "disaster" hasn't repeated at that scale so far. In recent years France has been a net exporter to Germany. I can imagine that, as with many problems in renewables that turned out to have technical solutions, the water temperature problem is also solvable technically.
How strong is the correlation between Germany's energy exports and the status of France's nuclear reactors? From a casual look, exports seem to be correlated with weather and access to excess energy, rather than with demand from neighboring countries. I would like to see some support for "Germany exports energy to France because their reactors are often in maintenance", rather than Germany exporting energy to France because windy or sunny weather is producing excess energy.
Looking at Denmark, their exports and imports have little correlation with neighboring countries' demand or supply. They will try to export when the wind farms produce excess energy, and try to import when demand exceeds supply. Neighboring countries' demand will mostly only affect the export price.
> Also, by closing operating power plants, Germany weakened European energy production at the time when we geopolitically need it the most.
Ironically, it was France that needed to shut down a lot of its nuclear reactors in 2022 and 2023 for repairs. So by your own logic, France "weakened European energy production at the time when we geopolitically need it the most."
Here in Switzerland, the reason given for the "energy crisis" was also mostly France, as Switzerland usually imports a lot of energy from France.
It's tricky to use statistics for personal decisions. Something might hold in general but not for your specific subgroup. I know many people who changed for the worse.
If you are in a bad position, then change; but if you like the company and role, don't take it for granted, and think carefully.
This advice is consistent with the broad statistic if more than half of the sample is currently in a "bad position".
Since we are talking about BigTech, to a first approximation I can't imagine any IC up to and including senior, or any low-level manager, being at a BigTech company for any reason besides wanting to maximize their income via cash and RSUs.
Does anyone stay in the same position/team for more than two or three years even at the same company?