Studies are extremely difficult to get right. I'm generally a little bit skeptical of data for this reason.
A bit of a tangent, but still on the subject of environmental pollution: the other day I found out that CO2 sensor sensitivity naturally drifts over time... So when a CO2 sensor is replaced for long-term climate research, if the new sensor is calibrated against the old one at the time of replacement, the accumulated drift is carried over into the new sensor even if no real change in CO2 occurred... Apparently there are calibration standards to prevent this, but mistakes have been identified multiple times in the methodology for setting those standards... Anyway, measuring data accurately is really hard.
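To make the carry-over concrete, here's a toy sketch of the failure mode described above. All the numbers (drift rate, true concentration) are made up for illustration; this isn't how real reference-gas calibration is specified.

```python
# Hypothetical sketch: calibrating a new sensor against a drifted old one
# carries the old sensor's drift forward. Numbers are illustrative only.

def drifted_reading(true_ppm, drift_per_year, years_in_service):
    """Reading from an aging sensor: true value plus accumulated drift."""
    return true_ppm + drift_per_year * years_in_service

true_co2 = 400.0  # actual concentration, unchanged throughout

# Old sensor after 5 years with a +0.4 ppm/year drift
old_reading = drifted_reading(true_co2, 0.4, 5)

# Naive replacement: match the new sensor to the old one at swap-over,
# instead of against an absolute reference standard.
calibration_offset = old_reading - true_co2   # silently inherits 2.0 ppm of drift
new_reading = true_co2 + calibration_offset   # reads high, though nothing changed

# Correct replacement: calibrate against a reference gas of known
# concentration, so the new sensor starts with zero inherited offset.
reference_calibrated_reading = true_co2
```

The new sensor reports the same spurious +2 ppm as the old one, which is exactly the kind of artifact a long-term record can't distinguish from a real trend unless the calibration chain goes back to an absolute standard.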
What I'm hoping for is more competition in the tech sector. I'm tired of companies foisting Microsoft or Oracle products on everyone! WTF! The current tech sector feels like all companies are subsidiaries of Big Tech... It's likely a direct result of passive investing: anyone with money who controls a small or medium-sized company probably owns stock in Microsoft, Apple, Meta, Google, and Amazon, so they mandate that their companies use products from those Big Tech firms. All the small-fish founders feel like they're dogfooding their own investments... And that's preventing new entrants from getting a foothold in the B2B space... It feels like all the small companies are working for Big Tech.
Conflicts of interest are the norm. It should be illegal for a company founder or director to own stock in a supplier. It should be illegal for shareholders to own stock in two competing companies. Index funds should be illegal.
Unfortunately for me, I believe that the algorithms won't allow me to get exposure for my work no matter how good it is so there is literally no benefit for me to do open source. Though I would love to, I'm not in a position to work for free. Exposure is required to monetize open source. It has to reach a certain scale of adoption.
The worst part is building something open source, getting positive feedback, helping a couple of startups and then some big corporation comes along and implements a similar product and then everyone gets forced by their bosses to use the corporate product against their will and people eventually forget your product exists because there are no high-paying jobs allowing people to use it.
With hindsight, Open Source is basically a con for corporations to get free labor. When you make software free for everyone, really you're just making it free for corporations to Embrace, Extend, Extinguish... They invest a huge amount of effort to suppress the sources of the ideas.
Our entire system is heavily optimized for decoupling products from their makers. We have almost no idea who is making any of the products we buy. I believe there is a reason for that. Open source is no different.
When we lived in caves, everyone in the tribe knew who caught the fish or who speared the buffalo. They would rightly get credit. Now it's like this: because none of the rich people are doing any useful work, they can only maintain credibility by obfuscating the source of the products we buy. They do nothing but control stuff. Controlling stuff does not add value. Once a process is organized, additional control only serves to destroy value through rent extraction.
Wow. This is theft. Should be illegal! It's like if I own a vault storage business and I am keeping other people's gold in my vaults and then I just take all the gold for myself and claim that the customers should have opted out of me stealing their gold but they missed the deadline...
This hints at something that, in my opinion, isn't discussed enough:
Say some personal data leaked into training data: where can I request surgical deletion of that data from the LLM? LLMs are used not only for license washing, but also for PII washing and consent ignoring. How will a service provider make sure personal data never ends up in the training set, and fix earlier mistakes involving personal data? Aren't they obliged to have a way of deleting one's personal data? GDPR or something?
Well corporate stocks have the same dynamic. People are banding together and manipulating reality in unproductive ways to make their stocks go up.
Countries literally go to war so that weapon dealers can sell weapons and banks can later sell loans to rebuild the country.
The real problem is centralization of power.
Within the horrible context of the current situation, it's at least a good thing that there is a force driving outcomes that seem random, as opposed to outcomes driven purely by money. It democratizes the horrors a bit, so that rich people living in corporate-stock lala land get a slight taste of the negatives of large-scale vested interests collaborating towards dystopian outcomes.
That said, a better solution would be to shut down all public markets and companies.
My view is that if a company is so well recognized that government officials can reference it by name in Congress, then that company should be shut down automatically. We know it didn't get there by economic efficiency... It almost certainly got there by voting and sociopolitical manipulation.
You can't shut down betting markets without shutting down the public stock markets because they are betting markets themselves.
Yes this is a great point. The great irony of the tech sector is that although tech creates efficiencies, the process by which tech is created is itself comically inefficient.
Almost nobody, least of all those working for government, actually looks at a complex, expensive solution and says "We should simplify this and make it cheaper." The government is paying for a LOT of unnecessary complexity. I would say that's most of the cost of essentially every tech project the government funds.
Reminds me of that 3-panel meme about Starlink boosters showing how the design was simplified over time. This is the exception that proves the rule.
A lot of what you see being removed was just test sensors. The same happens in every engineering program, but nobody else pretends that it's somehow innovation.
It's like removing test code when you ship a binary.
I don't agree that it's not innovation. It always looks stupidly simple with hindsight to just remove unnecessary complexity, and yet it's extremely rare to see a team which actually does it right on the first go.
Getting the design right the first time requires vision, foresight as well as a deep understanding of all relevant parts and priorities. Very few people can do it without hindsight.
I'm an experienced software engineer and team lead who has worked on a range of big, complex projects over almost two decades, and my experience with every single project (for which I wasn't the team lead) was that it was way over-engineered. At least 95% of the time was spent fixing unnecessary intermediate technical issues the team had created for itself.
Even the sensor argument... Do you need so many sensors, monitoring systems, and fallback mechanisms if every part of the system was designed to work within the simplest necessary constraints to begin with? My experience is that the answer is almost always no. Once you accept that your design is flawed and needs runtime monitoring and fallbacks, any patch you add on top to correct the flaws provides tiny diminishing returns, if any. Often, the additional complexity actually makes it more likely that your core mechanisms will fail.
The safety mechanisms only end up making themselves useful by increasing the likelihood of failure in the first place.
My view on fallback mechanisms is that, in the event of failure of the main system, they shouldn't be so complex as to try to keep the system running as if nothing had happened; they should just provide graceful failure, and sometimes they aren't needed at all: just an error log is enough.
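A minimal sketch of that philosophy, with made-up names (`process`, `run_main` and the `worker` logger are all hypothetical): on failure, log the error and return a clear "no result" to the caller, rather than taking a second complex code path that pretends everything is fine.

```python
import logging

logger = logging.getLogger("worker")

def process(job, run_main):
    """Run the main path for a job; on failure, fail gracefully.

    The fallback is deliberately trivial: log the exception with its
    traceback and surface the failure, instead of masking it.
    """
    try:
        return run_main(job)
    except Exception:
        # Graceful failure: an error log plus an honest 'no result'.
        logger.exception("main path failed for job %r", job)
        return None  # caller sees the failure, not a fake success
```

The caller gets either a real result or `None`, and the log carries the diagnostics; there is no retry tree or degraded-mode state machine to become a failure source of its own.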
Microsoft has been butchering software development for decades and maintaining dominance through pure business, legal and government connections. It's become like Oracle.
Developers being forced to use horrible Microsoft products is the logical consequence of that.
As a software engineer, most of my job exists to give credibility to the narrative that Microsoft is useful... And I don't even work for Microsoft. It's clear that there are deals behind the scenes which force many large companies into Microsoft contracts. The engineers have to work with what they get and pretend the tech is OK but behind the facade, it's clear from the jokes on the Microsoft Teams chats that they think differently!
Disturbingly, AI is set to replace essentially any position that is useful, to the extent that it is useful, and somehow some people still think they should adapt themselves to the system instead of working to adapt the system to them!
Basically all that would be left of desk jobs would be those with unfair legal powers (including via licenses and credentials) or those that are pure accountability plays: politicians, lawyers, aircraft pilots, corporate accountants... And those jobs will suck, because people will be accountable for work that is not their own.
These jobs won't require any skills, because most people may be able to go through their entire career without doing any work. But they will get paid a lot just for having been selected for their position... While other people who may be more skilled might be broke and homeless.
And yet someone has to actually tell the AI what to create. There's just no avoiding this.
Anyway, before this AI doomerism can become reality, AI first needs a breakthrough in genuine understanding so it stops making stupid mistakes. Imitation will always remain imitation.
There must be, e.g., an understanding of causality and reasoning on the same level as ours, not the useless "You're absolutely right" you get now when you point out its mistakes.
>And yet someone has to actually tell the AI what to create. There's just no avoiding this.
Yes there is, just stop creating. Or take a page from biology, and use random mutation and natural selection to iterate on useful novel functions.
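The biology idea above can be sketched as a toy (1+1) evolutionary loop: random mutation proposes variants, selection keeps the better one, and no one specifies the design up front. The bit-counting objective here is an invented stand-in for "useful novel function".

```python
import random

random.seed(0)  # deterministic for illustration

def fitness(bits):
    """Toy objective: count of 1s stands in for 'usefulness'."""
    return sum(bits)

def mutate(bits, rate=0.05):
    """Random mutation: flip each bit independently with small probability."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

# Minimal mutation-plus-selection loop: start from an 'empty' genome
# and iterate without any designer telling it what to create.
parent = [0] * 32
for _ in range(2000):
    child = mutate(parent)
    if fitness(child) >= fitness(parent):  # selection keeps the better variant
        parent = child
```

After the loop, `parent` has drifted toward the all-ones optimum purely through variation and selection, which is the point: the "telling" is replaced by a fitness criterion.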
Honestly, once AI takes all the jobs, it's game over; why iterate on anything else? Planet captured, humanity hunted down to the last bands of troglodytes holding out in the wilderness. It would be strongly against the AI's interest to just assume we'd starve quietly.
Every joule of human energy is energy that could have been better spent to produce AI slop for other AI agents to consume.