That this even has to be clarified is just a joke.
I mean, if I buy an Oscar statue off eBay, how many Oscars do I have? Zero. I have an Oscar statue. But no one cares how many Oscar statues actors get. They care how many times they were awarded one.
Wasn't there a similar story about some bravery medal (the Purple Heart)?
While they do, 2025 was also the first year that coal's share of power generation dropped in both China and India: by 3% in India and by 1.6% in China. So they are building out fossil fuels, but they are building out non-coal power faster. China also hopes to peak coal use in absolute terms by 2030. That's something, at least.
...which is why, when comparing countries, per-capita emissions is the right measure for judging which countries are doing a better or worse job of addressing the problem.
Also, I think for 99% of Wikipedia there isn't much need to worry about bias. It's about an uncontroversial chemical compound, a tiny village, a family of bacteria, and so on. Knowledge isn't all subjective and prone to bias.
Aren't there scenarios where a C compiler (without assistance from the developer) must be defensive about aliasing in a way that the Rust compiler need not be?
I guess you could argue that C would reach the same speed because noalias is part of C as well. But I'd say the interesting competition is over how fast idiomatic, non-hand-optimized (no unrolling, no aliasing hints, etc.) code is.
Comparing programming languages' performance only makes sense when comparing idiomatic code. You could argue that noalias in C is idiomatic, but you could equally well argue that multithreading in Rust is more idiomatic than it is in C, and so on. That's where it becomes interesting (and difficult) to quantify.
Yes. Your point about noalias is right (the keyword is `restrict` in C; `noalias` is the LLVM IR annotation).
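A minimal sketch of why Rust gets this for free: a `&mut` reference is statically guaranteed to be unique, so the compiler can emit `noalias` for it without any hint from the developer, whereas in C the equivalent function would need `restrict` on its parameters. The function name here is made up for illustration.

```rust
// `dst` is &mut (exclusive) and `src` is a shared borrow; Rust's borrow
// rules guarantee they cannot point at the same memory, so the compiler
// may cache *src in a register across the writes to *dst.
fn add_twice(dst: &mut i32, src: &i32) {
    *dst += *src;
    *dst += *src;
}

fn main() {
    let mut a = 1;
    let b = 10;
    add_twice(&mut a, &b);
    println!("{}", a); // 21

    // add_twice(&mut a, &a); // rejected at compile time: `a` would be
    // borrowed mutably and immutably at once, i.e. the aliasing case
    // a C compiler must pessimistically assume simply cannot be written.
}
```

The C version without `restrict` has to reload `*src` after each store to `*dst`, since the caller might legally pass the same address for both.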
What I will say is that the fact that Rust uses this so much, and had to turn it off because of all the bugs it shook out, at least implies that it's not used very much in real-world C code. I don't know how to more scientifically analyze that, though.
I don't think _software_ is very interesting in the whole AI debate. It's perhaps interesting from a jobs or economic perspective. But the whole "anti AI" thing is much deeper than that. My main objections to AI are the evaporation of truth, and of art.
We now have chart-topping hits that are soulless AI songs. It's perhaps a testament to the fact that some of the genres where this happens a lot were already trending towards industrially produced songs with little soul in them (you know what genres these are, and it's hilarious that one of them). But most concerning to me is the idea that, starting now, we'll never be able to trust our eyes about what's true.
We can't trust that someone who calls us is human, or that a photo or recording is of a real event. This was always true in some sense, but at least it used to require a ton of effort to pull off. Now it's going to be trivial. And for every photo depicting an actual event, there will be a thousand depicting non-events. What does that do to the most important thing we have as a society: the "shared truth"? The decay of traditional media already put a big dent in this - with catastrophic results. AI will make it 10x worse.
The browser versions aren't as good as the desktop versions. And Google's alternatives aren't as good as Microsoft's. Both do 60% of the job, which is probably enough for 80% of the people.
I wouldn't worry much about operating systems. They are just a means to an end (which is running your applications). Few are in the situation where 100% of their applications run on more than one OS, so few even get a choice of OS.
>They certainly know that you dont want to process garbage data from freed memory.
It depends on what you mean by "freed". Can one write a custom allocator in Rust? How does one handle reading from special addresses that represent hardware? In both of these scenarios, one might read from or write to memory that is not obviously allocated.
Both of those things can be done in Rust, but not in safe Rust: you have to use unsafe APIs that don't check lifetimes at compile time. Safe Rust assumes a clear distinction between memory allocations that are still live and those that have been deallocated, and that you never want to access the latter, which of course is true for most applications.
You can indeed write custom allocators, and you can read from or write to special addresses. The former will usually, and the latter will always, require some use of `unsafe` in order to declare to the compiler: "I have verified that the rules of ownership and borrowing are respected in this block of code".
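A minimal sketch of the first case: a bump allocator that hands out raw pointers into a fixed buffer. The type and method names are made up for illustration; a production allocator would implement the `std::alloc::GlobalAlloc` trait instead. The `unsafe` blocks are exactly where the author's point applies: we, not the compiler, vouch that the pointer arithmetic stays in bounds and that each pointer is used exclusively.

```rust
// Illustrative bump allocator over a fixed buffer (not production code).
struct BumpAlloc {
    buf: [u8; 1024],
    next: usize, // offset of the next free byte
}

impl BumpAlloc {
    fn new() -> Self {
        BumpAlloc { buf: [0; 1024], next: 0 }
    }

    // Hand out `size` bytes, or None if the buffer is exhausted.
    fn alloc(&mut self, size: usize) -> Option<*mut u8> {
        if self.next + size > self.buf.len() {
            return None;
        }
        let p = unsafe {
            // SAFETY: next + size <= buf.len(), so the offset stays
            // inside the buffer.
            self.buf.as_mut_ptr().add(self.next)
        };
        self.next += size;
        Some(p)
    }
}

fn main() {
    let mut a = BumpAlloc::new();
    let p = a.alloc(4).expect("out of memory");
    unsafe {
        // SAFETY: p points at 4 valid bytes that nothing else aliases.
        p.write(42);
        println!("{}", p.read()); // 42
    }
}
```

Note that the borrow checker no longer tracks what `p` points at: freeing and reusing the buffer while `p` is live would be a use-after-free the compiler cannot see, which is precisely the responsibility `unsafe` shifts onto the programmer.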
I think this, as with many user interfaces, comes down to the use case.
A rarely used UI needs to be easy to navigate. Remove clutter, place the often-used features front and center, and put the rarely used features behind multiple navigation steps. The user primarily _navigates_ this UI; they don't _memorize_ it.
A constantly used UI, such as an application a professional uses from 9 to 5, five days a week (an IDE, a CAD program, a video editing thing), is a completely different beast. The speed of accessing a feature is more important than discoverability. The user internalizes the UI, and the UI needs to aid the user in doing so. Icons in menus mean the user eventually doesn't need to read the text labels.
Then their content can just go away tbh. This isn't some big ethical dilemma either.
Either find a way to make content that doesn't rely on ads, or stop making content. If the whole ad-funded internet disappeared tomorrow morning, would it really matter?