
Yes, they are chips with one GPU core fused off. I came across a die shot on the internet[0], and the GPU cores look huge. By rough calculation, I estimated that the GPU cores together take up about 15.5% of the die area. I don't know much about photolithography, but if roughly that percentage of single-defect dies have the defect land in a single GPU core, it's actually pretty surprising that Apple can get enough of these "rare draw" chips to build and ship a real product.

If the shortage continues, I would expect them to start using fully functional A18 Pro chips with one GPU core disabled in software. It kind of reminds me of the AMD Athlon days, when users could use a pencil to reconnect the bridges and unlock the multiplier.

[0]: https://chipwise.tech/our-portfolio/apple-a18-a18-pro-die-sh...
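The salvage math above can be sketched with a Poisson defect model. All the numbers here are assumptions for illustration: the 15.5% GPU-area share is the rough estimate from the die shot, while the die size and defect density are placeholder guesses, not known N3E figures.

```python
import math

die_area_cm2 = 1.05      # assumed die area (~105 mm^2), illustrative
defects_per_cm2 = 0.5    # assumed defect density, illustrative
gpu_fraction = 0.155     # GPU cores' share of die area (rough estimate above)

lam = die_area_cm2 * defects_per_cm2   # expected defects per die
p_clean = math.exp(-lam)               # dies with zero defects
p_one = lam * math.exp(-lam)           # dies with exactly one defect

# A one-defect die becomes a "one GPU core off" part only if the
# defect happens to land inside a GPU core:
p_salvage = p_one * gpu_fraction

print(f"clean dies:      {p_clean:.1%}")
print(f"one-defect dies: {p_one:.1%}")
print(f"salvageable:     {p_salvage:.1%}")
```

With these made-up inputs only a few percent of all dies come out as the salvageable kind, which is why relying on them for a whole product line would be a tight supply.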


I don't see my first GPU on there: the humble GeForce4 MX440. It ran almost any game I cared about for a surprisingly long time, even if it wasn't a truly modern card. These days almost all my machines run on iGPUs baked into the CPU. A lot less fun for me, but a lot more compact at least.


The GeForce 4 generation as a whole, while solid enough, was historically uninteresting. The cards were just basic spec bumps over the GeForce 3, with no new features or anything similar. And, critically, the Radeon 9700 Pro released the same year as the GeForce 4 and absolutely smoked the living shit out of it.


The MX440 finally let players of id Tech 3 games play at high frame rates. I remember this card being all the rage in pro gaming circles back then for exactly that reason.


The MX440 was an entry-level budget card? If it was all the rage in pro gaming circles at the time, that's really just a reflection of how poor pro gamers were back then rather than anything to do with the MX440 being particularly noteworthy. In fact, looking back at old reviews, it was if anything a flop. The launch MSRP was too high for the performance it offered, especially for a DX7 card surrounded by DX8 cards at almost the same price point (including Nvidia's own Ti4200 for just $50 more).


Sorry, I meant the Ti 4400.

And the GeForce4 MX versions were GeForce 2 MX based, IIRC. The GeForce 3 was expensive.


Yes, the MX440 deserves to be on this list. More important than the GeForce2, imo.


That will probably be my next GPU.

I'm on a 3060 currently and the changes in the 4xxx and 5xxx just aren't appealing to me. As soon as iGPUs get 3060 performance I'll probably switch. And they aren't far off.


The MX440 is a nearly 25-year-old GPU; it performed somewhere between a GeForce 2 and a GeForce 3 Ti 200.

It was a good budget option those decades ago.


I'm puzzled by Espressif's naming here. We had the ESP32-S3, so "S31" sounds like "S3, variant 1," but this part doesn't really look like a simple S3 variant. And then there's an ESP32-E22, but no E21 or even a plain E2 anywhere.

Edit: found an article explaining some of their naming logic; it said the SoC naming would get its own follow-up article, but sadly that never happened. https://developer.espressif.com/blog/2025/03/espressif-part-...


It reminds me a bit of the new STM32s (STM32MP2), which are actually 64-bit, but they kept the STM32 name because everyone knows it.


Didn’t Intel also initially try to brand the 64-bit x86 extensions as IA-32e? Seemed like a wasted opportunity to me.

(Disclaimer: I work at Intel but this was way before my tenure.)


It was because IA-64 was a completely different, unrelated architecture that, until AMD succeeded with the K8, was "the plan" both for Intel's 64-bit roadmap and for killing off the compatible vendors (AMD, VIA).


I stopped following the naming logic when Intel went from Pentium 4 to Pentium D.


Got my RPi 5 16GB quite a while ago for around $160 and already thought that was expensive... It’s still powerful enough for almost everything I throw at it, honestly a bit overkill in most scenarios.

With prices steadily going up, for me it's starting to feel more sensible to repurpose the RAM sticks I've collected from old PC builds / laptops and just throw together small amd64 boxes instead of buying more RPis.


I wonder if there are low-power Intel or AMD boards that accept DDR3. So many 2/4/8GB DDR3 sticks inside laptops are going into recycling or landfills that would do perfectly fine for low-power purposes. Hell, performance for standard workloads scales with access time, not bandwidth, and DDR3 sits nicely at CAS 8 at 1600 MT/s and CAS 10 at 2133 MT/s.
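The access-time point is easy to sanity-check: first-word CAS latency in nanoseconds is the CAS cycle count divided by the memory clock (half the transfer rate on DDR). A quick sketch, using the DDR3 timings mentioned above plus a common DDR4 bin for comparison:

```python
def cas_ns(cas_cycles: int, transfer_rate_mt_s: int) -> float:
    """First-word CAS latency in nanoseconds for a DDR module."""
    clock_mhz = transfer_rate_mt_s / 2     # DDR: two transfers per clock
    return cas_cycles / clock_mhz * 1000   # cycles / MHz -> ns

print(cas_ns(8, 1600))    # DDR3-1600 CL8  -> 10.0 ns
print(cas_ns(10, 2133))   # DDR3-2133 CL10 -> ~9.4 ns
print(cas_ns(16, 3200))   # DDR4-3200 CL16 -> 10.0 ns
```

So a good DDR3 bin has essentially the same first-word latency as a typical DDR4-3200 CL16 stick; the newer generations mostly buy bandwidth, not latency.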


The Dell OptiPlex 3040, and some configurations of the Lenovo M700, pair a 6th-gen Intel CPU with DDR3.


It seems this model doesn't have a recording feature. There's an alternative model from Aiwa with Bluetooth and cassette recording support, but I'm not sure it's available globally; I couldn't find much information about it online.

https://www.syl-via.com/products/aiwa-t7-retro-bluetooth-cas...

It's surprising to see these kinds of retro cassette players still being updated in 2026.


That looks huge!


I suspect it's because the point of those things is to be a showcase, so the large size is OK.

If you want a small music player/recorder, there are many SD-card based models, and some of them are absolutely tiny, while providing much better sound quality.


It’s a replica of a 1980s model.


I recently got my hands on an M5Stack NanoC6 (https://docs.m5stack.com/en/core/M5NanoC6), it's also quite small and I'm pretty happy with it. It has onboard IR and a Grove connector, good enough for IoT projects at home.


See also: The Powder Toy (https://powdertoy.co.uk/)


Recently my download speed from GitHub releases has decreased dramatically. But I'm sure they will be fixing that with Claude Code soon... Will they?


On what OS have you noticed this? Very in character for Microsoft to artificially slow non-Windows downloads. Then again, my apt upgrades on Debian have been dog slow lately...


I was mostly on macOS. It seems to me that there's an issue with GitHub's CDN or routing.


That's surely a feature, not a bug.



Thanks! Macroexpanded:

Airfoil - https://news.ycombinator.com/item?id=39526057 - Feb 2024 (296 comments)


I'm kind of the opposite use case: I own four AirTags, keep them in different bags and suitcases, and I've literally never needed them. I don't lose any luggage or bags, so most of the time they just sit there quietly burning through those CR2032s. For me they've ended up feeling more like anxiety insurance than something that actually changes my life day to day...


I diligently put AirTags in all of my luggage, but I forgot to put one in a box I checked on my last flight. That’s the one piece of checked luggage that didn’t show up at the baggage carousel, in my entire life.

I had it delivered to me the next day, but I must have used AirTags for checked luggage 50+ times before.


I don’t lose bags either, but airlines do. The AirTag let me tell United which building in Houston my bag ended up in (after getting lost at SFO), and refute their gaslighting, multiple times, that it was heading my way. Worth its weight in gold, literally.

