Hacker News | past | comments | ask | show | jobs | submit | some-guy's comments

This is how I use my Canon t3i. Once in a while everything will align perfectly and require very little editing, and I feel a huge sense of accomplishment.

With practice and patience, that perfect alignment will come under your control :) Just get out there and shoot more!

I'm at a large enterprise outfit, and "shoving things in your face" has been a problem with large software suites for a long time, long before the AI craze. I keep telling my skip-level leadership that we need more user-experience "mob goons" who have authority across product domains to (metaphorically) beat the living daylights out of bad "PM-brained" ideas.

My work 64GB M1 Max Macbook Pro is consistently out of memory. (To be fair, my $LARGE_ENTERPRISE_EMPLOYER reserves about half of it for very bad Big Brother daemons and applications I have no control over.)


I have a 128GB M3 Max from my employer. Due to an IT oversight, I was able to use it for a few months without the corporate "security" crapware. I never even noticed this machine had a fan before the "security theatre" corporate rootkits were installed.


> My work 64GB M1 Max Macbook Pro is consistently out of memory

What are you doing that needs that much memory?


I lived a block away from a hydrogen fuel station in Oakland, and in the ten years I was there I maybe saw two different Mirais use it.


I have only purchased Toyota vehicles (currently in the market for an EV) and it baffles me that Dodge created a Charger in EV form and Toyota hasn’t made even an EV Corolla or Camry.


> it baffles me that Dodge created a Charger in EV form and Toyota hasn’t made even an EV Corolla or Camry

Dodge's Charger EV has been a sales flop [1] and pretty much universally panned by critics as something that nobody asked for.

The Camry and Corolla were the best-selling sedan and compact sedan of 2025 [2]. I think this shows that Toyota is listening to what Corolla and Camry drivers want - something inexpensive and reliable to get them to and from work every day without issue.

Some day Toyota will make an EV sedan. I think their 2026 bZ Woodland [3] shows that they are starting to figure out how to make compelling EVs. And Toyota's EV strategy seems pretty reasonable to me overall - their delays in developing a decent EV don't seem to put them under threat from any legacy automaker. They are being threatened by Chinese EV makers, but so is Tesla - so even a huge head start likely wouldn't have benefited Toyota much in that regard.

[1] https://www.roadandtrack.com/news/a69927938/dodge-charger-da...

[2] https://www.caranddriver.com/news/g64457986/bestselling-cars...

[3] https://arstechnica.com/cars/2026/02/looks-a-lot-like-an-ele...


An electric Corolla or Camry is my ultimate car. I hate driving.

I want an appliance that just works. The Corolla and Camry were this for petrol.

I love my Leaf but it isn't a Corolla.

What’s with the turning circle on the Leaf?


That's essentially the bZ3. But a Corolla branded BEV will eventually happen:

https://electrek.co/2025/10/13/toyotas-best-selling-car-elec...


> Yes, while I use Fedora on my laptop, I also know Fedora is generally not a good option for a server.

Why is Fedora not considered good for a server?


It's a cutting-edge distro with 6-month release and 13-month support cycles.

Whereas Debian/Ubuntu have 5 years and RHEL/Alma/Rocky have 10 years.
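As a rough sketch, the support windows above can be turned into a back-of-the-envelope EOL calculator. The month counts are approximations taken from the figures in this thread, not official lifecycle data:

```python
from datetime import date

# Approximate support windows in months, per the figures above.
# These are rough community norms, not official commitments.
SUPPORT_MONTHS = {
    "fedora": 13,   # two 6-month cycles plus ~1 month of grace
    "debian": 60,   # ~5 years
    "rhel": 120,    # ~10 years
}

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months, clamped to day 1."""
    total = d.month - 1 + months
    return date(d.year + total // 12, total % 12 + 1, 1)

def approx_eol(distro: str, released: date) -> date:
    """Estimate a release's end-of-life from its release date."""
    return add_months(released, SUPPORT_MONTHS[distro])

# Fedora 40 shipped in April 2024; its window ends around May 2025.
print(approx_eol("fedora", date(2024, 4, 23)))  # -> 2025-05-01
```

The point of the sketch is the contrast: the same server installed in April 2024 would need a Fedora upgrade within about a year, versus roughly a decade on RHEL.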


I don't feel like this really answers the question though, right? At least not at face value.

I could see maintenance burden being a potential point, meaning that one would be "pushed" to update the system between releases more often than on something else.


Typically you want stability and predictability in a server. A platform that has a long support lifecycle is often more attractive than one with a short lifecycle.

If you can stay on v12.x for 10 years versus having to upgrade yearly to maintain support, that's ideal. 12.x should always behave the same way with your app, whereas every major version upgrade may have breaking changes.

Servers don’t need to change, typically. They’re not chasing those quick updates that we expect on desktops.


Yeah, and that's the take I expected to hear based on what was said.

However, for something like ARM and the use case this particular device may have, in reality you would _want_ (in my opinion) to be on a more rolling-release distro to pick up the updates that make your system perform better.

I'd take a similar stance for devices that are built in a homelab for running LLMs.


Depends on what you're building an ARM system for. There are proper ARM servers out there; server work isn't the exclusive domain of x86, after all.

For homelabs, that's out the window. Do whatever you want/fits your needs best. This isn't the place where you'd likely find highly available networks, clustered or highly available services, UPS with battery banks, et al.


I take it as no more than someone's personal opinion, since there is no reference provided whatsoever.


It's more maintenance due to its frequent release cycles, but it's perfectly good as a server OS. I've used it many times, friends use it.

You can't afford to miss a release cycle, because their package repos drop old releases very quickly and you'd be left stranded.

A friend recently converted his Fedora servers to RHEL10 because he has kids now and just doesn't have the time for the release cycle. So RHEL, or Debian, Alma, Rocky, offer a lot more stability and a lower maintenance burden for people who have a life.


I'd also love to hear what folks have to say about this.

For myself I've had nothing but positive experiences running Fedora on my servers.


I think it's highly circumstantial. For example, my personal servers run a lot of FreeBSD and even though I could stay on major releases for a rather long time, I usually upgrade almost as soon as new releases are available.

For servers at work, I tried running Fedora. The idea was that it would be easier to have small, frequent updates rather than large, infrequent updates. It didn't work. App developers never had enough time to port their stuff to new releases of underpinning software, so we frequently had servers on unsupported OS versions. We gave up and switched to Rocky Linux. We're in the process of upgrading the Rocky8-based stuff to Rocky9. Rocky9 was released in 2022.


I generally agree with you. As a recent father with a toddler, in a household where both parents work full time, I've found that the only way I can make time for those personal side projects is to use AI to do most of the bootstrapping, and then do the final tweaks on my own. Most of this is around home automation, managing my Linux ISO server, among other things. But it certainly would be more fun and rewarding if I did it all myself.


This feels like the same moment for me as when I realized I couldn't keep using Gentoo and needed to move on to a Linux distribution that was ready to go without lots of manual effort. I have a family and kids; I need those hours. I had the same feeling as OP of losing a fun learning activity: no longer progressing in Linux knowledge, just maintaining. Granted, my knowledge was at a good enough level to move on, but it's still a loss.

I do the same as you with AI now; it's allowing me to build simple things quickly and revise later. Sometimes I never have to. I feel similarly that I'm no longer progressing as a dev, just maintaining what I know. That might change: I might adapt how I approach work and find the balance, but for now it's a new activity entirely.

I've talked to many people over the years who saw coding as a get-shit-done activity. Stop when it's good enough. They never really approached it as a hobby and a learning experience; it wasn't about self-progression to them. Mentioning that I read computer books resulted in a disgusted face: "You can just google what you need when you need it."

It always felt odd to me; software development was my hobby, something I loved, not just a job. Now I think they will thrive in this world. It's pure results. No need to know a breadth of things or what's out there to start on the right foot. AI has it all somewhere in its matrix. Hopefully they develop enough taste to figure out good from bad when it's something that matters.


I’m at $LARGE_ENTERPRISE_SAAS and I agree. There is a mass psychosis going on around what these LLM tools (which I use daily) are capable of *at scale*. The number of business processes and tasks these software suites can, and must, perform at near-100% correctness every time is massive, across an insane number of domains, accounting for an insane number of laws, countries, languages, browser configurations, business requests, legal teams. The list goes on, and while you can bootstrap a front end that appears to do 80% of what a large dinosaur competitor like ours does, the reality is it can’t, and the context windows needed to get there are orders of magnitude larger than they are today.

The weird part is that people at our company also fail to see this. “This vibe coder is going to recreate 20+ years of code, use cases, business processes and integrations for thousands of companies across hundreds of domains!” is uttered every day and just simply isn’t true.


It's certainly not true yet, but comparing LLM abilities now vs. two years ago really makes you think. It may not be easy to replicate all you do, but new entrants could easily just go after the highest-margin parts of your business. Why try to tackle all the countries and browser configs when you can get 90% of the profitable ones and just address that part of the market?

i.e. Apple does a ton of work to ensure I'm paying taxes and complying with laws in hundreds of places I'll probably never make a sale in. Sure, some bigger sellers might need all of that, but I'd be happy with just the USA. I only use the other parts because it was a few clicks.


Most companies only need a subset of the features that these mega-platforms offer, as they operate within a single industry, targeting specific customers, often in a single country with a simpler legal landscape.

I can't know for sure, but odds are 80% of the revenue of these current SaaS providers is generated by 20% of the features they offer. Lightweight newcomers can just focus on that 20% and ignore the other 80%.


I haven't read much into it, but The Simpsons to me became terrible around the same time Family Guy came out. I don't know if trying to be like Family Guy made the show worse, or if the type of humor The Simpsons championed for so long became unfashionable.

A lot of people say Family Guy copied The Simpsons, but I actually found that The Simpsons tried to copy Family Guy's style of humor and did a very terrible job at it.


There's probably a term of art for it, but that Family Guy-style cutaway to some previous reference hadn't really been done before, certainly not to the extent that Seth MacFarlane did it. The Simpsons copied it, but it never felt good when they did.

If you watch The Simpsons DVD commentary on the very first season DVDs, they talk about how Matt Groening's team would draw the key frames and then ship them to an Asian animation studio to provide the in-between animation frames. The very first time they did this, they got an animation style that was all over the place: not just the quality of the drawing, but the actual animation style was jello-y and way more wobbly than they wanted. You can see it on YouTube here: https://www.youtube.com/watch?v=sx-wjF5AMmk

The main reason that they sent it back was that the style and physics represented in the cartoon wasn't the one they were going for, and it changed how the show felt. I feel like the rapid cut references they adopted from Family Guy did a very similar thing. It changed the flow of the show, which, maybe (?) is actually more of a sign of the times and attention span than animated show style, but still, I wasn't a fan and I didn't feel like The Simpsons did it naturally or that it fit, and it takes me out of the narrative every time they do it.


There seems to have been a big turnover of writers in '98-'99; very few people who were there at the beginning were left by that point. You're not considering that the original charm of the show was lost with the original creative force.

Family Guy debuted in 1999, so it's hard to say The Simpsons tried to copy Family Guy's style. Family Guy is really known for its cutaways (usually to some non sequitur) and somewhat crude humor. A lot of the jokes at the time hinged on Stewie not being understood by anyone but Brian, Brian himself being a dog. There were also a lot of references to musical theater. The Simpsons was different from this.


The Zombie Simpsons explanation paints it (fairly convincingly) as a combo of deaths, turnover in the writing room, and the influence of The Simpsons and other “subversive” shows affecting the mainstream so much that it was no longer distinctive, with the result that the show became a heightened, silly, absurd, but basically straight, and (more, at least) earnest, version of the family sitcom it had started off lampooning.

https://deadhomersociety.wordpress.com/zombiesimpsons/

Under this explanation, the early show is basically a totally different thing from what it became by somewhere around season 10. Even if it didn’t “get worse” (I think it definitely did also do that, but it’s not necessary for this explanation to work) it became something so different that it’s not surprising that a lot of people who liked the early show, don’t like what it has been since the change.


I remember reading years ago that in the early days the executive producer had a two year tenure, then from season 8 or 9 it's been the same guy with no change.


Looking back and playing my Dreamcast again, I also believe the lack of dual shoulder buttons (only one L and one R) hurt, because some games simply couldn't be easily played with the standard controller (and not everyone was buying the keyboard and mouse).


You make it sound like dual shoulder buttons were standard, but the PS(2) was the odd one out for having them. None of the N64, Xbox, or GameCube had them. It wasn't really until the Xbox 360 that you'd see them outside of Sony.


Now that you mention it, maybe I'm thinking more of the dual analog sticks than the shoulders. But yeah, the GameCube at least had a Z trigger in addition to the shoulders.

