Hacker News | Pingk's comments

The article doesn't mention if tips are included in their calculation (I suspect not).

Are Uber/Lyft still cheaper after a 10-15% tip?


Assuming the rides are comparable, the article has a table which includes price/km (weird) of Lyft: $7.99, Uber: $8.36, and Waymo: $11.22. On that data, Waymo is roughly 40% more than Lyft (and about 34% more than Uber), so far more than just a tip.
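As a quick sanity check on those table figures (numbers taken as quoted above; this is just the ratio arithmetic, not independent data):

```python
# Per-km prices quoted from the article's table (USD/km)
prices = {"Lyft": 7.99, "Uber": 8.36, "Waymo": 11.22}

for svc in ("Lyft", "Uber"):
    premium = prices["Waymo"] / prices[svc] - 1
    print(f"Waymo vs {svc}: {premium:.0%} more per km")
# Waymo vs Lyft: 40% more per km
# Waymo vs Uber: 34% more per km
```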


Assuming you didn’t upgrade to a different tier or pay for priority to get your Uber faster, or for a nicer ride.

Uber can also increase the cost of the ride on you with unexpected routes or extra time. Yes, you can complain, but I'm sure plenty of people don't even notice.

The math isn’t wrong, but it’s not so black and white.

I’m in the camp though of “I would pay double not to deal with a human”


In my limited experience, you're not usually tipping a percentage but a flat dollar amount of $2-5 per ride, so $3 on an $8 ride basically removes the price difference between Lyft/Uber and Waymo
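The flat-tip arithmetic works out roughly as described (fares here are the per-km figures quoted upthread, used illustratively):

```python
lyft_fare = 7.99    # per-km figure quoted upthread
waymo_fare = 11.22  # per-km figure quoted upthread
flat_tip = 3.00     # mid-range of the $2-5 flat tips mentioned above

print(lyft_fare + flat_tip, "vs", waymo_fare)  # 10.99 vs 11.22
```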



Important caveat to that study from right at the end of the article:

> the data set used for the study, while massive, was limited to 2017 data. [...] Uber only added a tipping function to its app in 2017

So the study was either before you could even tip in the app or soon after and when it was still new.

A more recent study would be interesting.


My thoughts exactly. I usually tip well - too well if I’m drinking and that’s usually when I’m taking an Uber.


it's funny, but tipping is one of the things many people will pay more to avoid.


About the same.


Given that no models are profitable for the parent company afaik, it's only a matter of time before the money-squeezing begins


Consulting companies destroy valuable institutional knowledge and make the government less effective, so I'm onboard in theory.

The flipside of removing them is you now need to hire experts/specialists to do the work properly, not fire them...

https://youtu.be/ycVBoWsGLJs


> The flipside of removing them is you now need to hire experts/specialists to do the work properly, not fire them...

Except that, under Musk, currently employed experts/specialists who can't adequately justify their purpose are fired. The hole left behind is then filled by external consultancies.


> Consulting companies destroy valuable institutional knowledge and make the government less effective

Can you explain?

Because the entire idea is that they bring knowledge to the institution that hires them, and propose/implement processes to make government more effective.

If a department needs full-time experts in an area for decades, they generally hire them. But very often, a department needs to make a complicated decision in an area that is not its core competency, and after the decision is made and implemented it doesn't need those experts anymore. Their job is done. That's where consulting shines.


Institutional knowledge stays in the institution and builds up even after people have moved on from those roles. Consulting companies bring operational knowledge to help with a specific situation. Sometimes you may need the institutional knowledge (if it aligns with the mission of the organisation) and sometimes it is better to concentrate on your mission and use the consultancy's knowledge to help you. Blanket statements are problematic, particularly for something as complex as the government as a whole.


That is the idea, and that is not what happens. Management consulting companies like this are the literal analog of pharmaceutical companies not wanting to cure anything, because then you lose business. If you spend more than a few days working with them, you realize that they promote people who find and create long-term revenue streams, not anyone who actually solves or helps anything.

The actual people who do work are generally clueless recent grads on their first job, are trained to produce just enough to pass what is being sold, and they leave (or move up to selling) when they understand the game they have been playing.


See old.reddit.com/r/consulting for a ground-level view into the consulting machine


The only plausible defense of hiring them, as a practice, lies in the fact that they are big entities and there's implied trust. In my view, reality has shown time and time again that big companies are a liability to individuals and other large structures.

Some fun reading

"Rental car agency Hertz filed a $32 million lawsuit in April against consulting giant Accenture because it “failed to deliver the website and apps for which it was so generously paid.”" https://madeintandem.com/blog/massive-hertz-accenture-lawsui...

"Deloitte treated Marin County as little more than a "trial-and-error public sector training ground" for its inexperienced consultants, the lawsuit claimed." https://www.reuters.com/article/business/deloitte-hit-with-3...

"And in May 2022, New York City stopped using McKinsey’s system for classifying detainees. In the end, the city spent $27.5 million on McKinsey’s services, with precious little to show for it. McKinsey, on the other hand, collected its money and moved right along." https://www.thenation.com/article/society/mckinsey-whistlebl...

For those bored, wikipedia pages are a good starting point. I would be surprised if there were many, out of the top 100 consulting companies, that didn't have a controversies (or similar) section on their wiki page.

https://en.wikipedia.org/wiki/McKinsey_%26_Company#Controver...

https://en.wikipedia.org/wiki/Deloitte#Litigation_and_regula...

https://en.wikipedia.org/wiki/Accenture#Controversies


That's the shiny picture. The reality is that they let you outsource risk and competence, turning the business into an empty shell of its former valuable self.


That strikes me as expecting to find real followers of Jesus in American evangelical churches. Or maybe expecting the pro peloton at Soul Cycle.


> Because the entire idea is that they bring knowledge to the institution that hires them, and proposes/implements processes to make government more effective.

No. That's the idea they sell.

What they bring is a cohort of interchangeable recent graduates, with flashy powerpoint presentations, fast talk, and "multidisciplinary skills". Which you can charitably translate to "no relevant knowledge".


LLMs killed this grift.

They showed how easy it is to churn out consultant-level reports.

The bar will be raised for consultants now, and only those who can actually bring unique contextually relevant and sound insights will survive. Such consultants exist, but are rare.


Institutions can become caught in a trap of paying high prices for consultants and low/no prices for internal experts. Obviously, consultants are incentivized to cultivate this codependency. In many cases, the institution would be better served by building expert capabilities in the domains they care about rather than using consultants.


You can buy knowledge in all sorts of forms, from books to consultation hours, but the most efficient vehicle for applying this knowledge to a task is still a human being. A consulting company might not even guarantee that the same person will be assigned to you for the whole duration of the project.


It's cheaper and more profitable to make things out of plastic rather than something more durable/sustainable, and companies lobby against strong environmental regulations so they don't need to care about improving.


This is the reason I chose to go with AMD's 7000 series for my 2022 build.

I wasn't aware of Intel's limitation when I built my first computer in 2016 so when I wanted to upgrade a few years later I wasn't expecting to need a new motherboard since it still had everything I needed - it felt so wasteful!

Instead I just waited for AM5 based on the longevity of AM4. I'm really hoping AMD supports AM5 for a few more generations so I can do the same as you in 2026/7


To make 1 gram of antimatter, from E=mc^2, would take about 90 Terajoules. For reference, the atomic bomb that dropped on Hiroshima released about 60 Terajoules of energy.

So you would need at least (and with the efficiency loss of production, much more than) 1.5 Little Boy atomic bombs worth of energy to make a single gram of antimatter.
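The arithmetic checks out; a quick sketch using E = mc² (the ~60 TJ Hiroshima figure is the one quoted above):

```python
# Energy equivalent of 1 gram of mass via E = m * c^2
m = 1e-3            # 1 gram, in kg
c = 299_792_458.0   # speed of light, m/s
E = m * c**2        # joules

print(E / 1e12)     # ~89.9 terajoules
print(E / 60e12)    # ~1.5 Little Boys (taking ~60 TJ per bomb, as above)
```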


Or vice versa you can store the energy of 1.5 little boy atomic bombs into a handy gram of antimatter ...


The Sun outputs the energy equivalent of over 4 million tons of matter every second. The same as the energy of over a billion hydrogen bombs every second.

Only a tiny amount reaches the Earth, and we use only a tiny amount of that. But if we could capture even a small percentage of the total energy of the Sun we could produce antimatter by the ton.
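A rough check of that figure, assuming the commonly quoted ~4.26 million tonnes of mass converted per second:

```python
# Mass-energy output of the Sun vs. its measured luminosity (~3.8e26 W)
c = 299_792_458.0               # speed of light, m/s
mass_per_s = 4.26e6 * 1000      # ~4.26 million tonnes/s, in kg/s
power = mass_per_s * c**2       # watts

print(power)                    # ~3.8e26 W, matching the Sun's luminosity
```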


I always liked the Stargate Universe (?) series, where the ships skimmed the stars they passed to charge up. We’ve essentially got a bunch of EV chargers dotted around the universe, although galaxy to galaxy might have some range anxiety.


I'm not worried about the label, but I am worried about the implication - since software made the jump from physical to digital/OTA distribution, there's been a decline in software release quality because "we'll patch it later".

The historical financial punishments for writing buggy software are gone, and now that it's infecting cars, I'm concerned that safety standards will begin to slip and potentially injure someone.

Side note: I know a common response to buggy software is that the market won't pay for it because it takes longer to develop, etc. But writing robust software is a hard skill, and if you haven't seen an industry write robust software for a long time, why should you trust that it still can?


Tech isn't siloed for no reason.

In the UK government, before programming was considered a high-value skill, the vast majority of programmers were women. So much so that programming was measured in girl hours (which were paid less than man hours).

When it became clear that programming was going to be a big deal, women were systematically excluded, flipping the gender balance (although they had trouble hiring initially because men saw it as lesser work).


It flipped because the roles programmer (largely women) and analyst (mostly men) became programmer-analyst. The role women were dominating was collapsed into the one men already dominated.

At the exact same time (at least in the US), which was the 1980s, law and medicine (as in doctors, not nurses) rapidly shot toward near-parity of participation by men and women, while both being high-pay and much higher-prestige than anything to do with computers—now, still, but especially then. That the profession becoming higher-paying and a “big deal” was the cause of this shift doesn’t make much sense, given what else was going on at the same time.

[edit] to be clear, I’m not denying the existence of a gap, or making claims about whether it should be addressed—in fact, I think understanding the cause is vital if we do want to address it.


The whole 9000 series has been disappointing, in terms of price/performance you're better off getting something from 7000.

It seems like 9000 (and the newly announced Intel 200 series) involve a lot of restructuring work and lay the groundwork for future generations to push further


It is disappointing only for gamers.

For scientific and technical computing, a 9950X provides the greatest improvement in performance per dollar since 2019, when the first 16-core desktop CPU, the 3950X, was introduced by AMD.

This is caused by the doubling of the AVX-512 throughput per core in desktop and server Zen 5.

The new Intel Arrow Lake S desktop CPU, 285K, also provides a 50% increase in AVX throughput over Alder Lake/Raptor Lake, but this remains small in comparison with the throughput doubling in the 9950X, which still leaves the 9950X with roughly 4/3 of the 285K's throughput (actually even more in practice, due to the richer AVX-512 instruction set).
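The 4/3 figure falls straight out of those two ratios (illustrative arithmetic only, not benchmark data):

```python
# Relative per-core AVX throughput, normalized to the previous generation
zen4 = 1.0
zen5 = 2.0 * zen4               # Zen 5 doubles AVX-512 throughput per core
raptor_lake = 1.0
arrow_lake = 1.5 * raptor_lake  # 285K: +50% AVX throughput over Raptor Lake

print(zen5 / arrow_lake)        # ~1.33, i.e. the 4/3 advantage of the 9950X
```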

For games and for applications that are not dominated by array operations, for instance for software project compilation, the performance of AVX or AVX-512 code does not matter much, but there are a lot of professional applications where 9950X will provide a great jump in performance.

For things like software project compilation, it is likely that the Intel 285K will provide the best performance per dollar, due to its 24 cores. Unlike the older Intel E-cores, the new Skymont cores have a multithreaded performance similar to that of the old Zen 4 cores or the P-cores of Alder Lake/Raptor Lake, except in applications where there is high contention between threads for their shared L2 cache.

So this time one can truly consider the Intel 285K a 24-core CPU (without SMT) from the point of view of applications dominated by integer/pointer computations, whereas in Alder Lake and Raptor Lake the E-cores were better viewed as half cores, due to their lower performance, so the top models of Raptor Lake were better thought of as equivalent to a 16C/32T CPU, not the "24-core" CPU advertised.

The 9950X has become more important than past desktop CPUs because the prices of server CPUs have greatly increased over the last decade, so now, for most small businesses or individuals, it is no longer cost-effective to use a "real" server CPU. Instead, it is much more economical to use servers with a 9950X (and ECC memory). Multiple servers with 9950X are much cheaper than a single server of similar capacity with an Epyc CPU.


I believe also that the 9000x3D series (from my memory of rumours) also has the 3D cache on both CCXs, meaning no latency with cross-CCX communication.


I think that has the potential to make it worse, honestly. It's not like the contents of the cache are duplicated. Instead it's split across a CCX boundary, and if the data is in the wrong cache you'll take a latency hit. Clever thread management can help avoid this, but so far the 9xxx series has shown terrible thread-affinity choices with many existing games and apps. I'll wait and see how the 3D cache helps here.


AMD claims that the 9000X3D series is the product that will provide the game performance increase expected by gamers for a new product generation.

Of course, that remains to be seen, but it is plausible.


Long term as games start using AVX512 I expect the 9000 series will be seen as a big step up against previous generations. One of those "fine wine" things.


That doesn't matter now. Gamers will vote with their dollar, and they are fickle.


Maybe, but it was already public knowledge:

> In September 2023 the government regulator, the Office of Rail and Road (ORR), had issued an improvement notice to Network Rail about overcrowding at the station, warning: “You have failed to implement, so far as reasonably practicable, effective measures to prevent risks to health and safety of passengers (and other persons at the station) during passenger surges and overcrowding events at London Euston Station.”

It's concerning to me that Hendy was the chair of Network Rail from 2015 before becoming Transport Minister, and here he is sacking someone over a comment about his former workplace. There should definitely be an investigation into his motives/incentives, IMO.


"public" via a set of documents hidden deep on an official webpage is very different to "public" as a news headline.


Here's the news headline from the time for you:

https://news.sky.com/story/network-rail-failing-to-stop-unac...


But it is presumably the "Correct internal channels"?


Irrelevant.


I wonder if there is still overcrowding at that station, or if it really was fixed in 2023.

A bit of a Streisand effect going on here.


Well, when I was in Euston rail station a few weeks ago, it was very overcrowded. It seemed worse in the day than at night. It seems the minister is missing the necessity of acting with integrity and transparency, a lesson ministers frequently need reminding of. Surely there must be a better person the PM could find for the job, one who doesn't feel the need to write harassing letters bullying train companies into firing staff?

