Hacker News | RhysU's comments

> It really ought to be possible to structure the utility contracts such that a new data center lowers every one else’s rates instead of raising them.

That would imply that increased demand strictly decreases prices, no?

Given fixed supply, no dice.

To expand supply, one would use incrementally more expensive mechanisms to generate the incremental supply. (Because, why wouldn't you already be generating via the cheapest supply?) Either all existing customers would pay the same and the new customer would pay the higher rate OR smear the incremental costs across everyone. Prices might hold steady under the former choice but they would not decrease.
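A toy merit-order model makes this concrete. All capacities and costs below are made-up illustration numbers, not real grid data; the point is only that once cheap sources are exhausted, incremental demand is served by more expensive generation, so the demand-weighted average cost can only rise:

```python
# Toy merit-order dispatch: cheapest sources run first, so incremental
# demand is served by incrementally more expensive generation.
# Capacities (GW) and costs (c/kWh) are invented for illustration.
SUPPLY_STACK = [(5.0, 4.0), (3.0, 7.0), (2.0, 12.0)]  # (capacity_gw, cost_c_per_kwh)

def average_cost(demand_gw: float) -> float:
    """Demand-weighted average generation cost (c/kWh) to serve demand_gw."""
    served, total_cost = 0.0, 0.0
    for capacity, cost in SUPPLY_STACK:
        take = min(capacity, demand_gw - served)
        total_cost += take * cost
        served += take
        if served >= demand_gw:
            break
    return total_cost / demand_gw

before = average_cost(6.0)  # existing load only
after = average_cost(7.0)   # a 1 GW datacenter arrives; more of the 7c tier runs
print(before, after)        # the average strictly rises with the new load
```

With these pretend numbers, 6 GW averages 4.5 c/kWh while 7 GW averages about 4.86 c/kWh: holding the supply stack fixed, the new load raises the average cost for everyone unless the newcomer alone absorbs the marginal tier.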

Is the idea that the new customer would unlock some better generation capabilities through capital investment? Something not already incentivized by the distributed grid?

Or is the idea that one should soak the new customer to subsidize the existing ones? Maybe rejigger some pricing tiers to push more of the existing customers into lower tiers while charging the new customer more. My guess is you're proposing this last option. I otherwise can't see how to square your suggestion with supply vs demand.


I can think of a couple of utility models that could work and one that definitely doesn’t.

First, the bad. Build a facility that consumes 1GW at existing rates in a market with slow growth like the US. The supply sources are roughly fixed, so the grid will need to run more expensive sources. Prices go up.

Now the good. Choice A: a growing market, like China's (and like the US's arguably should have been). Lots of demand is coming online all the time (not just datacenters), and everyone plans for this. Power plants of various sorts get built, and there is so much construction that costs can be quite low. Oh well, we can wish.

Choice B: suppose there’s a market with roughly constant demand and enough cheap supply to go around (maybe a good hydro resource, or some solar and wind, and/or cheap natural gas). Residents have cheap power, which is a good thing. But the hydro doesn’t magically get bigger just because someone builds a 1GW data center. Some careful market design is needed, but that datacenter’s grid connection could be contingent on the operator actually sourcing 1GW of new generation and paying the marginal cost of its demand, with appropriate corrections if the times when that generation produces don’t line up with the demand.

As possible pretend numbers, suppose that existing prices, all-in for the customer, are 12 cents/kWh. 5c of that is distribution, and we’ll ignore it. So the data center operator sources 1GW of average supply at 10c/kWh and tries to connect itself fairly directly, so their transmission is cheap. They are allowed to buy from and sell to everyone else, but they are paid only 2c/kWh when they sell to the grid (which the grid and residents like!), and they pay 15c/kWh when their demand exceeds whatever capacity they supplied and they need to buy. And, if the numbers were picked right, the grid makes a small profit selling peak power to the datacenter while still selling at peak times to residents at 12c/kWh.
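As a rough sketch, the hourly settlement under those pretend rates might look like the following. The function and all numbers are hypothetical illustrations of the scheme described above, not any real tariff:

```python
# Toy settlement for the hypothetical rates above: the datacenter sells
# surplus to the grid at 2c/kWh and buys shortfall at 15c/kWh.
# All rates are the pretend figures from the comment, not a real tariff.
SELL_RATE = 0.02  # $/kWh paid to the datacenter for surplus it sells
BUY_RATE = 0.15   # $/kWh the datacenter pays when its sourced supply falls short

def hourly_settlement(demand_mwh: float, sourced_mwh: float) -> float:
    """Net cash flow ($) from the datacenter to the grid for one hour.

    Positive: the datacenter pays the grid for shortfall power.
    Negative: the grid pays the datacenter for surplus generation.
    """
    shortfall_mwh = demand_mwh - sourced_mwh
    if shortfall_mwh > 0:
        # Generation didn't cover demand: buy the gap at the penalty rate.
        return shortfall_mwh * 1000 * BUY_RATE
    # Surplus hour: the grid buys it back cheaply (negative = grid pays).
    return shortfall_mwh * 1000 * SELL_RATE

# 1 GWh demand, 0.8 GWh sourced: buys 200 MWh at 15c, roughly $30,000.
print(hourly_settlement(1000, 800))
# 1 GWh demand, 1.2 GWh sourced: sells 200 MWh at 2c, grid pays roughly $4,000.
print(hourly_settlement(1000, 1200))
```

The asymmetry between the 2c sell rate and the 15c buy rate is what gives the operator a strong incentive to source generation that actually matches its demand profile.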

Would this work? I don’t know, but I think it could be done in a way that makes residential and ordinary commercial rates go down as a result of someone building a giant new load and also paying for the new capacity to supply it.


> we have failed to broadly adopt any new compiled programming languages for HPC

The article neglects that all of C, C++, and Fortran have evolved over the last 30 years.

Also, you'll find significant advances in the HPC library ecosystem over the same period. Consider, for example, Trilinos (https://trilinos.github.io/index.html) or Dakota (https://dakota.sandia.gov/about-dakota/), both of which push a ton of domain-agnostic capabilities into a C++ library instead of bolting them into a bespoke language. Communities of users tend to coalesce around shared libraries, not around creating new languages.


The evolution of C, C++, and Fortran is touched on in a sidebar, although admittedly very briefly:

> Champions of Fortran, C++, MPI, or other entries on this list could argue that…


Thank you for pointing out the sidebar. I missed this because I did not understand that footnotes/sidebars were presented as links.

The authors are aware, as the Chapel compiler makes use of LLVM.

The author's framing "we have failed" suggests otherwise.

This section, https://chapel-lang.org/blog/posts/30years/#ok-then-why, does not mention libraries at all.


Not really, you should actually read that section a few times as well.

> A fact of life in HPC is that the community has many large, long-lived codes written in languages like Fortran, C, and C++ that remain important. Such codes keep those languages at the forefront of peoples’ minds and sometimes lead to the belief that we can’t adopt new languages.

> In large part because of the previous point, our programming notations tend to take a bottom-up approach. “What does this new hardware do, and how can we expose it to the programmer from C/C++?” The result is the mash-up of notations that we have today, like C++, MPI, OpenMP, and CUDA. While they allow us to program our systems, and are sufficient for doing so, they also leave a lot to be desired as compared to providing higher-level approaches that abstract away the specifics of the target hardware.

Nothing there suggests the languages don't improve; anyone who follows ISO knows where many of the improvements to Fortran, C, and C++ are coming from.

For example, C++26 is probably going to get a BLAS-based linear algebra library (std::linalg) into the standard library, and senders/receivers is being sponsored by CUDA money.

Another thing you missed from the author's background is that Chapel is sponsored by HPE and Intel, and one of its main targets is HPE Cray EX/XC systems; they know pretty well what is happening.


The fact that the author is a developer of Chapel pretty neatly explains why "no new language was adopted" is counted as failure; the article itself makes little effort to argue for that value judgment.

Author here: I didn't go into more detail on this than https://chapel-lang.org/blog/posts/30years/#maybe-hpc-doesnt... because I felt like the article was long enough already and that I'd recently covered that topic in detail in this series https://chapel-lang.org/blog/series/10-myths-about-scalable-... summarized here https://chapel-lang.org/blog/posts/10myths-part8/#summary

In the "maybe we don't need it" you open up with this:

> Another explanation might be that HPC doesn’t really need new languages; that Fortran, C, and C++ are somehow optimal choices for HPC. But this is hard to take very seriously given some of the languages’ demerits

It's honestly hard to think of a less specific claim than "some of [their] demerits"; this is clearly preaching-to-the-choir territory. Later, hints of substance appear, but the text is merely reminding the reader of something they are expected to already know.

Moving on, the summary for the "ten myths" series starts with:

> I wrote a series of eight blog posts entitled “Myths About Scalable Parallel Programming Languages” [...] In it, I described discouraging attitudes that our team encountered when talking about developing Chapel, and then gave my personal rebuttals to them.

So it appears to be a text about the trouble of trying to break through with a new "HPC" language, and the reader is again expected to already know the (potentially very good) technical reasons for why one would want to create a new one.


Good point on my alluding to demerits of Fortran, C, and C++ without stating them, and thanks for clarifying your criticism. Using the four factors that I focused on as attractive features in new languages:

Productivity: For me, while Fortran has some nice features for HPC (multidimensional arrays), lots about its design feels very old-fashioned to my (not particularly young) eyes. C and C++ are more "my generation" of programming language, so are familiar and comfortable, yet they still seem verbose, convoluted, and less readable (more symbolically oriented) as compared to Python, Julia, or Swift, which are more what I'm looking for in terms of productivity these days. Of the three, C++ has clearly made the biggest strides in recent years to improve productivity, with some successes in my opinion, though I've also had a hard time keeping up with all the changes.

Safety: I consider C and C++ to be fairly unsafe languages compared to more modern alternatives. I don't have enough experience with Fortran to have a particularly informed opinion, but feel as though I've been aware of patterns in the past that have felt unsafe. Here again, I think using modern C++ in a certain style (e.g., smart pointers) probably makes nice strides w.r.t. safety, but I'd still consider there to be a gap between it and Python/Rust (as does my colleague in this post: https://chapel-lang.org/blog/posts/memory-safety/)

Portability: Modulo the degree to which various compilers keep up with the latest standards in Fortran and C++, I'd consider all three languages to be quite portable.

Performance: There's no question that these are high-performing languages in the sequential computing setting. In HPC, while Fortran or C++ and MPI are often considered the gold standard, it's a standard that can be beat if your language maps more natively to the network's capabilities, or knows how to optimize for distributed memory computing rather than relying on the programmer to do it themselves.

With respect to the "10 myths" series, while the focus of the series was about combatting prevalent negative attitudes about new languages in the HPC community, I think there's a lot of content along the way that rationalizes the value of creating new languages in my rebuttals. That said, I fully realize that it's a long read, particularly in its updated "Redux" form.

Thanks again for clarifying your previous point.


> I didn’t put Chapel on my list of broadly adopted HPC programming notations above, in large part to avoid being presumptuous. But it’s also because, regrettably, I don’t consider Chapel’s support within the community to be as solid as the others on my list

> Social Security is almost completely disbursement charges but those disbursements aren't means tested so even quite wealthy individuals receive them.

Only to those who paid into the system and far less than they personally could have earned on investing the same dollars.


Yeah, one of the problems I have with taxes is that if I pay $100 into taxes I don’t get $100 of value back. Everyone should get at least as much as they put in back. Also, some other people should get more back. But we shouldn’t spend more than we make as a government.

Many people do get more back than they put in. High earners do not:

> Most American workers receive significantly more from Social Security over their lifetimes than they contribute through payroll taxes.

https://legalclarity.org/is-social-security-worth-it-contrib...


> Everyone should get at least as much as they put in back

I don't even understand the thought process here. Taxes are not being used for productive investments. Some spending is growth, but probably not half.

I could see expecting the median citizen to be flat over their lifetime as a goal.


Assuming the presence of a sovereign wealth fund, which does work somewhat logically for programs like Social Security, that would make sense. The government should sensibly invest money it's holding onto. However, it's unrealistic to expect the government to tolerate a level of risk, and thus a rate of return, above what you're personally comfortable with, so it's unrealistic to assume that the government will be as efficient with your money as you'd personally be if it had never taken that money.

Additionally, a lot of these programs will pay out beyond what you've personally put in - programs like Medicaid are nearly entirely social subsidies to ease poverty and financial distress, so I'm not certain where you'd find the money to pay for them if not looking at either other people's taxes or debt.

As a taxpayer I expect the money I give to the government to be evident in some social projects but I don't personally expect that for each dollar I pay that I'd see a dollar in benefit to me personally. I have a belief that I indirectly benefit from the expenditure of charitable safety net programs even if I never expect to collect from them directly - the improvement in the lives of those around me is to my personal benefit by making society more just and egalitarian as well as reducing the incentive for crime which is a difficult to measure but observable direct benefit to myself.

The fact that so much of our budget goes to debt servicing is probably my biggest personal objection, as it is effectively just a wealth extraction from our national budget to some select individuals.


In the US agricultural subsidies for 2024 were overwhelmingly for corn ($3.2B), soybeans ($1.9B), cotton ($998M), and wheat ($960M). Pasture comes in 5th ($741M).

https://usafacts.org/articles/federal-farm-subsidies-what-da...

Tofu and ethanol may be more price-distorted by the US government than is beef, but I dunno how to quickly support that idea with hard data beyond what I cited above.


Depending on how we measure it, either 58% or 75% of that heavily subsidized soy goes to feed animals.

https://insideanimalag.org/share-of-soybean-crop-for-feed/


Have you been to the Midwest to observe the scale of corn and soybean operations? I would wager the calories per subsidy dollar produced by the corn and soybean industries handily outweigh the calories per subsidy dollar produced by cattle operations, especially given the roughly 90% energy loss per trophic level.

Also, how much does beef benefit from cheap feed prices (corn and soy) due to subsidies as well?


Beef prices are high right now.

https://fred.stlouisfed.org/series/APU0000703112

If the intent of the government is to pour subsidies into domestic beef production to stabilize prices they're doing a crap job.

Compare corn: https://www.macrotrends.net/2532/corn-prices-historical-char...




I thought most of the corn goes to ethanol

A little over a third of production evidently goes to ethanol: https://afdc.energy.gov/data/10339

No idea if subsidies disproportionately subsidize fuel ethanol over non-fuel usage.


> Most roles let you settle into one way of thinking. Engineers think in systems and constraints. Designers think in experiences and affordances. Salespeople think in objections and value props...

I don't think often, but when I do, it's in stereotypes. Stay dismissive my friends.


The default page load hides plots for all models with negative returns? That's sketch.


COBOL also came to mind.

The COBOL thing seems to be working out just fine last I heard. Today a small number of people get paid well to know COBOL's depths and legacy platforms/software. The world moved on, where possible, to lower cost labor and tools.

Arguably, that outcome was the right kind of creative destruction. Market economics doesn't incentivize any other outcome in the long term. We'll see the arc of COBOL play out again with LLM coding.


I know it's just anecdotal, but I looked for COBOL salaries a couple of years ago, curious about this "paid well".

The salaries were ok but not good for COBOL.

Here's an anecdotal Reddit thread about it. https://www.reddit.com/r/developpeurs/comments/1ixfpsx/le_sa...


I've been waiting for the article talking about how AI is affecting COBOL. Preferably with quotes from actual COBOL programmers since I can already theorize as well as the next guy but I'm interested in the reports from the field.

While LLMs have become pretty good at generating code, I think some of their other capabilities are still undersold and poorly understood; one of them is that they are very good at porting. AI may finally offer a way out for porting COBOL.

You definitely can't just blindly point it at one code base and tell it to convert to another. The LLMs do "blur" the code, I find, just sort of deciding that maybe this little clause wasn't important and dropping it. (Though in some cases I've encountered this, I sometimes understand where it is coming from; when the old code was twisty and full of indirection, I often as a human have a hard time being sure what is and is not used just by reading the code too...)

But the process is still way, way faster than the old days of typing the new code in one line at a time while staring at the old code. It's definitely way cheaper to port a code base into a new language in 2026 than it was in 2020. In 2020 it was so expensive it was almost always not even an option. I think a lot of people have not caught up with the cost reductions in such porting actions, and are not correctly calculating them into their costs.

It is easier than ever to get out of a language that has some fundamental issue that is hard to overcome (performance, general lack of capability like COBOL) and into something more modern that doesn't have that flaw.


> We estimate that these laws prevented fatalities of 57 children in car crashes in 2017 but reduced total births by 8,000 that year and have decreased the total by 145,000 since 1980.

145K is roughly the population of Syracuse, NY or Midland, TX. That is far more than the absolute number of US military deaths in World War I (116,516 per https://en.wikipedia.org/wiki/United_States_military_casualt...).


Wasn't WW1 basically finished when U.S. troops reached the front line?


A friend once remarked that it'd take particular dedication to do so using a Miata.


> Operable while tripping

> We are often faced with 45-button remote controls, and multi-level menus of options to navigate. This can be ok in everyday life, but when you transport to the spiritual dimension of the Grateful Dead show, or when you are tripping, menus and buttons are the last thing you can deal with.

> The Time Machine needs to be controlled with knobs, a few buttons, and intuition. And if it doesn’t do exactly what you want, maybe it will do something that you need anyway.

