Hacker News | citrin_ru's comments

Despite wind energy being in excess in Scotland, AFAIR end users are still paying very high prices due to the marginal pricing used in the UK: the electricity price is set by the most expensive source of energy (even if it is 0.1% of the mix), and most of the time gas is the most expensive source. I think marginal pricing is detrimental, but there is no political will to axe it.

“Marginal pricing” is just how a market economy works.

If there weren’t marginal pricing, nobody in the private industry would build more wind farms or submarine power lines or battery capacity - which are lucrative because they produce peak-time power cheaper than imported gas — and these are the things that will drive power prices down eventually.


It sounds like there’s some sort of rule in the UK where all of the suppliers have to charge the same price per watt (or something), and they’ve named this rule “marginal pricing”? So, it is not entirely the same as market-based pricing.

Whether it is better or not, I have no idea. One could probably see an argument for allowing renewables to price themselves below the sustainable rate for petrochemical based fuels—let them outcompete based on price. Of course that gives them less money to reinvest.

On the other hand, power grids are never entirely market based; the grid needs some dispatchable power for stability’s sake, and it is hard to get consumers to express their tolerance of power outages in terms of how much extra they’ll pay to keep unused plants in reserve…


One solution is to have several markets. Norway also has transmission problems. The land is divided into 5 areas, each with its own price.

What if the datacenter buys bulk energy from a single provider and only uses the grid for excess demand? Can also go the xAI route with massive batteries smoothing out power use.

Is there a rational reason to do that?

The idea behind it is that everyone who supplies energy gets paid the same

E.g. it would be unfair to pay wind farms 10p/kWh and gas turbines 20p/kWh when the electricity they supply is the same and fungible

If there was enough grid storage this wouldn't be an issue, but because there isn't, there are always times where we need gas turbines to top up and those turbines won't turn on for less than it costs them, which is a lot

The upside of this is renewables are very profitable and incentivised


It is nice that it keeps renewables extra profitable, but if they could price down a bit they could just run fossil fuels out of the market entirely… so, it doesn’t seem like a great favor to them.

OTOH treating all units of energy “fairly” ignores the added value of dispatchable generation, so it doesn’t really seem fair at all.

On the gripping hand, if pricing was set by the market, customers could be incentivized to help fix the intermittence problem by making their loads dispatchable, which seems like it would be an all-around win…


> if they could price down a bit they could just run fossil fuels out of the market entirely

What do you propose we do when the intermittent sources don’t provide enough energy and all the other power sources have gone bankrupt?


If that's the case, doesn't it make a huge amount of sense for the utility to tell the silk incinerator selling it 0.001% of its electricity for 40p/kWh, "Bugger off, we'll buy batteries"? Cutting its overall power costs in half for a tiny operational shift.

You don't actually need the 0.1%. There are easy ways to make it up. There AREN'T easy ways to make up 7%, though.


Simplifying wildly: Electricity producers sell their electricity at auction. They all offer a bid (x Wh at price y), the utility accepts bids from lowest to highest until demand is filled, and then everybody gets paid the highest accepted price to fill demand. Wind and solar pretty much always bid their forecasted capacity at $0, because they have no additional costs between producing and getting curtailed.

So the silk incinerator only gets to sell electricity if demand is extremely high and the utility needs to accept even the highest bid.
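The auction described above can be sketched in a few lines. This is a toy model under the thread's own simplifications (single clearing price, no grid constraints); all bid and demand figures are invented for illustration.

```python
# Sketch of a uniform-price ("marginal pricing") auction as described above.
# Bids are (price, volume) pairs; all numbers are made up.

def clearing_price(bids, demand):
    """Accept bids cheapest-first until demand is met; every accepted
    supplier is paid the price of the last (most expensive) accepted bid."""
    accepted = []
    remaining = demand
    for price, volume in sorted(bids):
        if remaining <= 0:
            break
        take = min(volume, remaining)
        accepted.append((price, take))
        remaining -= take
    return accepted[-1][0], accepted  # highest accepted bid sets the price

# Wind and solar bid 0 (no marginal cost); gas and the "silk incinerator" bid high.
bids = [(0, 30), (0, 20), (12, 40), (40, 10)]  # (price, MWh)
price, accepted = clearing_price(bids, demand=60)
print(price)  # gas sets the price for everyone: 12
```

With demand of 95 MWh instead, even the 40p bid gets accepted and becomes the price everyone is paid, which is the price-spike scenario the thread is arguing about.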

Batteries would fix a lot of this, but western nations have extremely long interconnection queues (projects waiting to be allowed to connect to the grid), mostly because of stupid bureaucratic reasons.


The utility will bill the 40p/kWh to its industrial customers (and residential customers on “agile” smart meter tariffs), and the customers can decide whether they need the power even at 40p, or whether they shut down their bitcoin mine/aluminium smelter/EV charger/floodlights for those two hours.

In the longer term, price spikes like this incentivise the building of batteries - which might be marginally profitable most of the time but profit big time (and help big time) in periods of price spikes.
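A rough sketch of the point about batteries: charge at the cheapest hour, discharge at the most expensive one. All prices and capacities below are invented; the round-trip efficiency figure is a plausible assumption, not a real datasheet number.

```python
# Why spiky prices make batteries pay: one buy-low/sell-high cycle per day.
# All numbers are illustrative.

def arbitrage_profit(prices, capacity_kwh, efficiency=0.9):
    """Buy at the cheapest hour, sell at the most expensive one,
    losing (1 - efficiency) of the energy in the round trip."""
    return capacity_kwh * (max(prices) * efficiency - min(prices))

normal_day = [8, 7, 7, 9, 12, 11, 10, 9]  # p/kWh, fairly flat
spike_day = [8, 7, 7, 9, 40, 38, 10, 9]   # a two-hour spike to 40p

print(arbitrage_profit(normal_day, 1000))  # modest on a flat day
print(arbitrage_profit(spike_day, 1000))   # one spike earns ~8x a normal day
```

The flat day yields about 3,800p on a 1 MWh battery; the spike day about 29,000p, which is the "marginally profitable most of the time, big profit in spikes" pattern described above.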


I haven’t yet seen macOS 26, but iOS 26 is ugly and the least usable phone OS I’ve ever used: controls (their placement) and blurry areas waste too much space, animations are distracting, and the “reduce motion” option just replaces them with flickering, which is even worse. I was a happy iPhone user for 5+ years and was considering buying a MacBook (as a replacement for an old ThinkPad), but iOS 26 is such an abomination that I will wait, and if the next version does not remove Liquid Glass I’ll buy an Android phone and a Linux-compatible notebook.

I like the new iOS design, despite having low expectations after seeing nothing but hate online leading up to the release.

It’s non-intrusive and neat to look at. There are some decisions that are annoying, like a hidden button that takes an extra click now, but it’s also something I spend such a small amount of time interacting with that spending any time fretting over it seems like a wasteful overreaction.

That said, I empathize with your frustration, and I have plenty of things that trigger me into passionate “overreaction”, so I totally get it… and things like “more battery consumption” are objectively bad.

Hopefully my experience will be similar with macOS once I try it out though.


> They also don't complain nearly as much about things which are 'established and ugly' like powerlines or coal power plants

I like industrial architecture, and some plants inspire awe, but post-war coal plants are as ugly and boring as it gets. Older ones look much better to my eye, and I’m glad that some buildings are preserved after the stations are shut down.


Above ground power lines are a sin.

It seems to me that incremental improvements are undervalued in the West; instead, a lot of effort is spent chasing revolutions (which don’t happen often), and China has capitalised on this.

It's a false dichotomy: good documentation should have both a comprehensive reference and examples.

> Gathering solar power isn't a problem, but storing it is

I would agree that storage should not be ignored when we talk about cost, but even without storage solar is not useless. Solar + a peaking gas power plant is better than gas alone 24x7.

Many sunny countries still burn coal and gas in the middle of the day, when solar could provide 100% of energy demand (e.g. in Algeria and many other African countries the share of solar is <1%). The dropping cost of PV may help change this.
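The "solar + gas peaker beats gas alone" claim is easy to check with a back-of-the-envelope calculation. All the demand and solar figures below are invented for illustration, not real data for any country.

```python
# Even without storage, midday solar directly displaces gas fuel.
# Hypothetical 24-hour demand profile and a daylight solar "bell curve".

hourly_demand_mwh = [50] * 6 + [80] * 12 + [50] * 6
solar_mwh = [0] * 8 + [0, 20, 40, 60, 60, 60, 40, 20, 0] + [0] * 7

gas_only = sum(hourly_demand_mwh)
# With solar, gas only covers what solar can't (never below zero):
gas_with_solar = sum(max(d - s, 0) for d, s in zip(hourly_demand_mwh, solar_mwh))

print(gas_only, gas_with_solar)  # 1560 vs 1260: ~19% less gas, no batteries
```

Storage would be needed to push the gas share further, but the first chunk of savings requires none at all, which is the point being made.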


> be, before C and C++ took over the zeitgeist of compilation times.

I wouldn’t put them together. C compilation is not the fastest, but it is fast enough not to be a big problem. C++ is a completely different story: not only is it orders of magnitude slower (10x slower is probably not the limit), on some codebases the compiler needs a few GB of RAM per job (so you have to set -j below the number of CPU cores to avoid OOM).
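The -j trade-off mentioned above can be put in numbers: parallelism is capped by whichever runs out first, cores or memory. The per-job RAM figures below are hypothetical round numbers, not measurements.

```python
# Cap make/ninja parallelism so N jobs x peak RAM per compile
# doesn't exceed available memory. All figures are hypothetical.

def safe_jobs(cores, ram_gb, gb_per_job):
    """Parallel jobs limited by both CPU cores and memory; at least 1."""
    return max(1, min(cores, ram_gb // gb_per_job))

print(safe_jobs(cores=16, ram_gb=32, gb_per_job=1))  # light C TU: CPU-bound, 16
print(safe_jobs(cores=16, ram_gb=32, gb_per_job=4))  # heavy C++ TU: RAM-bound, 8
```

This is the calculation people do by hand when picking `make -j`; heavy template-instantiating translation units are exactly what drives gb_per_job up.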


Back in 1999 - 2003, when I was working on a product mixing Tcl and C, the builds took one hour per platform, across Aix, Solaris, HP-UX, Red-Hat Linux, Windows 2000/NT build servers.

C++ builds can be very slow versus plain old C, yes, assuming people make all the mistakes that can be made.

Like overuse of templates, not using binary libraries across modules, not using binary caches for object files (ClearMake style already available back in 2000), not using incremental compilation and incremental linking.

To this day, my toy GTK+/Gtkmm application that I used for an article in The C/C++ Users Journal, and have since ported to Rust, compiles faster in C++ than in Rust in a clean build, exactly because I don't need to start from the world's genesis for all dependencies.


That’s not really an apples to apples comparison, is it?

I dunno why, the missing apple on Rust side is not embracing binary libraries like C and C++ do.

Granted there are ways around it for similar capabilities, however they aren't the default, and defaults matter.


I talked a bit about this at the Rust All Hands back in May.

A lot of Rust packages that people use are set up more like header-only libraries. We're starting to see more large libraries that better fit the model of binary libraries, like Bevy and Gitoxide. I'm laying down a vague direction for something more binary-library-like (calling them opaque dependencies) as part of the `build-std` effort (allowing custom builds of the standard library), as that is special-cased as a binary library today.


You only build the world occasionally, like when cloning or starting a new project, or when upgrading your rustc version. It isn't your development loop.

I do think that dynamic libraries are needed for better plugin support, though.


> You only build the world occasionally

Unless a shared dependency gets updated, RUSTFLAGS changes, a different feature gets activated in a shared dependency, etc.

If Cargo had something like binary packages, they would be opaque to the rest of your project, making them less sensitive to change. It's also hard to share builds between projects because of this sensitivity to differences.


Except for RUSTFLAGS changes (which aren’t triggered by external changes), those only update the affected dependencies.

One hour? How did you develop anything?

That was a clean build for the binary code used in Tcl scripts, every time someone would sync their local development with the latest, or switch code branches.

Plenty of the code was Tcl scripting, and when re-compiling C code, only the affected set of files would be re-compiled; everything else was kept around in object files and binary libraries, and if not affected only required re-linking.


That’s probably why tcl is in there, you use uncompiled scripting to orchestrate the native code, which is the part that takes hours to compile.

I have seen projects where ninja instead of make (both generated by cmake) is able to cleverly invoke the compiler such that the CPU is saturated and RAM isn't exhausted, and make couldn't (it reached OOM).

You mean low water levels? Isn’t that caused by agricultural water use? A dam allows using more water (for agriculture), but one can choose not to use more.

BMC software quality is low, but what's the alternative? Without a BMC it is more expensive to manage a fleet of servers. In a better world, hardware vendors would publish specs to allow open-source BMC firmware, but for some reason they resist this idea. With only insecure BMCs available, a semi-separate management network (connected via a bastion host or a VPN) provides a balance between cost and security.


> BMC software quality is low but what's the alternative?

Dedicated KVM devices?


This won't scale. A dedicated KVM needs you, as an admin, walking to the server, re-switching cables, and walking back to the KVM console. Instead, with out-of-band management hw/sw, you get a dedicated Ethernet port and can access it from anywhere. It is a flexibility advantage at the cost of security.


There are boxes that can KVM to multiple servers at a time. You don't need to switch cables. They probably cost similar to or less than BMC cards on a per-port basis. You might have to combine them with some sort of network boot to set up a machine from scratch.


> They probably cost similar or less than BMC cards on a per-port basis

If you build your own servers that's an option to consider, but most off-the-shelf servers are sold with a BMC (so you pay for it even if you don't want it). Maybe some low-end brands sell servers without a BMC, but if you are looking for relatively reliable hardware you'll likely get a server with one.


I was thinking more like just having one IP KVM per server always hooked up to a dedicated management network, basically used exactly like a BMC just with better software.


Only if people used and disposed of single-use bags responsibly. Plastic bags flying in the wind and littering the environment are a common sight in any country where they are used (the amount varies, but countries free of plastic litter are rare, if they exist at all).

Also, cotton is not the only material for reusable bags (though it is probably among the most durable).


I'm not sure if this is true though? Granted I live in Europe where topics like these get a lot of attention, but I can't recall when I last saw a plastic bag drifting about, both home and abroad.


In most of Europe, single-use bags have been outlawed for quite a long time already.

In my experience, to see how bad this can be, these days you need to go to Asia or Africa. Anecdotally, there is a lot less rubbish in Europe than there used to be.

1. https://www.greenqueen.com.hk/wp-content/uploads/2020/11/pla...
2. https://www.mirror.co.uk/news/world-news/gallery/horrifying-...
3. https://thebalisun.com/balis-most-popular-beaches-brace-for-...


> I can't recall when I last saw a plastic bag drifting about

In the US, this is so common it became a movie plot point:

https://www.youtube.com/watch?v=Qssvnjj5Moo


This. If anything it's all the other stuff that forms the bulk of the litter. Plastic drink containers, fast food packaging, etc. I can't remember the last time I saw a plastic bag. And I live somewhere that thankfully avoided that fad.


Plastic bags as litter were one of the big selling points of San Francisco's single use bag tax.

