
If it’s true, do we get cheaper quantum dots and better atomic clocks?


Do we then need satellite internet for mobile broadband video for doctors and paramedics if information sharing by nonlocal photonic communication is real; despite the false limit and "loopholes"?

Would this simple experiment and less destructive photonic observation show the nonlocal communication described in the OT article?

"Name of this Q/QC experiment given a light polarization-entanglement complementary relation" (2025) https://quantumcomputing.stackexchange.com/questions/44435/n... :

> Given the ability to infer photonic phase from intensity, isn't it possible to determine whether destructive measurement causes state change in entangled photons? Is there a name for this experiment; and would it test this?

FWIU call blocking is not possible without centralized routing; so we wouldn't even all want quantum phones that don't need towers or satellites that may be affecting the jet stream and thereby the heat.


> Do we then need satellite internet for mobile broadband video for doctors and paramedics if information sharing by nonlocal photonic communication is real; despite the false limit and "loopholes"?

Yes we still need satellite internet. The doctors and paramedics can generate some random numbers and the hospital can generate some random numbers, and once they meet again they can look at them and see a strange correlation.

But if the hospital wants to tell something to the doctors and paramedics, or vice versa, they must use a classical communication channel.


Will the bandwidth/throughput limits of entanglement-based communication systems continue to preclude their use for anything but lower bitrate applications like key distribution?


I don't want to be quoted in 1000 years like the guy who didn't believe in quantum communication, ... but my guess is that it will not provide high bandwidth/throughput.


The Double Bind surfaces in tech/security hierarchies where the CTO manages the Head of Security, and is officially accountable for delivering on growth opportunities as well as managing security risks.

While there are great CTOs out there that are conscientious and thoughtful about this double-bind, most aren’t.

It’s good to have open discussions about upside opportunity versus downside risk and generally that happens best when your boss’ bonus doesn’t primarily depend on them maximising upside.


Is there any better way you could set this up? Just asking for a friend.


Get the downside-risk people in tech to report to somebody who is accountable for managing downside risk at the same level as the CTO.

Typically an intelligent and tech literate CFO or Chief Risk Officer.

If the Head of Security and the CTO can’t come to a deal, it reaches the ExCo or board for a decision.

I call this “creative tension” and it works better than the alternative.


Sounds reasonable enough - thank you!


This is a terrifying report, and if there is any justice in this world, the perpetrators of this crime against humanity will be pursued and prosecuted by international tribunals.

With reports like these, there is no excuse that “we didn’t know”.


They won't be able to read the records, due to physical bit rot in whatever public archives make it through the narrow path to the future, and of course the absence of most paywalled material, which won't be in any public archive that substantially survives. They'll get more from acid-free paper archives under mountains than from anything digital, I expect.


Tight feedback loops, and where hookups are the goal then it's a real dopamine cycle. I'm guessing we'll be dealing with the impact of this for decades before we as a society manage to deal with it properly.


Given the way the world feels now, I’d really love to return to a kindergarten where every day is growth, discovery and fun without existential dread.

My vote is for the Culture, where the hedonism is balanced by good deeds - such as a hand discreetly extended to those crawling out of the mud, looking at the stars, which the Culture leaves as close to untouched and untrammelled as it can.


The advice about logging and metrics was good.

I had been nodding away about state and push/pull, but this section grabbed my attention, since I’ve never seen it so clearly articulated before.


Yes. Everyone should spend the small amount of time it takes to get some logging/metrics going. It's like tests: getting from 0 to 1 test is psychologically hard in an org, but going from 1 to 1000 becomes "how did I live without this". Grafana has a decent free tier, or you can self-host.


The logging part is spot on. So many times I've thought, "Oh, I wish I had logged this," and then faced an issue or even an incident and introduced those logs anyway.


It is a balance. Too many logs cost money and slow down log searches, both for the search itself and for the human seeing 100 things on the same trace.


The trick here is to log aggressively and then filter aggressively. Logs only get costly if you keep them endlessly. Receiving them isn't that expensive. And keeping them for a short while won't break the bank either. But having logs pile up by the tens of GB every day gets costly pretty quickly. Having aggressive filtering means you don't have that problem. And when you need the logs, temporarily changing the filters is a lot easier than adding a lot of ad hoc logging back into the system and deploying that.

Same with metrics. Mostly they don't matter. But when they do, it's nice if it's there.

Basically, logging is the easy and cheap part of observability, it's the ability to filter and search that makes it useful. A lot of systems get that wrong.
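As a sketch of the "log aggressively, then filter aggressively" idea, using Python's stdlib `logging` for illustration (the names `LevelGate` and `gate` are hypothetical, and a real setup would likely filter at the shipping/ingest layer rather than in-process):

```python
import logging

class LevelGate(logging.Filter):
    """Drop records below a runtime-adjustable threshold, so you can
    widen logging during an incident instead of redeploying new log calls."""
    def __init__(self, level=logging.WARNING):
        super().__init__()
        self.level = level

    def filter(self, record):
        # Keep only records at or above the current threshold.
        return record.levelno >= self.level

logger = logging.getLogger("app")
logger.setLevel(logging.DEBUG)          # emit everything in code...
handler = logging.StreamHandler()
gate = LevelGate(logging.WARNING)       # ...but ship only WARNING and above
handler.addFilter(gate)
logger.addHandler(handler)

logger.debug("cheap detail, filtered out by default")
logger.warning("kept: something worth shipping")

# When you need the logs, temporarily change the filter, not the code:
gate.level = logging.DEBUG
```

The point is that the debug calls stay in the codebase permanently; only the filter threshold moves.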


Nice. I'm going to read up more about filtering.


Yeah, absolutely. But the author's idea of logging all major business logic decisions (that users might question later) sounds reasonable.


Yes. I like the idea of assertions too. Log when an assertion fails. Then get notified to investigate.
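One minimal way to wire that up (the `check` helper is hypothetical; the "get notified" part is assumed to be a log-based alert on the error level):

```python
import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger("invariants")

def check(condition, message):
    """Soft assertion: instead of crashing, log the violated invariant
    so a log-based alert can flag it for investigation."""
    if not condition:
        logger.error("assertion failed: %s", message)
    return condition

# e.g. after computing an order total:
total = -5  # imagine a business-logic bug produced this
check(total >= 0, f"order total should be non-negative, got {total}")
```

Unlike a bare `assert`, this keeps serving the request while leaving an alertable trail.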


The Overton Window isn’t just about political topics, it’s about the governance tools available for use.


As the project management saying goes: 99% done is much closer to 0% done than 100% done


I was looking for a citation and instead found a number of vague or biased papers.

Surprisingly, there’s an archived DoJ page that says the same remedies may have been much cheaper to achieve through other means.

https://www.justice.gov/archives/atr/att-divestiture-was-it-...

YMMV. I don’t agree with the narrowness of this analysis, and would like to see some links to academic studies in economics and the study of innovation tbh.

