Adding to what others have said, several HDD, SSD and RAM manufacturers have said that they don't believe demand will be sustained long enough to warrant extra investment in increasing production.
There is no understanding; the weights are selected based on better fit. Our cells have no understanding of optics just because eyes are encoded in their DNA.
Every day I'm more convinced that the only reasonable future of energy production is distributed solar and storage, with microgrids at roughly the neighbourhood level.
Anything bigger in scale is prone to being shittified to the limit by public entities.
I can tell you've never dealt with an HOA or Strata before. Neighbourhood-level organizations of busybodies are way ahead of everyone else on winning the race to shittification.
Putting them in charge of your electricity won't be all sunshine and rainbows.
I'd rather the county utility manage such a neighborhood level grid; I don't trust my neighbors over the long term to consistently make good decisions regarding such technical matters. But then that's sort of what the major grid operators already do with smaller localized circuits. It's just that it was all built multiple decades ago with centralized unidirectional distribution in mind.
Do you not have a utility that covers one or a few counties where you live? Here it's some sort of strange public-private partnership scheme with private investment, strictly capped profits, and a few publicly elected officials at the top. I've also lived in places where the local government owned and managed the entire grid themselves, including directly employing the workers. I've also lived in places where the operation was entirely privatized (IIRC there was some sort of rate cap and a broad SLA in exchange for being granted the natural monopoly).
Here in New Mexico (USA), all power generation and distribution is privatized but theoretically overseen by a Public Regulation Commission. There are some co-operative generation and/or distribution organizations, but these are still private (and very regional in scope). No actual public utilities at all, though many here would like that.
Sure so in that context by county utility I mean the regional provider. In short if it involves hundreds of kilowatts of power and a timescale measured in multiple decades I want a large stable body consisting of professional specialists to manage it. I don't trust an HOA or other gathering of non-expert locals with potentially high turnover with that sort of infrastructure.
Even the battery installation for a large house borders on questionable. That's not a utility closet your average person should be wandering into under any circumstances; it's easy to run up a massive bill in an instant (and if you aren't lucky, you might simultaneously kill yourself), but at least that's limited to your personal property.
But now we have storage, distributed production, power meters, etc.
It's relatively easy to set up a grid in which several houses produce, store and exchange energy with a simplified free-market pricing system.
Alas, in most countries it's illegal to set up such a network. Energy is one of the big control levers the State has over people.
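For what it's worth, the "simplified free market pricing" part really is the easy bit. Here's a toy double-auction sketch of how a handful of houses could clear energy trades each interval. All of it is illustrative: the function name, the numbers, and the choice to split the surplus at the midpoint of each matched bid/ask are my assumptions, not any real scheme.

```python
# Toy double auction for a neighbourhood micro-grid (illustrative only).
# Households with surplus submit asks, households needing power submit bids,
# each as (price_per_kwh, kwh). Highest bids are matched against lowest asks.

def clear_market(bids, asks):
    """Match bids against asks; return (last_trade_price, total_traded_kwh).

    The price returned is the midpoint of the last matched bid/ask pair,
    i.e. the marginal trade's price. Returns (None, 0.0) if nothing matches.
    """
    bids = sorted(bids, key=lambda b: -b[0])  # most eager buyers first
    asks = sorted(asks, key=lambda a: a[0])   # cheapest sellers first
    traded, price = 0.0, None
    bi = ai = 0
    bid_left = ask_left = 0.0
    while bi < len(bids) and ai < len(asks):
        if bid_left == 0:
            bid_price, bid_left = bids[bi]
        if ask_left == 0:
            ask_price, ask_left = asks[ai]
        if bid_price < ask_price:      # no more mutually beneficial trades
            break
        qty = min(bid_left, ask_left)
        traded += qty
        price = (bid_price + ask_price) / 2   # split the surplus
        bid_left -= qty
        ask_left -= qty
        if bid_left == 0:
            bi += 1
        if ask_left == 0:
            ai += 1
    return price, traded

# Two buyers, two sellers, prices in EUR/kWh:
price, kwh = clear_market(
    bids=[(0.30, 2.0), (0.20, 1.0)],
    asks=[(0.10, 1.5), (0.25, 2.0)],
)
print(f"cleared {kwh} kWh, marginal price {price:.3f} EUR/kWh")
```

The hard part, as the comment says, isn't the matching logic; it's that wheeling power between neighbours is usually not legal in the first place.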
Again, this was done in the 19th century. It is horribly inefficient. The only reason it seems like a good idea today is because power companies are now rentiers instead of service providers.
How is it inefficient? You have production, storage and consumption right next to each other. You just skip the high-voltage lines required to avoid losing a lot of the energy produced hundreds of miles away, along with the transformers and switches that come with them. I would even say that it can be more efficient.
Back in the day, having hundreds of grids that couldn't interconnect and running everything on coal or gas made things hard to scale. But today interconnections can easily be done through a converter, and you don't even have to match demand and production since you have storage; the more producers available, the cheaper and more stable the energy gets.
You also have individuals on the microgrid that actually care about the quality of the microgrid.
For European individuals, yes. For European nations, not in the least. They try to discourage independent consumers and producers of energy with every regulation they can throw at them.
What nations are you talking about? E.g. in Germany, you can buy up to 7kW of panels, screw them onto your roof, wire them up with controller and battery and feed up to 800W into local grid, no one is gonna stop you or anything (only thing you need to do is register online with the grid operator if you have >2kW of panels).
Legislation is, in fact, specifically made so people (i.e. landlord) actually can't easily stop you from doing this.
Not quite, as I understand it. At the low end, a couple of panels, yes. Beyond 960W of panels you will still need an electrician rather than just the Schuko plug.
It's not clear to me why, for example, 2kW of panels which are also limited to 800W output need the special plug.
Hopefully I'm wrong!
"With a standard Schuko plug, a maximum of 960 watts peak is allowed on the DC side, regulated by DIN VDE V 0126-95. With the Wieland connector, a special feed-in socket, the limit increases to 2,000 watts peak. Anyone wanting to operate a system with up to 7,000 watts of module power will need a permanently installed feed-in socket, thus entering a range that is technically possible but also more complex to implement."
The limits on the power you can install, the power you can inject into the grid, and the power you can directly sell to neighbours through a micro-grid (zero, as it's illegal) tell quite the story.
A quick check of OBI (our main home-improvement chain) shows that I can get that 800W "balcony" plug-and-go system for 300-600 EUR depending on the exact panels and inverter I want, and it's all pre-approved. I have to fill in a simple, free form online announcing that I've done so or am planning to, and I'm now technically an energy seller - my local utility pays some for power fed back into the network (not nearly the rate they charge for delivering power, but better than a poke in the eye with a sharp stick). If I don't want to accept 0.08 EUR/kWh, I'm free to plug in a battery for any excess. Our base load when I have my work computer and monitor on during the day is somewhere in the neighborhood of 300W, so I think this would work out well for us. I need to get off my bum and do it.
Shockingly unbureaucratic for Germany.
There is, as one could imagine, somewhat more burden for larger systems that require the involvement of someone who is actually an electrician, but I don't want my neighbors to be able to DIY fire hazards.
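A back-of-the-envelope payback estimate, using the numbers from the comment above (300-600 EUR system cost, 0.08 EUR/kWh feed-in, ~300W daytime base load) plus assumptions I'm adding myself: roughly 800 kWh/year yield from 800W of panels and ~0.35 EUR/kWh retail price. Both assumed figures are ballpark, so treat the result as illustrative only:

```python
# Rough payback math for an 800 W balcony solar system.
# Quoted in the comment: 300-600 EUR cost, 0.08 EUR/kWh feed-in, ~300 W base load.
# ASSUMED by me: annual yield, self-use share, and the retail price.

system_cost_eur   = 450.0   # mid-range of the quoted 300-600 EUR
annual_yield_kwh  = 800.0   # ASSUMED: ~1 kWh per Wp per year is a common
                            # rule of thumb; a shaded balcony will do worse
self_use_fraction = 0.6     # ASSUMED: share consumed directly (300 W base load)
retail_eur_kwh    = 0.35    # ASSUMED: typical German retail electricity price
feedin_eur_kwh    = 0.08    # feed-in rate quoted in the comment

saved  = annual_yield_kwh * self_use_fraction * retail_eur_kwh        # avoided purchases
earned = annual_yield_kwh * (1 - self_use_fraction) * feedin_eur_kwh  # fed into the grid
payback_years = system_cost_eur / (saved + earned)
print(f"~{payback_years:.1f} years to pay back")
```

Even with pessimistic yield assumptions the payback lands within a few years, which is why the asymmetry between the retail and feed-in rates mostly just argues for sizing the system to your base load.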
My advice to anyone even minimally interested in retro games, or just in clear motion in the image, is to get a cheap CRT monitor and play a bit with it. You'll surely appreciate that even against today's monitors they hold their ground very well (not in brightness, though) and easily surpass them in motion clarity.
We did lose quite a lot when we transitioned to LCD screens.
I distinctly remember gaming on CRTs and then LCD screens and it was night and day difference, in favor of LCD. Monitors have only gotten better and I certainly don’t miss CRTs, least of all how hot they were.
I'm curious what the primary causes of that are. Like, I had a similar experience growing up in the '90s. I think it was just the sheer increase in resolution. Text looked so much better, and you could fit more on a screen.
Same here, I very distinctly remember the first time I got to use desktop-class LCD monitors (it was at a new job at the time) and four things stood out:
- The screen size. Going from a 17” or maybe 19” CRT at home to a 19” LCD but without the CRT bezel — the screen looked HUGE.
- The clarity and flatness. The lack of smudging on text, the consistent geometry, being able to see the screen edge right up to the bezel without any wasted space (which you often had on a CRT if you wanted an image without excessive pincushion / bulge).
- The relative lack of ghosting when compared to laptop LCD screens I’d used in the past.
- The colour gamut. Looking back I think those monitors I first saw were relatively wide gamut monitors being used with Windows XP and no colour profiles. The colour saturation was impressive (not accurate, but striking).
I never remember CRTs looking better than any desktop LCD from that point on overall, but I dare say I just didn’t have access to any high-end CRTs at the time.
I also never remember CRTs having true black levels close to OLED, which is another thing I hear people say sometimes. I mean you could get deep blacks, but you’d be sacrificing brightness and white/gray detail at the white end. Again though might have just been the CRTs I knew of at the time.
I went from a 19" CRT capable of 1600x1200@75Hz to a 17" LCD capable of 1280x1024@60Hz, basically because that CRT would've taken up a huge chunk of desk real estate in my dorm.
My first impressions were that the screen was bright as hell, sharp (but I was torn on whether that was good or bad, given the blockiness that it introduced), thin and light (awesome!), and sucked to run at anything but the native resolution. After a few hours, I realized that my eyes weren't getting tired looking at it, and that it was nice not to have the subtle hum around anymore.
The CRT was a decent screen (for 1999), and the LCD was a decent screen (for 2003). Of course, I just got used to the differences, since the LCD was much more practical in my life. I still have it in storage right now.
You forgot one thing: flickering. At 60 Hz, a CRT is murder on the eyes. A few years ago I used a CRT for the first time in like ten years and my eyes hurt almost immediately.
I was never incredibly disturbed by 60Hz though I did notice it.
You reminded me of something I had forgotten though — remember when 100Hz / 120Hz TVs first became a thing? That I noticed!
I think most of my PC CRTs ran at 72Hz / 75Hz IIRC. At least with the monitor I had I remember pushing it to 90Hz but that would add bluriness / lose clarity so I never used it at that rate.
Agreed 100%, CRTs were a wobbly, flickery mess. Especially in the 60Hz era. Everything below 90Hz on a CRT gave me horrible migraines when working longer than 4 hours.
LCDs that were just constantly lit were so much easier on the eyes than a CRT where every bright pixel is flashing at 60Hz.
But one thing is true: a low res game designed to look good on a CRT looks much worse on a low res LCD. CRTs being a blurry mess gave you free 'antialiasing'.
I'm a _bit of a snob_ when it comes to that, both due to my film & TV background and my game collection (jesus, that's a lot of games, including full SNES and N64 sets, Mega Drive, NES, etc). I have various broadcast monitors from PVMs to BVMs as well as some of the finest consumer ones, including B&O etc. I can say that with ultrafast OLEDs (240Hz) we're 95% there now, finally. With high-quality shaders or hardware gadgets it's really nice. For that last 5%, I think things like ultra-high-DPI OLEDs and phosphor-dot-level emulation shaders with black frame insertion will get us there. Until then, good ol' Trinitron is still a superb choice if you want 100%. Another thing, outside of the actual display, is that old console + CRT is an almost zero-lag input-to-screen experience, which I actually think plays a significant role in the overall experience.
Very interesting. I grew up with CRTs and didn't even use an LCD screen until in my 20s. It felt magical. Then LED screens (especially the black of OLED) felt even more magical. I've never considered that CRTs might have been superior for some things.
I do remember playing some NES games on emulators on LED screens and thinking the weather effects and such looked pretty bad compared to the CRT experience I remembered, but hadn't gone much deeper than that. I'll have to try and find a CRT and do some tests
I started out gaming on CRTs in the late 90s. Moved to LCD in the mid-2000s and haven't looked back. I don't miss CRTs, not least the bulkiness of them lol.
For real haha. I remember helping my dad move his old big screen TV out of his house when he replaced it with a "flat screen" and holy hell, it beat the hell out of 4 of us and we only had to take it 100 feet. The bulk was something I'm sure the young will never be able to appreciate :-D
Trump is quoted saying that Iran would surrender or be overthrown well before they would close the strait.
This operation was cobbled together between Trump, Hegseth, Rubio and Vance without consulting anyone outside that circle. The way they have been selling it, especially the strait stuff, smells of unplanned developments all around.
It would depend on how it's distributed. If it's very homogeneous, everything gets totally annihilated. If there are galaxies of matter and galaxies of antimatter, it looks more or less like our universe with a bit more background radiation.
Mass in the universe appears to be (very) roughly uniformly distributed, so even if there are large bodies of antimatter far away in the universe there would have to be a transition boundary somewhere between here and there where the universe goes from being mostly matter to being mostly antimatter. The universe is big and stuff would sometimes cross this boundary and get annihilated, and if this happened it would be the brightest thing in the sky, briefly outshining entire galaxies. We’ve been watching the sky for a while now and have never observed a bright visual event with the spectral signature of a matter/antimatter annihilation, so we assume there is not such a transition boundary, and by extension that the universe is made up of mostly matter out to the edge of the observable universe.
Great explanation. One thing to add: annihilation happens with a very specific energy. Even if it was very far away and redshifted and dim, a "bubble" with a very uniform color (photon energy) would be plainly visible.
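Concretely, electron-positron annihilation at rest produces two photons of exactly the electron rest energy each, which is why that line is such a clean signature. A quick check of the number from the standard constants:

```python
# Electron-positron annihilation at rest: each photon carries E = m_e * c^2.
# Constants from CODATA; c and the eV-to-joule factor are exact by definition.

m_e = 9.1093837015e-31   # electron mass, kg
c   = 2.99792458e8       # speed of light, m/s (exact)
ev  = 1.602176634e-19    # joules per electronvolt (exact)

E_joules = m_e * c**2
E_keV = E_joules / ev / 1e3
print(f"{E_keV:.1f} keV per photon")   # ~511.0 keV
```

Redshift would smear the line's position across the sky for distant sources, but within one annihilation "bubble" at roughly one distance it would still be a narrow, uniform spectral line rather than a broad thermal spectrum, which is what makes it plainly distinguishable.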
It talks about symmetries, but has a nice story about this exact hypothetical scenario. (Someone else already replied why this probably isn't possible in our observable universe, but the episode is cool so I thought I'd share)