I have a 2017 Bolt as my only car, and the slow L3 charging is definitely a downside, but I haven't found it to be a huge issue in practice. On a trip long enough to worry about fast-charging you're going to need to stop to eat periodically anyway, so if you plan your charging around meals you don't end up waiting too long. It obviously gets a bit more annoying on trips long enough to require more than one fast charge per day, but I don't take trips that long frequently.
Day-to-day charging is generally all going to be L2 or even L1, depending on how far you drive and how long you're typically parked somewhere with a plug. That will be roughly the same speed in any car. Some cars do have higher-capacity L2 chargers than the Bolt does, but most public L2 stations don't provide the higher current needed to see the difference.
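To put some rough numbers on why L1/L2 is usually enough day to day, here's a back-of-the-envelope sketch. The 7.2 kW figure matches the 2017 Bolt's onboard AC charger; the commute distance and efficiency are assumed round numbers, not measurements:

```python
# Back-of-the-envelope overnight charging math. The commute distance
# and miles-per-kWh efficiency are assumed round numbers.
L1_KW = 1.4          # typical 120 V household outlet
L2_KW = 7.2          # 2017 Bolt's max onboard AC charge rate
COMMUTE_MILES = 40   # assumed daily driving
MI_PER_KWH = 4.0     # assumed efficiency

energy_needed = COMMUTE_MILES / MI_PER_KWH   # 10 kWh to replace
hours_l1 = energy_needed / L1_KW             # ~7.1 h on L1
hours_l2 = energy_needed / L2_KW             # ~1.4 h on L2
print(f"L1: {hours_l1:.1f} h, L2: {hours_l2:.1f} h")
```

Either way the car is full by morning for a commute like this, which is why the onboard charger's capacity rarely matters in daily use.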
This I think is the key point that most non-EV drivers don't recognize. Especially if you own a home, an EV is really fantastic. You simply plug it in whenever you are home, and 99% of the time you spend zero time waiting for a charge. The slow trickle charge is both cheaper and more convenient because you aren't making trips to a gas station every week or two.
Rather than planning your charging around meals, you're more likely going to have to plan your meals around charging. I don't see a lot of restaurants with level 3 chargers in the parking lot.
If you compare it to the commuter rail systems in those places, BART feels impressive (though less so with the service cuts). I was a regular rider on the Metro North New Haven line and had experience with SEPTA and NJT commuter rail, and I was really impressed with BART when I moved out here. Peak frequency was pretty good (at least on the Red line I primarily used), and when things were on time, they were very on time ("on-time" Metro North trains were always at least a few minutes late in my experience).
If you compare it to the NYC subway, it's obviously not impressive at all (though the tech is less dated). BART isn't exactly a commuter rail system or a subway, but I think it's closer to the former than the latter.
As another commenter pointed out, you can do pre-emptive multitasking just fine without an MMU. And as it turns out AmigaOS had just that. All you need for pre-emptive multitasking is a suitable interrupt source to use for task switching.
What it did not have was memory protection or virtual memory. You do need an MMU for those.
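The core of that interrupt-driven approach fits in a few lines. This is a hypothetical round-robin sketch, not AmigaOS's actual scheduler (Exec is priority-based), with `timer_tick` standing in for the routine a periodic interrupt would invoke; register save/restore is elided:

```python
# Hypothetical round-robin scheduler core: preemptive multitasking
# needs only a periodic interrupt that calls timer_tick(); no MMU.
from collections import deque

class Scheduler:
    def __init__(self):
        self.ready = deque()   # tasks waiting to run
        self.current = None    # task currently on the CPU

    def spawn(self, task):
        self.ready.append(task)

    def timer_tick(self):
        # What the timer interrupt handler would do: preempt the
        # running task (put it back on the ready queue) and dispatch
        # the next one. Context save/restore is elided here.
        if self.current is not None:
            self.ready.append(self.current)
        if self.ready:
            self.current = self.ready.popleft()
        return self.current

s = Scheduler()
for name in ("A", "B", "C"):
    s.spawn(name)
print([s.timer_tick() for _ in range(4)])  # ['A', 'B', 'C', 'A']
```

The tasks never have to yield voluntarily; the interrupt forcibly rotates them, which is the whole of "preemptive." Protecting tasks from scribbling on each other's memory is the separate problem that needs the MMU.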
Went to Drexel for CS, but dropped out in my sophomore year back in 2004. Did PHP webdev in my home state of CT until 2011. Moved to the SF Bay Area and transitioned to doing Erlang and C++ for some F2P games for a while. I'm currently a Staff Engineer at Discord focused on AV and other "native" stuff.
So looking back at the Falcon 9, there were only 4 failures to complete orbital objectives across 503 launches, and one of those was only a partial failure (the main payload was delivered successfully, but the secondary payload was not, due to a single engine failure). These failures were not consecutive (the 4th, 19th, what would have been the 29th, and the 354th). Now, apart from the first launch or two (COTS Demo Flight 1 had some useful payload, but still seemed pretty disposable), these all had real payloads, so they were less experimental than these Starship test flights.
If we compare to the propulsive landing campaign for the Falcon 9 first stage, it's a bit more favorable. The first 8 attempts had 4 failures, 3 controlled splashdowns (no landing planned), and 1 success. I think in general it felt like they were making progress on all of these, though. Similarly, for the Falcon 1 launches they had 3 consecutive failures before their first success, but launch 2 did much better than launch 1. Launch 3 was a bit of a setback, but had a novel failure mode (residual first-stage thrust resulted in a collision after stage separation).
Starship Block 2 has had 4 consecutive failures that seem to be, on some level, about keeping the propellant where it's supposed to be, with the first 2 failures happening in roughly the same part of the flight and this 4th one happening during pre-launch testing.
This is a small point, but calling the 33-byte unit a sector in CDDA is a bit misleading and probably incorrect for the quantity being labeled. This is a channel data frame and contains 24 bytes of audio data, 1 byte of subcode data (except for the channel data frames that carry sync symbols instead), with the rest being error correction. This is the smallest grouping of data in CDDA, but it's not really an individually addressable unit.
98 of these channel data frames make up a timecode frame, which represents 1/75th of a second of audio and has 2352 audio data bytes and 96 subcode bytes (2 frames have sync codes instead), with the remainder being sync and error correction. Timecode frames are addressable (via the timecodes embedded in the subcode data) and are the unit referred to in the TOC. This is probably what's being called a sector here. Notably, a CD-ROM sector corresponds 1:1 with a timecode frame.
Note: Red Book actually just confusingly calls both of these things frames and does not use the terms "channel data frame" or "timecode frame".
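The arithmetic above is easy to verify; here's a quick sketch (the constant names are mine, not Red Book terminology):

```python
# Checking the CDDA frame math described above; names are informal.
AUDIO_BYTES_PER_CHANNEL_FRAME = 24
CHANNEL_FRAMES_PER_TIMECODE_FRAME = 98
SYNC_PATTERN_FRAMES = 2          # carry sync instead of a subcode byte
TIMECODE_FRAMES_PER_SECOND = 75

audio = AUDIO_BYTES_PER_CHANNEL_FRAME * CHANNEL_FRAMES_PER_TIMECODE_FRAME
subcode = CHANNEL_FRAMES_PER_TIMECODE_FRAME - SYNC_PATTERN_FRAMES
print(audio)    # 2352 audio bytes per timecode frame
print(subcode)  # 96 subcode bytes per timecode frame

# 75 timecode frames per second matches the raw CDDA stream rate:
# 44100 samples/s * 2 channels * 2 bytes/sample = 176400 bytes/s.
print(audio * TIMECODE_FRAMES_PER_SECOND)  # 176400
```

The 2352-byte figure is also why raw CD-ROM sector dumps come out at 2352 bytes per sector before the mode 1 header/ECC overhead is subtracted.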
> The Xbox 360 doubled down on this while the PS3 tried to do clever things with an innovative architecture.
I don't think this is really an accurate description of the 360 hardware. The CPU was much more conventional than the PS3's, but still custom (derived from the PPE in the Cell, but with an extended version of the VMX extension). The GPU was the first to use a unified shader architecture. Unified memory was also fairly novel in the context of a high-performance 3D game machine. The use of eDRAM for the framebuffer is not novel (the Gamecube's Flipper GPU had this previously), but it also wasn't something you generally saw in off-the-shelf designs. Meanwhile, the PS3 had an actual off-the-shelf GPU.
These days all the consoles have unified shaders and memory, but I think that just speaks to the success of what the 360 pioneered.
Since then, consoles have gotten a lot closer to commodity hardware, of course. They're custom parts (well, except the original Switch I guess), but the changes from the off-the-shelf stuff are a lot smaller.