That's not default behavior in most shells (e.g., `autocd` is off by default in Zsh; and, for the record, that's also not the default up-arrow behavior in Bash or Zsh [it is in Fish]).
But my question is specifically about relative vs. absolute paths when recalling directory traversal from history. I'm still struggling to follow how you'd use Zsh history as a zoxide replacement without always using absolute paths.
The official download option doesn't download it to your filesystem as a file. It just lets you watch the video offline in the official app/website. Just tested it now.
Meaning the video file exists somewhere in your filesystem, so downloading faster than you could watch it is functionality the app already has.
The big question is why not build the turbines offshore?
The article briefly mentions this, and notes that offshore blades are more than twice the length of the blades this airplane is designed for, but it doesn't examine the economics of either option at all.
Offshore isn't a problem: they build a factory on the coast and put the blades on a boat.
Onshore is the problem: much of the world where people live isn't close to a sea. Iowa has more than 6,000 turbines despite being hundreds of miles from the nearest sea (most aren't even close to the Mississippi River).
This page was very slow to load for me, probably partly because it's being hugged by HN. But it would help a lot if the images had the `loading="lazy"` attribute, and if they were compressed to roughly 100 KiB each.
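For anyone unfamiliar, that attribute is just markup on the `<img>` tag; a minimal fragment (filename and dimensions here are made up):

```html
<!-- loading="lazy" defers the fetch until the image nears the viewport;
     explicit width/height avoid layout shift while it loads -->
<img src="screenshot-01.jpg" loading="lazy" width="800" height="450" alt="Screenshot">
```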
Hope they have an unlimited bandwidth plan. I bailed out at about #20, which is unfortunate because it's otherwise a nice list. I'm going to assume "51. Get a Free Kia" isn't part of it.
I'm not sure what Netlify is doing, but the heaviest assets on your website are your javascript sources. Have you considered hosting those on GitHub pages, which has a generous free tier?
The images are from steamcdn-a.akamaihd.net, which I assume is already hosted by a third party (Steam).
In part because this particular proof of work is absolutely trivial at scale: commercial hardware can do 390 TH/s, while a typical phone can only do on the order of a million hashes per second while keeping acceptable latency.
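To make that asymmetry concrete, here's a minimal hash-based proof-of-work sketch in Python (the function names, challenge bytes, and nonce encoding are made up for illustration; real schemes differ in detail):

```python
import hashlib
import itertools

def solve_pow(challenge: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce so sha256(challenge || nonce) has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_pow(challenge: bytes, difficulty_bits: int, nonce: int) -> bool:
    """Verification is a single hash, regardless of how hard solving was."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# Each extra bit doubles the expected work. A difficulty a phone solves in
# about a second (~2^20 hashes at ~1 MH/s) costs 390 TH/s hardware almost nothing.
```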
I could be mistaken, but I think Python cares about making sure strings don't include any surrogate code points that can't be represented in UTF-16 -- even if you're encoding/decoding the string using some other encoding. (Possibly it still lets you construct such a string in memory, though? So there might be a philosophical dispute there.)
Like, the basic code points -> bytes in memory logic that underlies UTF-32, or UTF-8 for that matter, is perfectly capable of representing [U+D83D U+DE00] as a sequence distinct from [U+1F600]. But UTF-16 can't because the first sequence is a surrogate pair. So if your language applies the restriction that strings can't contain surrogate code points, it's basically emulating the UTF-16 worldview on top of whatever encoding it uses internally. The set of strings it supports is the same as the set of strings a language that does use well-formed UTF-16 supports, for the purposes of deciding what's allowed to be represented in a wire protocol.
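A quick CPython check supports that parenthetical: you can construct a string containing lone surrogates in memory, but the standard codecs all refuse to serialize it.

```python
# A lone surrogate is constructible as an in-memory str...
s = "\ud83d\ude00"          # two surrogate code points
emoji = "\U0001F600"        # the single code point they'd pair into in UTF-16
assert s != emoji           # Python treats them as distinct strings
assert len(s) == 2 and len(emoji) == 1

# ...but it can't leave the process through a standard codec.
for codec in ("utf-8", "utf-16", "utf-32"):
    try:
        s.encode(codec)
        raise AssertionError("surrogates should not be encodable")
    except UnicodeEncodeError:
        pass

assert emoji.encode("utf-8") == b"\xf0\x9f\x98\x80"
```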
You're somewhat mistaken, in that "UTF-32, or UTF-8 for that matter, is perfectly capable of representing [U+D83D U+DE00] as a sequence distinct from [U+1F600]." You're right that the encodings are technically capable of this at the raw level, but it is actually forbidden in Unicode: surrogate code points are not Unicode scalar values, and the UTF encodings are only defined over scalar values.
Using those code points makes for invalid Unicode, not just invalid UTF-16. Rust, which uses UTF-8 for its String type, also forbids unpaired surrogates: `let illegal: char = 0xDEADu32.try_into().unwrap();` panics.
It's not that these languages emulate the UTF-16 worldview, it's that UTF-16 has infected and shaped all of Unicode. No code points are allowed that can't be unambiguously represented in UTF-16.
The Unicode Consortium has indeed published documents recommending that people adopt the UTF-16 worldview when working with strings, but it is not always a good idea to follow their recommendations.
I imagine the chargers you have are not drawing 3kW each though.
That's the main problem - your legacy infrastructure is most likely wired for 220V@32amps for the whole garage/street just to run the lamps from it, so 7.2kW. That's one EV charger, or two if you want to split them into 3.6kW feeds. If you want to run a proper 7.2kW charger from every lamp post or next to every parking space, that's a lot of brand new cabling that you need to add.
1.6kW is the limit, so no, they aren't. But you don't need 7.2kW at every post all the time! There's no way every single car needs to charge at every moment; I know this from walking through parking garages and seeing some cars not move for days at a time.
An EVSE could easily serve multiple spots and fairly (or unfairly, for profit!) distribute power between cars from a limited supply.
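A sketch of what "fairly distribute from a limited supply" could look like, as equal-share water-filling (the function name and all numbers are made up for illustration):

```python
def share_power(total_kw: float, demands_kw: list[float]) -> list[float]:
    """Split a limited feed across cars: equal shares, with any headroom
    a satisfied car leaves behind redistributed to the rest (water-filling)."""
    alloc = [0.0] * len(demands_kw)
    remaining = total_kw
    active = [i for i, d in enumerate(demands_kw) if d > 0]
    while active and remaining > 1e-9:
        fair = remaining / len(active)
        still_hungry = []
        for i in active:
            grant = min(fair, demands_kw[i] - alloc[i])
            alloc[i] += grant
            remaining -= grant
            if demands_kw[i] - alloc[i] > 1e-9:
                still_hungry.append(i)
        active = still_hungry
    return alloc
```

With three cars each wanting 3.6kW on a 7.2kW feed, each gets 2.4kW; a car that only wants 1kW frees the rest of its share for the others.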
Please note, the context here is level 1 charging; 7.2kW is level 2.
Level 1 charging adds only 3 to 7 miles of range per hour, so an average of about 35 miles over a 7-hour day (assuming you drive somewhere for your lunch break). Where I am, the average distance to work is around 27 miles one way, so a 54-mile round trip against ~35 miles gained: a net loss of charge.
No, it's 22 miles to/from work, one way [1]! My commute distance is only a few miles more, and my commute time is almost exactly the average time listed there.
For most people, the point of living farther from work is a lower total cost of living, especially near a big city [2], where living close usually isn't an option. I save thousands a month by commuting a little, for the same number of bedrooms (which has a legal minimum where I am). For the same square footage, I'd be saving > $10k/month compared to living 1/3 the distance from work.