Hacker News | post-it's comments

It has a launch escape system, unlike the shuttle.

You're asking some pretty niche copyright questions that even a lawyer would have to spend time searching for case law for. It may be more expedient to look for that case law yourself.

If you need to be an attorney to figure out if you're allowed to take a picture of something, we've already jumped the shark.

Not what he asked.

Yeah, they run fine.

This is a few years old, but at one point Apple was happy to bypass VPN or firewall settings to allow their own apps to communicate[1]. I don't know if this is still true on Tahoe, but I wouldn't be surprised if at least the mechanism still exists. So "they run fine", but they may not do what you expect them to do when it comes to Apple's products/services.

[1] https://www.macworld.com/article/675671/apples-own-programs-...


Not really, just an unintuitive security feature. You still need the user's permission to access that folder, but that permission is then persistent. I consider it a UX bug for sure but not an exploit.

I agree, it's a UI/UX problem. It would seem that using the open-file dialog should also request access, but I'm guessing that was deemed too intrusive, and the user action is treated as implicit authorization. Security is one of those things that should always be explicit, though.

If having to run an arcane terminal command to disable access, while the GUI acts as if access was never granted, counts as an "unintuitive security feature" to you, I can't even.

I appreciate the curation you do, dang. I often notice a headline get updated and the result is always a significant improvement.

They are clearly familiar with meat-based animals:

> “That’s ridiculous. How can meat make a machine? You’re asking me to believe in sentient meat.”

> “I’m not asking you, I’m telling you. These creatures are the only sentient race in that sector and they’re made out of meat.”

And indeed sentient species that are partly made of meat:

> “Maybe they’re like the orfolei. You know, a carbon-based intelligence that goes through a meat stage.”

> “Spare me. Okay, maybe they’re only part meat. You know, like the weddilei. A meat head with an electron plasma brain inside.”


I get your point, but I don't think those quotes establish familiarity with meat-based animals. Familiarity with animals would be something like "yeah, sure, we know about that planet with cows, but this is something else entirely!" (Also, humans wouldn't be so surprising if they knew about things like cows.)

Their references are not to creatures that are meat through-and-through but fictional alien races that have a kind of incidental relationship to meat that doesn't establish meat-based cognition as normal the way that animals would.


You have to read the story in the original eshidilii. It just sounds so illogical in English.

The construction site next door is using those vehicles, and they're also a lot more pleasant throughout the day. It's easier to tune out white noise than beeping. The first cshh is a little louder than the others, which is a nice design touch.

Speak for yourself, I can tune out a steady beep much easier than the sound of a seagull being strangled to death. (That's what the ones around here sound like anyway.)

On a more serious note: the loud beeping backup alarms were DESIGNED to be annoying and difficult to miss. I would not be surprised in the least if a study showed these "less annoying" backup alarms correlating to a higher number of children being run over by reversing vehicles.


There have been studies, and those are what led to the less annoying backup sounds. These sounds are essentially harsh white noise, which differs from beeping in one significant way: its level drops off differently with distance, meaning you can play it louder, and people who really are in the wrong spot will notice it's meant for them, while people who aren't will not be annoyed or fatigued by it. Two broadband noise sources also combine differently than two tonal sources, and the human ear can localize broadband sources better than single tones.

This was developed especially for use in backup heavy environments like harbors where workers started ignoring constant beeps.


There's also another difference: beeps can reflect coherently off of surfaces, causing directionality confusion in a dense environment. White noise is much less likely to have odd interference patterns, maximizing our ability to localize the sound.
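To put a number on the distance argument above, here's a minimal free-field sketch assuming an idealized point source (the 100 dB reference level and distances are made-up illustration values, not from any study): sound pressure level drops about 6 dB per doubling of distance, so an alarm driven loud near the vehicle still fades quickly a few rows of distance away.

```python
import math

def spl_at(distance_m: float, spl_ref_db: float = 100.0, ref_m: float = 1.0) -> float:
    """Free-field SPL of an idealized point source: inverse-square
    spreading gives a 6 dB drop per doubling of distance."""
    return spl_ref_db - 20.0 * math.log10(distance_m / ref_m)

# How quickly a 100 dB (at 1 m) alarm fades with distance:
for d in (1, 2, 4, 8, 16):
    print(f"{d:2d} m: {spl_at(d):5.1f} dB")
```

Real environments add reflections and frequency-dependent absorption on top of this, which is where the broadband-vs-tonal difference the comments describe actually comes in; this only sketches the common geometric falloff.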

> In this particular case, a human could have told the machine: “There’s a lot of things that are both agents and tools. Let’s go through and make a list of all of them, look at some examples, and I’ll tell you which should be agents and which should be tools. We’ll have a discussion and figure out the general guidelines. Then we’ll audit the entire set, figure out which category each one belongs in, port the ones that are in the wrong type, and for the ones that are both, read through both versions and consolidate them into one document with the best of both.”

But that isn't the hard part. The hard part is that some people are using the tool versions and some are using the agent versions, so consolidating them one way or another will break someone's workflow, and that incurs a real actual time cost, which means this is now a ticket that needs to be prioritized and scheduled instead of being done for free.


The message is to have a contract and insist on being paid according to the contract, and refuse further work until you get paid.

Screens outside the windows (not on the windows) can provide parallax, no need to track heads. However, in this case:

> They were attempting to pull off AR effects on the transparent OLED windows of the bus without accounting for lens distortion, field of view, parallax, occlusion, etc., and were frustrated and mystified when things didn’t appear to line up. They were completely naive to what depth and scale cues are and how to deploy them.


Can you elaborate? It seems to me that unless the screens are so far outside that they sit where the target object is, two people who are offset laterally with respect to the target would have to be shown something at different offsets on the screen.

Imagine two passengers sitting in rows r10 and r11, looking at the target T.

A. You need to know where a passenger's eyes are to display the POI in the right place. Even if each row gets its own dedicated screen, you'd need to account for head height (different people are different heights) and movement, hence the eye tracking.

B. If you share a window between multiple people, you end up with a POI mess, with the displayed information multiplied by as many passengers as are on the bus.

   |- r9  -|w9
   |       |w9
   |- r10 -|w10   T
   |       |w10
   |- r11 -|w11
   |       |w11
   |- r12 -|w12
IMHO the only practical way is with personal headsets like [0], but then you don't need a bus: just go on foot or use any transportation. It's AR, not VR.

0 https://www.youtube.com/watch?v=SpoLdQpPcAc
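The geometry behind points A and B can be sketched in a few lines: the POI's on-glass position is where the eye-to-target ray crosses the window plane, so every eye position yields a different answer. (The coordinate convention, distances, and function name here are illustrative assumptions, not anything from the actual system.)

```python
def poi_on_window(eye, target, window_x):
    """Where the line from the viewer's eye to a point of interest (POI)
    crosses a flat window plane at x == window_x; returns the (y, z) hit
    point on the glass. Axes: x toward the window, y along the bus,
    z up. A pinhole-eye geometric sketch, not a rendering pipeline."""
    ex, ey, ez = eye
    tx, ty, tz = target
    t = (window_x - ex) / (tx - ex)  # ray parameter at the window plane
    return (ey + t * (ty - ey), ez + t * (tz - ez))

# Two passengers one row (0.8 m) apart, looking at the same landmark ~30 m away:
row10 = (0.0, 0.0, 1.2)
row11 = (0.0, -0.8, 1.2)
landmark = (30.0, 5.0, 1.2)
print(poi_on_window(row10, landmark, 1.0))
print(poi_on_window(row11, landmark, 1.0))
```

The two hit points land roughly three quarters of a metre apart on the glass, so a single overlay drawn on a shared window can be correct for at most one viewer, which is exactly why per-viewer eye tracking (or per-viewer screens) comes up.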


So is it an actual moving bus or just a simulation of one? I have not heard of the concept before

