If yes, perhaps there are relatively easy ways to address this.
I.e., configure the custom binding to also work on the lock screen. Karabiner supports this, I think.
Alternatively, rebind caps lock with a custom tool only, not the OS settings (i.e., don't rebind keys in both a custom tool and the OS). Then, if custom bindings don't work on the lock screen, you fall back to the default, working keyboard there.
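For reference, a minimal sketch of the Karabiner-Elements route: a simple_modifications entry in ~/.config/karabiner/karabiner.json (the surrounding profile structure and the chosen target key here are illustrative assumptions, not a drop-in config):

    {
      "profiles": [{
        "name": "Default",
        "selected": true,
        "simple_modifications": [{
          "from": { "key_code": "caps_lock" },
          "to": [{ "key_code": "left_control" }]
        }]
      }]
    }

If I understand its architecture right, Karabiner's virtual keyboard driver sits below the OS settings layer, which is why its mappings can carry over to the lock screen.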
And if the surprise is unpleasant, you can disable it by turning off memories and holidays in the settings of the Photos app. Not so easy to escape Copilot on Windows.
That's not nearly comparable though. I don't care that it's looking at my photos as long as it doesn't annoy me when I want to view them. Copilot is everywhere; you have to actively avoid it like the plague it is.
Even the linked blog post indicates that that is not the case. Windows has Copilot buttons in practically every built-in application, a taskbar icon, and a dedicated physical keyboard key that people commonly hit by accident (contractually required for OEMs to provide). They also actively promote Copilot in the OS (particularly Home Edition with nothing disabled, e.g. "Tips," notification spam, recommendations, etc.).
Nobody can predict what Apple will do tomorrow, but as of today they aren't pushing Siri/Apple Intelligence very hard, particularly after initial setup. None of the above applies, for example.
I have Pro Edition and for me Copilot only added two icons, one in Notepad and another in Paint. I ignore both. There's also the Copilot app that I didn't even know I had installed.
I don't know what happens with Home Edition, but I thought the pushback was mainly from Insider Preview?
You want to take a look at Microsoft Office, my bad, Microsoft Copilot 365...
You can't even select a cell in Excel without a freaking Copilot button popping up every single time. Same in Word; it's maddening!
You could argue that Windows isn't Microsoft Copilot 365, but then why do people even use Windows? It's always because of the Office, my bad, Copilot 365 suite.
You can also get rid of both of them very easily with O&O ShutUp10++ (or any of the many other GUIs and scripts designed for the same purpose of decrapifying Windows). I toggled off Copilot and OneDrive and haven't seen either in all the years I've been using Windows 11.
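If you'd rather not run a third-party tool, the group policy those tools flip is, as far as I know (this targeted the original Copilot sidebar; newer builds ship Copilot as a Store app you may need to uninstall instead):

    reg add HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot /v TurnOffWindowsCopilot /t REG_DWORD /d 1 /f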
About: "I wonder if other species would look at our images or listen to our sounds and register with horror all the gaping holes everywhere.", yes.
In particular, dogs:
> While people have an image frame rate of around 15-20 images per second to make moving pictures appear seamless, canine vision means that dogs need a frame rate of about 70 images per second to perceive a moving image clearly.
> This means that for most of television’s existence – when they were powered by cathode ray tubes – dogs couldn’t recognize themselves reliably on a TV screen, meaning your pups mostly missed out on Wishbone, Eddie from Frasier and Full House’s Comet.
> With new HDTVs, however, it’s possible that they can recognize other dogs onscreen.
It's about being able to perceive it as a "living" moving creature and not something different.
You can understand that something below the perception threshold is supposed to be a creature because you both have a far more advanced brain and have been exposed to such things your entire life, so there's a learned component. But your dog may simply not be capable of making the leap to comprehending that something it doesn't see as living/moving is supposed to represent a creature at all.
I've personally seen something adjacent to this in action, as I had a dog over the period of time where I transitioned from lower framerate displays to higher framerate displays. The dog was never all that interested in the lower framerate displays, but the higher framerate displays would clearly capture his attention to the point he'd start barking at it when there were dogs on screen.
This is also pretty evident in popular culture. The myth that "dogs can't see 2D," where 2D was a stand-in for movies and often television, was pervasive decades ago. So much so that (as an example) in the movie Turner and Hooch from 1989, Tom Hanks offhandedly remarks that the dog isn't enjoying a movie because "dogs can't see 2D," and no further elaboration is needed or given. Today it's far more common to see content where dogs react to something being shown on a screen, and if you're under, say, 30 or so, you may never have even heard "dogs can't see 2D."
I mean it's a dog so you can't exactly ask them; but this was a dog that would bark at every other dog. If he wasn't barking at Hooch because Hooch was only showing up at 24 FPS, then I'm inclined to think he didn't recognize Hooch as another dog.
With CRTs I would think that the problem may be that they do not see a full picture at all. Because the full screen is never lit all at once? Don’t know how persistence of vision works in this case…
With cathode ray TVs, only a single pixel at a time is lit; the illusion relies on our eyes having poor enough temporal resolution. If you had super-speed eyes, you would just see a colored dot tracing lines across the screen.
That's not quite true. Only one pixel is being activated at a time, but the phosphors continue to emit light for many pixels afterwards. In practice you get a handful of lines lit to varying degrees at a time: maybe 1-2 lines quite brightly lit, then a trail of lines that are fading pretty significantly (but still emitting light). Then yes, our persistence of vision fills in the rest to provide the appearance of a fully lit screen.
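A toy model of that trail, if anyone wants to play with the numbers (the decay constant is an assumption for illustration; real phosphors vary a lot):

    # Sketch of CRT phosphor persistence. NTSC scans ~15,734 lines per
    # second; assume each line's glow decays exponentially with a ~1 ms
    # time constant (made-up but plausible value).
    import math

    LINE_RATE_HZ = 15_734.0   # NTSC horizontal scan rate
    DECAY_MS = 1.0            # assumed phosphor decay constant

    def brightness(lines_after: int) -> float:
        """Relative glow of a scanline, N line-times after the beam hit it."""
        t_ms = lines_after / LINE_RATE_HZ * 1000.0
        return math.exp(-t_ms / DECAY_MS)

    for n in (0, 5, 20, 100):
        print(f"{n:3d} lines later: {brightness(n):.3f} of peak")
    # -> a line or two near full brightness, a trail of a few dozen still
    #    glowing, and everything else dark until the next field.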
> While people have an image frame rate of around 15-20 images per second to make moving pictures appear seamless,
This is just...wrong? Human vision is much faster and more sensitive than we give it credit for. E.g., humans can discern PWM frequencies up to many thousands of Hz: https://www.youtube.com/watch?v=Sb_7uN7sfTw
The overwhelming majority of people were happy enough to spend, what, billions on screens and displays capable of showing motion pictures in those formats.
That there is evidence that most(?) people are able to sense high-frequency PWM signals doesn't invalidate the claim that 15 to 20 frames per second is sufficient to make moving pictures appear seamless.
I’ve walked into rooms where the LED lighting looks fine to me, and the person I was with has stopped, said “nope,” and turned around and walked out, because to them the PWM-driven LED lighting makes the room look like it's illuminated by night-club strobe lighting.
That's not really right. Most NTSC content is either 60 fields per second with independent fields (video camera sourced) or 24 frames per second with 3:2 pulldown (film sourced). It's pretty rare to have content that's actually 30 frames per second broken into even and odd fields. Early video game systems ran essentially 60p @ half the lines; they would put out all even or all odd fields, so there wasn't interlacing.
If you deinterlace 60i content with a lot of motion to 30p by just combining two adjacent fields, it typically looks awful, because each field is an independent sample. Works fine enough with low motion though.
PAL is similar, although 24 fps films were often shown at 25 fps to avoid jitter of showing most frames as two fields but two frames per second as three fields.
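To make the cadence concrete, here's a small sketch of NTSC 3:2 pulldown (my own illustration, not from the thread): four film frames become ten fields, and some frames straddle field parity, which is exactly why naive field-combining deinterlacing looks bad on motion:

    # 24 fps film -> 60i video: hold film frames for 2 and 3 fields
    # alternately (4 frames -> 10 fields).
    def pulldown_32(frames):
        fields = []
        for i, frame in enumerate(frames):
            for _ in range(2 if i % 2 == 0 else 3):
                parity = "odd" if len(fields) % 2 == 0 else "even"
                fields.append((frame, parity))
        return fields

    print(pulldown_32(["A", "B", "C", "D"]))
    # A/odd A/even  B/odd B/even B/odd  C/even C/odd  D/even D/odd D/even
    # Frames B and D straddle parity, so combining two adjacent fields
    # can mix two different film frames.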
I think most people find 24 fps film motion acceptable (although classical film projection generally shows each frame two or three times, so it's 48/72 Hz with updates at 24 fps), but a lot of people can tell a difference between 'film look' and 'tv look' at 50/60 fields (or frames) per second.
That association seems to be an unfortunate equilibrium because higher frame rates seem to be "objectively" better, similar to higher resolution and color. (Someone without prior experience with TV/movies would presumably always prefer a version with higher frame rate.)
In general, yes. Low frame rates can be used deliberately to make something feel more dreamlike, but that is something that should only be used in very specific cases.
Pretty much all dramatic American TV shows were shot on film (at 24 fps) before the digital camera era. It's why so many old shows (e.g., Star Trek TNG) are now available as HD remasters; they simply go back and rescan the film.
It's more complicated in other countries (the BBC liked to shoot on video a lot) but it was standard practice in the States.
It took far more than simply rescanning the film to get the TNG remasters, as all the visual effects were only rendered and composited at broadcast resolutions (and frame rates). They had to essentially recreate all of that, which is why we haven't gotten the same remasters for the less popular Deep Space Nine and Voyager.
From what I have seen, most series of that era were edited in NTSC after the original film material was converted.
I think familiarity is a major factor, but the lower frame rate and slower shutter speed also create motion blur, which makes it easier for the film to look realistic since the details get blurred away. I remember when The Hobbit came out at 48 fps and people were complaining about how the increased clarity made it look obviously fake, like watching a filmed play instead of a movie.
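The shutter arithmetic behind that, as a rough sketch (the 180-degree shutter and the motion speed are assumed values for illustration):

    # With a classic 180-degree shutter, exposure time halves when the
    # frame rate doubles, so per-frame motion blur halves too.
    def exposure_s(fps: float, shutter_angle_deg: float = 180.0) -> float:
        return (shutter_angle_deg / 360.0) / fps

    for fps in (24, 48):
        blur_px = 1000 * exposure_s(fps)  # object moving 1000 px/s (assumed)
        print(f"{fps} fps: {exposure_s(fps) * 1000:.1f} ms exposure, "
              f"~{blur_px:.0f} px of blur")
    # 24 fps: ~20.8 ms, ~21 px of blur; 48 fps: ~10.4 ms, ~10 px.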
> I remember when The Hobbit came out at 48 fps and people were complaining about how the increased clarity made it look obviously fake, like watching a filmed play instead of a movie.
Curiously, I can already get into this mindset with 24 fps video, and I much, much prefer the clarity of motion 48 fps offers. All the complaining annoyed me, honestly. It reminds me of people complaining about "not being able to see things in dark scenes," which completely hampers the filmmaker's ability to exploit high dynamic range.
Tbf, in both cases the consumer hardware can play a role in making this look bad.
I went out of my way to see the Hobbit in 24 and 48 fps when it came out, and weirdly liked 48 better. It was strange to behold, but felt like the sort of thing that would be worth getting used to. What I didn't like was the color grading. They didn't have enough time to get all the new Red tech right, that's for sure.
Yeah, that's pretty much it. They standardized on 24 back when sound on film took over Hollywood, and we now have a century of film shot at that speed. It's what "the movies" look like. There have been a few attempts to introduce higher frame rates, like Peter Jackson's The Hobbit and James Cameron's Avatar, both at 48 fps, but audiences by and large don't seem to like the higher frame rates. It doesn’t help that we have nearly a century of NTSC TV at ~60 fps[1], and our cultural memory equates these frame rates with live tv or the "soaps," not the prestige of movies.
[1] Technically 29.97 fps, but the interlacing gives 59.94 fields per second.
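For the curious, those odd numbers come from the 1000/1001 slowdown applied to the original 30/60 rates when NTSC color was introduced:

    print(30 * 1000 / 1001)   # 29.97002997... frames per second
    print(60 * 1000 / 1001)   # 59.94005994... fields per second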
I haven't seen a single person complain about Avatar. I wonder if the issue with The Hobbit wasn't the 48 fps at all but rather something more akin to when we shifted to HD and makeup/costume artists had to be more careful.
Because movies (on film) are projected an entire frame at a time, instead of being scanned onto the screen one line (well, actually one dot moving in a line) at a time. I read somewhere (but no longer have the link) that when the entire frame is projected at once, as film projectors do, lower frame rates are not as noticeable. I don't know whether modern digital projectors still project whole frames at once.
Movies are not projected using the scan and hold approach used by typical computer displays. They have a rotating shutter which blinks every frame at you multiple times. This both helps to hide the advance to the next frame but also greatly increases motion clarity despite the poor framerate.
But blinking a frame multiple times rather than once creates a double (or triple, etc.) image effect. To get optimal motion clarity, which compensates for smooth pursuit without double images, one would need to flash each frame once, as briefly as possible. But that's not feasible at 24 fps because it would lead to intense flickering. It would be possible at higher frame rates, though.
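Back-of-envelope for the double-image effect (the pursuit speed is an assumed illustrative number): during smooth pursuit the eye keeps moving between the repeated flashes of one frame, so each flash lands on a different retinal position, roughly pursuit speed divided by flash rate apart:

    # Ghost separation under smooth pursuit, for 2- and 3-blade shutters.
    def ghost_separation_deg(pursuit_deg_per_s: float, flash_hz: float) -> float:
        return pursuit_deg_per_s / flash_hz

    for blades, flash_hz in (("2-blade", 48.0), ("3-blade", 72.0)):
        sep = ghost_separation_deg(20.0, flash_hz)  # 20 deg/s pursuit (assumed)
        print(f"{blades} shutter: ghosts ~{sep:.2f} degrees apart")
    # A single, very short flash per frame would remove the ghosts, but a
    # bare 24 Hz flash is far below flicker fusion and would strobe badly.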
The maximum depends on what it is you are seeing. If it's a white screen with a single frame of black, you can see that at incredibly high frame rates. But if you took a 400 fps and a 450 fps video, I don't think you would be able to pick which is which.
The discussion on flicker fusion frequency (FFF) and human vs. canine perception is fascinating. When building systems that synchronize with human physiology, like the metabolic digital twins I'm currently developing, we often find that 'perceived' seamlessness is highly variable based on cognitive load and environmental light.
While 24-30fps might suffice for basic motion, the biological impact of refresh rates on eye strain (especially for neurodivergent users) is a real engineering challenge. This is why I've been pushing for WCAG 2.1 AAA standards in my latest project; it’s not just about 'seeing' the image, but about minimizing the neurological stress of the interaction itself.
Dogs can see some colors, but not as many as humans. They have dichromatic vision, and see shades of gray, brown, yellow and blue. Red and green are particularly bad colors for them.
We get blue tennis balls for our pups instead of green; but they aren’t the fetching kind so not sure if it helps.
Is this by some random company that happens to rent an address in The Hague? Even that is uncertain, because there's no actual address, just a vague OSM pin. And no company name either.
This seems untrustworthy, doubly so for a product that claims to prioritize transparency.
> Our headquarters are in The Netherlands (The Hague). Contact us to book a meeting or ask any questions.
Sounds like when there was that news of "Europe is creating a social media website called 'W' to compete with X" and it just turned out it was some random tiny company.
> Office.EU is a service offered and operated by EUfforic Europe BV, registered with the Dutch chamber of commerce under registration number 98746243 and having its address at Dr. Kuyperstraat 10-A at (2514 BB) The Hague, the Netherlands.
By default, Talk copies all streams to all users, meaning it can only handle a small number of concurrent users. There's some sort of broadcast solution (plus STUN, NAT traversal, etc.) that allows it to scale. They are probably charging for that.
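The scaling problem in a nutshell (my sketch of the usual WebRTC mesh-vs-SFU argument, not Nextcloud's actual numbers):

    # Full mesh: every participant uploads a copy of their stream to
    # every other participant, so totals grow quadratically. An SFU-style
    # backend keeps each user's uplink at a single stream.
    def mesh_total(n: int) -> int:
        return n * (n - 1)

    for n in (4, 10, 30):
        print(f"{n:2d} users: {mesh_total(n):4d} mesh streams, "
              f"uplink per user: mesh={n - 1} vs SFU=1")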
> If you wish to protect your existing buckets, you’ll need to create new buckets with the namespace pattern and migrate your data to those buckets.
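If I'm reading that right, the "namespace pattern" means baking a hard-to-squat unique string, such as your account ID, into the bucket name; hypothetically something like (made-up bucket name, with 111122223333 standing in for an account ID):

    aws s3api create-bucket --bucket myapp-logs-111122223333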
My pet conspiracy theory: this article was written by bucket squatters who want to claim old bucket names after AI agents read this and blindly follow.
I see this so often. Sometimes it's just "no React hooks"; other times it gets literal and extra unnatural, like: "here's <your thing>, no unnecessary long text explanation." Perhaps we're past AGI and this is passive-aggressiveness ;)
That is an amazing read, thank you for sharing. It's not often you see a landfill for welfare recipients turned into a holy place that popes visit and where wealthy people store their wine.