The Locked Tomb series:
- Gideon the Ninth
- Harrow the Ninth
- Nona the Ninth
Liked the first Locked Tomb, didn't really enjoy the remainder of the series.
The Witch King (from the Murderbot author) was alright.
The Time Traveller's Almanac was meh. A set of short stories, but many are retreads of the same ideas we've seen over the decades. There was one good story, and one other interesting concept...
The Road to Roswell and Crosstalk. Connie Willis' books were great, as usual.
BRZRKR. Keanu's book was meh.
The first five of the Kate Daniels Series:
- Magic Bites
- Magic Burns
- Magic Strikes
- Magic Bleeds
- Magic Slays
Kate Daniels is doing an acceptable job filling in for Dresden's absence, though books 3 and 4 go a bit heavy on the romance side.
Pre-orders pending on the next Murderbot book (Platform Decay) and the next Dresden (Twelve Months).
I'd argue that the 68K is simpler to learn and use. You get a similar instruction set, but with 32-bit registers, and many of them. It's even got a relocatable stack, so it can handle threading when you get to that point.
I agree, I feel like the 68k architecture was a dream for assembly programming. Each register is large enough to store useful values, there are lots of them, and there are instructions for multiply and divide. This lets you focus on the essence of what you want to accomplish, without getting side-tracked into how to represent the X-coordinate of some object because it's just over 8 bits wide, or how to multiply two integers. Both of these seemingly trivial things already require thought on the 6502.
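To make the multiply point concrete: here's the kind of shift-and-add loop you end up hand-rolling on a 6502, sketched in C for readability (mul8 is just an illustrative name; on the 68K the whole thing collapses into a single MULU instruction).

    #include <stdint.h>

    /* Shift-and-add multiply: the loop you hand-roll on a 6502 because it
       has no multiply instruction. On the 68K this is one MULU. */
    uint16_t mul8(uint8_t a, uint8_t b)
    {
        uint16_t result = 0;
        uint16_t addend = a;   /* multiplicand, widened so shifts don't overflow */
        while (b) {
            if (b & 1)         /* low multiplier bit set: accumulate */
                result += addend;
            addend <<= 1;      /* shift multiplicand up one bit */
            b >>= 1;           /* consume one multiplier bit */
        }
        return result;
    }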
Pinch zoom is the preferred method; it just needs to zoom at a usable (much faster) pace.
I have no idea what input events zooming generates. Perhaps an option in the URL could be used to turn on a log of input events that people can copy/paste and submit back to you, so you can get a sense of what events are coming from various devices and input methods without having to find devices to try them all yourself.
Plenty of map sites have spent years broken on Macs, not handling smooth pinch-zoom or Magic Mouse smooth scroll wheel events. A slight 3mm movement of a finger takes you from fully-zoomed-in street view to fully-zoomed-out planet view. I'm assuming every micro-movement event is treated as a Windows scroll wheel event, where you expect to move a decent chunk on each event.
However you're treating zoom inputs, you've got the opposite effect. A full zoom motion barely does anything.
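For what it's worth, one common fix is to make the zoom response exponential in the raw event delta instead of stepping a fixed chunk per event, so a trackpad's stream of tiny deltas and a mouse wheel's chunky +/-120 clicks both feel proportionate. A minimal sketch in C, assuming a signed per-event delta; the function name and sensitivity constant are made up and would need tuning per input source:

    #include <math.h>

    /* Scale the zoom level continuously by the raw wheel delta. Tiny
       trackpad deltas nudge the zoom; a full +/-120 wheel click moves it
       a meaningful step. Positive delta zooms out, negative zooms in. */
    double apply_zoom(double zoom, double wheel_delta)
    {
        const double sensitivity = 0.002;   /* made-up tuning constant */
        return zoom * exp(-wheel_delta * sensitivity);
    }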
Perhaps someone at Microsoft threatened physical harm to a Google engineer if they didn't remove the videos... and they caved in to the demands rather than reporting the threat, or perhaps they did both.
Typically "Cost Of Living" increases target roughly inflation. They don't really keep up though, due to taxes.
If you've got a decent tech job in Canada, your marginal tax rate will be near 50%. Any new income is taxed at that rate, so that 3% COL raise is really more like a 1.5% raise after tax, while prices went up the full 3%, which typically makes you worse off.
Until you're at a very comfortable salary, you're better off job hopping to boost your salary. I'm pretty sure all the financial people are well aware they're eroding their employees' salaries over time, and are hoping you are not aware.
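To put rough numbers on it, a quick sketch: the salary and the 30% average tax rate are made-up illustrative figures; only the 50% marginal rate and the 3% raise/inflation come from the argument above.

    #include <stdio.h>

    int main(void)
    {
        double gross     = 120000.0;  /* hypothetical salary */
        double avg_tax   = 0.30;      /* assumed average rate on existing income */
        double marginal  = 0.50;      /* marginal rate on the raise */
        double col_raise = 0.03;      /* 3% cost-of-living raise */
        double inflation = 0.03;      /* prices also went up 3% */

        double net_now  = gross * (1.0 - avg_tax);
        double net_next = net_now + gross * col_raise * (1.0 - marginal);

        /* Deflate next year's net pay to compare purchasing power. */
        double real_next = net_next / (1.0 + inflation);
        printf("purchasing power change: %+.2f%%\n",
               (real_next / net_now - 1.0) * 100.0);  /* about -0.8% */
        return 0;
    }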
Tax brackets also shift through time, though less frequently. So if you only get COL increases for 20 years, you’re going to be reasonably close to the same after-tax income, barring significant changes to the tax code.
> 4. M1 has over 600 reorder buffer registers… it’s significantly larger than competitors.
And? Are you saying neither Intel nor AMD engineers were able to determine that this was a bottleneck worth chasing? The point was, anybody could add more cache, rename, reorder or whatever buffers they wanted to... it's not Apple secret-sauce.
If all the competition knew they were leaving all this performance/efficiency on the table despite there being a relatively simple fix, that's on them. They got overtaken by a competitor with a better offering.
If all the competition didn't realize they were leaving all this performance/efficiency on the table despite there being a relatively simple fix, that's also on them. They got overtaken by a competitor with a better offering AND more effective engineers.
I got stuck on Catan for a while too, as I assumed it was like most word games and didn't use proper names.
At some point I just moved on and solved the next word, Chess, in like 2 seconds. The results claimed I spent a whole pile of time on Chess, since the game has no idea which word I'm trying to solve (though all the wrong guesses for Catan should have been a clue that the time should have gone towards Catan).
I'm not sure the per-word time values are useful if they can't be trusted to be accurate.
From early green or amber text on black mono displays.
Grey on black DOS text mode.
Light Blue on Dark Blue C-64.
Apple II's grey/white (I don't recall) on black.
Even GUI wise, Amiga used a dark-blue background as the default Workbench, with user selectable palettes for everything.
It was Microsoft Windows that changed the paradigm to default to a searing white display with black text in most apps, like Notepad, Word, etc., because "it's more like paper". Sure, paper is white, but it's not glowing white. That transition was painful.
I'm glad to see dark modes return, and I agree there needs to be an option, not just forced dark mode. Preferably light mode would offer a not-as-bright-as-possible white too.
And you shouldn't have your device or monitor set to glowing white -- turn the brightness down so it's the same as a sheet of paper next to it.
And Windows didn't change the paradigm; the Mac was the first widely available consumer device that did. And its built-in CRT wasn't especially glowing either -- it was less bright than paper in traditional office lighting.
Early computers had "dark" color schemes because the resolution was so low and pixels "bled" brightness, that it was easier to read. As technology improved, that problem thankfully went away, and it's easier on the eyes to read dark text on a light background, regardless of print or screen.
There’s a significant base of users that prefer light mode and a significant base that prefer dark mode, so provide both; it’s generally not difficult to do so.
I disagree that apps should tone down light mode. It’s better that all apps use the same brightness and contrast and then users can adjust their monitor to suit their individual preference.
> There’s a significant base of users that prefer light mode and a significant base that prefer dark mode, so provide both; it’s generally not difficult to do so.
There’s a significant base of users that hate with a passion all low-contrast dark gray on light gray (aka light mode) or light gray on dark gray (aka dark mode).
When did the brains of the people promoting this get damaged?
Paper, particularly bleached paper, is not "traditional normal" either.
I'm no paleontologist, but originally humans would use substances like ash and fruit to draw/write on rock/leaves/bark, so white/red/colors on grey/green/brown.
Disagree, white should be standardised as #FFFFFF so that it’s consistent between applications. Then users can adjust how they want “white” to appear by adjusting their screen settings.
No, #FFF is white, and it's up to the client to decide what white should look like.
Arguing that we should use, say, #CCC for white is like saying that instead of rating things out of 100, you should rate them out of 70. All you've done is narrow the scale.
For me, the low contrast on pages like HN (in particular with any of the gray text) strains my eyes because it’s more effort to see the letters.
But I also read a reasonable number of PDFs (black on white), which is relatively comfortable on most of my monitors (LCDs with a generally low brightness setting, so less light shines into my eyes).
I think what I am saying is: I agree that what is comfortable depends on the user, so websites staying on the defaults is better, because then users can configure what works for them.
Addendum: The low contrast example in the article is very uncomfortable for me to read.
Given that screens are always adding their own light, it’s impossible for a screen to ever be equally bright as a piece of paper next to it. The screen will always be brighter.
Do what now? An entirely black OLED screen is certainly going to reflect less of the room’s light than a sheet of white paper. An OLED screen displaying white at 10% of its maximum brightness is also likely going to be less bright than a sheet of white paper in most rooms.
The contrast ratio of an old CRT (and amber and green were considered more comfortable than white-on-black) is radically different from a modern LCD/IPS/OLED screen. It's so different that there's no comparison. Dark mode might be ok for more people if there is some brightness to the background instead of being completely black, but then you lose most of the benefits of OLED.
The "true black" OLED displays have their part of the display off where there are black pixels, if I am not wrong. So, wouldn't dark mode suit well for those types of displays?
GP is arguing that, precisely because there is no backlight, the contrast between on and off is uncomfortably high on modern screens compared to the CRTs where Windows 2/3 ran.
I agree. Most websites with a dark color scheme use a dark grey background and even off-white text.
Traditional normal is not an absolute statement. Sure, DOS/Unix back in the early days of the PC displayed black backgrounds because the displays at the time worked better that way.
Before that, people shared information on white paper; and the beginning of the internet brought that back, with black text over a white background.
Therefore there is no canonical traditional normal, it all depends when one joined.
Paper and paper-like writing surfaces were non-white for a long time before we got bleached white paper.
We haven't yet had glowing-white paper.
Traditional-normal for computing was a dark background.
There was likely a technological limit in the use of pure white at the start when "emulating" paper. VGA 16-color mode likely meant that the choice was between bright white and medium grey, which was too dark. Configurability has lagged behind though.
That was only common for a blip in time when NOTHING was normal, because it was all being figured out and cost constraints, not personal or ergonomic preference, drove computing capabilities.
> Even GUI wise, Amiga used a dark-blue background as the default Workbench
That's because of cost. It was expected that many people would be viewing Amiga video output on a television via composite output and white-on-blue is something that TVs are good at displaying. The 1080 was like 1/3rd the cost of the A1000 and I'm willing to bet that many, MANY A500s were hooked up to TVs for at least a while after being opened on Christmas.
I used practically every word processor ever made for the Amiga. Except for WordPerfect they were all black text on white, and even in WordPerfect you could change that; they just kept the default blue and white to match DOS.
Dark mode was normal in the early days of CRTs, when most CRTs refreshed at 60Hz or lower. The dark background made the flicker less obvious. Once higher refresh rate CRTs became common (1990s), the flicker became less of a problem and light mode became the default.
...and Lotus 1-2-3 mimicked VisiCalc, and when I used VisiCalc (on an HP-85A) it had a dark background with a greyish-white foreground colour, i.e. dark mode by default.
Mac likely did use this scheme, and yes, copied it from Xerox. However, neither Macs nor Xerox had mainstream use. I'd only actually seen 3 Macs in the wild before their switch to Intel, over 20 years later.
Windows adopting the "paper"-white background, and the whole world drooling over the arrival of Windows 3.1 and 95, is when it became the standard, I think.
There's no 'likely' about it - the Mac absolutely used white as its background color for document windows and finder folders. It was striking and different when you first encountered one of the early compact Macs to see how white the screen was when you opened MacWrite.
As for the claim that Macs had no 'mainstream use' for 20 years until the Intel switch... your personal Mac-free life is a sad story, but not remotely universal, and while it's certainly true that Macs always had minority market share, it's insane to suggest they weren't influential.
My favorites were actually DOS TUIs, where for some reason blue became a commonly used background color for a lot of things (e.g. Norton Commander, many Borland products, FoxPro...).
Yeah, it wasn't Windows that changed it, they just hopped on the bandwagon.
I remember SunOS (https://en.wikipedia.org/wiki/SunOS) on a SPARC in 1987 that was black text on white, and Macintosh before that.
> It was Microsoft Windows that changed the paradigm to default to a searing white display with black text in most apps
My early 90s Sun SPARCStation was black on white, right from the boot. The xterm default is black on white too, a default that far predates Windows AFAIK.
I don't really know the full history on all of this, but in my limited knowledge, this seems grossly simplified at best since there seem to have been several popular systems before Windows that used white background colours.
Athena text widgets on X were black on white in the '80s. So were the Lisa, Mac, NeXT, OS X, and SunOS's first GUI. Yes, amber on black was long-running, but since you weren't alive then let me tell you something: it sucked. Moving from VT100 (VT104) terminals to actual Sun/AIX machines running X was a HUGE improvement in eye strain.
I’m glad those brightness settings work for you but I can’t deal with how dull it makes colors look on traditional backlit displays. The reduced contrast also isn’t very fun with modern UIs which for some reason actively avoid good contrast.
Windows originated very little: plenty of the type-on-page metaphor predated it.
The original was light mode: printer terminals. Yes, green-on-black became normal in the mid-seventies, along with some amber-on-black. But even early Lisp machines, the Alto, Smalltalk, the W/X/Andrew interfaces, NeXT, etc. were type-on-page, not serial-terminal-ish dark mode.
Besides not being true for paper, it's also not true for electronic screens.
Before a computer with a CRT, most of us had simpler LCD screens on calculators or other devices. And those are blackish on some lighter gray or green - light mode.
I'm not the original poster, but I ran into something similar late in Win 7 (Win 8 was in beta at the time). We had some painting software, and we used OpenMP to work on each scan line of a brush in parallel.
It worked fine on Mac. On Windows though, if you let it use as many threads as there were CPUs, it would nearly 100% of the time fail before making it through our test suite. Something in scheduling the work would deadlock. It was more likely to fail if anything was open besides the app. Basically, a brush stroke that should complete in a tenth of a second would stall. If you waited 30-60 minutes (yes, minutes), it would recover and continue.
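The shape of the code was roughly this; a from-memory sketch, not the actual product code (the function name, pixel layout, and the stand-in "blend" math are all made up):

    #include <stddef.h>

    /* One loop iteration per scan line, handed out to the OpenMP thread
       pool. With the thread count equal to the CPU count, this is the
       shape of loop that would stall on Windows. */
    void apply_brush(unsigned char *pixels, int width, int height, int stride)
    {
        #pragma omp parallel for schedule(dynamic)
        for (int y = 0; y < height; y++) {
            unsigned char *row = pixels + (size_t)y * stride;
            for (int x = 0; x < width; x++)
                row[x] = (unsigned char)(255 - row[x]);  /* stand-in blend math */
        }
    }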
I vaguely recall we used the Intel compiler's implementation of OpenMP, not what comes with MSVC, so the fault wasn't necessarily a Microsoft compiler issue, but it could still have been a kernel issue.
I left that company later that year, and MS rolled out Windows 8. No idea how long that bug stuck around.
To add to this, I'll try to give an idea of how much zoom (or focal length really) you'd need to get a picture with detail.
I took photos of both Jupiter and Saturn w/ a Canon R7 and the RF 100-500mm lens, with a 1.4x extender. The 1.4x extender makes the lens act like 700mm instead of 500mm. The R7, being an APS-C sensor, adds another 1.6x factor, making the combo the equivalent of 1120mm. In these photos the planets are still just dots. The camera takes 32.5 megapixel photos. When zoomed in to the pixel level, both planets were still tiny, about 50 pixels wide. It was enough to see that Saturn had a ring and some color striping on Jupiter, but that's it.
The iPhone main camera is like 26mm (roughly 43x less zoom). The iPhone 13 Pro's telephoto lens is 77mm (14.5x less zoom), and the iPhone 15 Pro Max's is 120mm (9.3x less zoom)... so you're unlikely to get much more than what looks like an out-of-focus, few-pixel-wide dot even on the zoomiest of iPhones, but with that wider 26mm lens, you just might be able to capture them all in one shot.
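Spelling out the reach arithmetic (a quick sketch; "less zoom" here just means the ratio of equivalent focal lengths):

    #include <stdio.h>

    int main(void)
    {
        double lens_mm     = 500.0;  /* RF 100-500mm at the long end */
        double extender    = 1.4;    /* RF 1.4x extender */
        double crop_factor = 1.6;    /* Canon APS-C (R7) */

        double equiv = lens_mm * extender * crop_factor;
        printf("equivalent focal length: %.0fmm\n", equiv);  /* 1120mm */

        /* Ratio of equivalent focal lengths vs. each iPhone lens. */
        double iphone_mm[] = { 26.0, 77.0, 120.0 };  /* main, 13 Pro tele, 15 Pro Max tele */
        for (int i = 0; i < 3; i++)
            printf("%.0fmm iPhone lens: %.1fx less reach\n",
                   iphone_mm[i], equiv / iphone_mm[i]);
        return 0;
    }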
To me, what's more technically impressive than the fact that I took pictures of the planets with readily available camera gear is that I did it with a 1/125s shutter speed, handheld, standing in my yard. The accuracy of the image stabilization needed to pull that off is what astounded me the most.