I'm totally with you re: Android and Apple being walled-garden ecosystems with ever-changing rules. But, don't you feel like this is true of most software (that it's never "done")? In my experience, there aren't many categories of software that can be truly feature-complete unless they are fully decoupled from popular culture. Maybe GNU units or grep can be called "done", but most apps have to change with the world around them.
> don't you feel like this is true of most software (that it's never "done")?
The problem is that "done" is a subjective term. Most of the software I use on a regular basis is "done" as far as I'm concerned. If it didn't meet my needs, I wouldn't be using it on a regular basis.
This ignores security issues, of course, but most of the software I use on the regular doesn't have a networking component, so that's not as much of an issue.
How does something like `let y: Time = 1 year` work? Does it take into consideration the idea of leap years and leap seconds, counting a single year as 365.2422 days[0]? Or does it count as 365 days?
I got curious and installed the CLI tool[1] and found that it does indeed account for leap seconds / leap years:
I feel like that’s unit confusion. Converting from year to day should require you to specify what calendar year you’re in to resolve the ambiguity. Otherwise I set an alarm for today + 1 year and things are off by 6 hours.
Time is nasty because there’s lots of overloaded concepts and bugs hide in the implicit conversions between meanings.
I’m also kinda curious what the underlying type is. Is it a double, or do they use arbitrary-precision math and take the perf hit?
> I feel like that’s unit confusion. Converting from year to day should require you to specify what calendar year you’re in to resolve the ambiguity.
A year has two meanings - a calendar year, with all the leap days and seconds and timezones, or a duration of time. The latter is still useful, e.g. when someone states that Proxima Centauri is 4.2 light-years away, they don't want to deal with leap-days.
Decent time libraries have separate ways to deal with durations and dates.
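To make the two readings concrete, here's a minimal stdlib-Python sketch of the alarm example from upthread. The date and the 365.2422-day figure are just the values quoted above; the exact gap depends on whether a leap day falls inside the interval, but it lands near the "off by 6 hours" mentioned earlier.

```python
from datetime import datetime, timedelta

start = datetime(2024, 3, 1, 9, 0)

# "1 year" as a chronological duration: a fixed amount of elapsed time
duration_year = timedelta(days=365.2422)   # mean tropical year from above
alarm_by_duration = start + duration_year

# "1 year" as a calendrical step: same month, day and time, next year
alarm_by_calendar = start.replace(year=start.year + 1)

print(alarm_by_duration)                      # 2025-03-01 14:48:46.080000
print(alarm_by_calendar)                      # 2025-03-01 09:00:00
print(alarm_by_duration - alarm_by_calendar)  # ~5h49m -- the "off by 6 hours"
```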
Ok, so I did some reading and I like what I see. It's important, however, to properly disambiguate between the two kinds of time units.
Chronological time units and calendrical time units... These are fundamentally different concepts that overlap a lot in day-to-day life but, when you need to ensure technical accuracy, can be very different.
- Planck time, Stoney time, Second: Unambiguously valid for both chronological and calendrical usage. Since we define everything in terms of the second anyway it's basically the centre of the Venn diagram.
- Sidereal day: It isn't a fixed value over longer periods of time, getting longer at a rate on the order of 1.7 milliseconds per century [1]. So a conversion of a short period like 7 sidereal days into seconds is going to be off by something like 3.26×10^-7 seconds (1.7 ms × 7/36525), which might be OK, particularly if you also track the precision of values to avoid introducing false precision in the output: you can then truncate the precision to above the error margin for a calculation like this one and treat it unambiguously as valid for both calendrical and chronological purposes.
- It's also worth noting, since you mentioned it, the slight difference between the tropical year (the seconds it takes the Sun to return to the same point in the seasonal cycle, equinox to equinox) and the sidereal year (the seconds the Earth takes to complete one orbit relative to the fixed stars); the sidereal year is longer due to the precession of the equinoxes.
- Minute, Hour: These can vary in length by up to a second if a leap second is accounted for, so while conventionally fine for chronological calculations as fixed multiples, they don't have precise chronological values when used in calendar calculations. The exact number of minutes between now and 2030 is fixed, but the number of seconds in those minutes is not.
- Day: In addition to leap seconds, the length of a calendar day also has to deal with the ambiguity of daylight saving time; this is where the significant differences between calendrical and chronological calculations really start to kick in.
- Week, Fortnight: all the problems of days, but magnified by 7 and 14 respectively. Also, there's the concept of standard business weeks and ISO week calendars, where some years wind up with more weeks than others due to the ISO week-numbering rules.
- Month: Obvious problem... "which month?" There are quite a few fewer seconds in February than in October.
- Julian year, Gregorian year: These are conventionally defined by how many days they have and by the leap-day rules, and then approximated to an average value in seconds, so you can "pave over" the problem here, and a lot of people might not be as surprised as if you averaged the length of a day or a month.
- Decade, Century, Millennium: all affected by the leap-day rules, and over a given length of time you also see the introduction of an unknown but somewhat predictable number of leap seconds. So while you can average it down yet again, the problems of anything bigger than a day have compounded: over a millennium you're dealing with approximately 0.017 seconds of change in the rotation of the earth.
Doing this right is basically incompatible with doing it the easy way. I'd at least re-label the averaged time units to make the use of average approximations more obvious, and ideally I'd split the time types into calendrical and chronological, and use more sophisticated (and, I'll be the first to admit, annoying to implement) calendar math for the calendrical calculations.
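For what it's worth, here is one rough Python sketch of what that split could look like; the names and structure are mine, not any existing library's, and leap seconds are ignored because `datetime` can't represent them. The point is just that an exact elapsed-time span and a civil-calendar step are different types with different arithmetic.

```python
import calendar
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class ChronoDuration:
    """An exact span of elapsed time, in SI seconds."""
    seconds: float

@dataclass(frozen=True)
class CalendarDelta:
    """A step through the civil calendar; how many seconds it covers
    depends on the date it is applied to."""
    years: int = 0
    months: int = 0
    days: int = 0

def add_chrono(t: datetime, d: ChronoDuration) -> datetime:
    # Pure elapsed-time arithmetic: no calendar rules involved.
    return t + timedelta(seconds=d.seconds)

def add_calendar(t: datetime, d: CalendarDelta) -> datetime:
    # Naive civil-calendar arithmetic: step years/months, clamp the day
    # (e.g. Jan 31 + 1 month -> Feb 28/29), then add whole days.
    months = t.month - 1 + d.months
    year = t.year + d.years + months // 12
    month = months % 12 + 1
    day = min(t.day, calendar.monthrange(year, month)[1])
    return t.replace(year=year, month=month, day=day) + timedelta(days=d.days)
```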
[1] - Dennis D. McCarthy; Kenneth P. Seidelmann (18 September 2009). Time: From Earth Rotation to Atomic Physics. John Wiley & Sons. p. 232. ISBN 978-3-527-62795-0.
Some people may think you’re tossing out a sarcastic joke here… but unambiguously fuck yes … because doing this kind of preemptive typing, the forward thinking to “frame of reference” is basically the next step after overhauling everything to disambiguate between calendrical and chronological timekeeping and units…
Because fundamentally you can't correct for the reference frame if you can't work out whether you're dealing with chronological or calendrical units. Calendrical units are in a weird liminal space outside of the earth reference frame. We measure the history of most deep space missions by earth-reference-frame mission elapsed time, and do so by keeping a clock on earth and silently keeping records of the vehicle clock... but on Mars we have a per-mission Sol count that brings Mars time into the mix, and I know for a fact a lot of people neglect the barycentric gravity gradient difference between Earth and Mars because for literally 99.9% of things it doesn't matter... but if you measure a transit of an astronomical body from instruments on Mars and don't deal with the relative reference frames, your fractions of an arc second are basically pointless false precision.
Alpha Centauri is simply a weird earth. I think that a game around long-lived creatures that colonize a galaxy and have to work with relativistic effects could be different and fun. At some level it builds on all these different ways of looking at time. Breathe some life via simulation into this tongue-in-cheek interstellar economics: https://www.princeton.edu/~pkrugman/interstellar.pdf
They will not have to work with relativistic effects. Nobody is going to fly faster than 0.1c. It is prohibitively expensive energy-wise and there is no point in doing that. Just accelerate to 0.01c, and arrive at the neighbouring system in 400 years. 400 years is a blink of an eye, anyway.
Per 400 years, there is one leap day every 4 years (100 leap days), except when the year is divisible by 100 (so we overcounted by 4 and there are 100 – 4 = 96 leap days), except when the year is divisible by 400 (so we need to add that day back and arrive at 100 – 4 + 1 = 97). This gives us 97/400 = 0·2425.
The tropical year is about 365·24219 days long, but that's not relevant to timekeeping.
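The same count, spelled out as a quick Python check over any 400-year Gregorian cycle (the 2000–2399 window here is just an arbitrary example range):

```python
def is_gregorian_leap(year: int) -> bool:
    # Leap if divisible by 4, except centuries, except every 400th year.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

leap_days = sum(is_gregorian_leap(y) for y in range(2000, 2400))
print(leap_days)              # 97
print(365 + leap_days / 400)  # 365.2425
```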
From the linked article: "After the weighing, they received an official certificate proclaiming them not a witch... Certificates would state that 'the body weight is in proportion to its build'. The reasoning behind this is the old belief that a witch has no soul and therefore weighs significantly less than an ordinary person".
I love this piece of history so much, thank you for sharing it! Adding this one to my bookmarks :-)
Which is obviously misguided. You can't determine from somebody's BMI if they have a soul. A soul only weighs three quarters of an ounce [1], or about 1/10th a cup of coffee. Well within even the daily variation of a person's body weight. (/s)
Does my soul look fat? Is it good or bad to have an obese or skinny soul? Do they float? Are they charged, can you trap one in a Faraday cage? Is that what the Ghostbusters were collecting?
“The Waag is still open as a tourist attraction, and official certificates are available.”
Maybe with modern science we can make someone much lighter than they ought to be. (Put helium in their stomach?) Then they would be forced to issue a certificate that said that they could not guarantee that the person is not a witch :D
If you have ADD or ADHD, Ritalin might help. I have severe ADHD that I refused to treat for decades, but I recently gave in.
I am _hugely_ sensitive to caffeine and feel a buzz even from decaf. It ruins my sleep in a similar way to what a couple people in this thread describe.
I take 10mg of instant-release Ritalin at 7AM each day, and it allows me to focus and deliver. It wears off by around 2-3PM, and I sleep like a rock most nights.
There are downsides as well: once it wears off, it leaves you mentally drained until you've slept. Also, there's a potential for building a tolerance, as well as potential for addiction. I've been lucky in both cases so far, but ymmv.
Thanks for the kind words :). I didn't necessarily mean "useless" in a self-deprecating way. I only meant it in the sense of the linked article called "Write More 'Useless Software'" -- i.e. that it doesn't directly have a production use.
I'm curious what you mean about greasemonkey and noscript. Do you just mean you're astounded that such cool add-ons were made, or something else?
I more meant it's a fork in your personal browsing experience -- do you want to do a lot of risky customizations, or lock things down?
(To paint a picture, I'm writing this in TBB since it forces HTTPS-only JS and I want to get an approximate picture of what others see; then when people complain about the obvious exit node, I pop over to a public location's wifi with uBlock and NoScript and try to allow as little as possible.)
Ah, I see. Wow, that is a high level of dedication to Internet privacy! With my work and interests I would 100% be incapable of using a combo of TOR, public wifi, and other locked-down browsers / networks. Props to you!
I took his Build a Compiler in Python course, and I honestly found it to be lackluster.
He had just changed the format, though, and our class was his first of a new wave where he supplied us with no example code, instead making us write it all out by hand.
By the end of the five-day course, virtually no one had a working toy compiler of any kind. Maybe a parser and AST representation, but not the whole shebang.
His style of teaching is not as compelling as his live, on-stage presentations, in my opinion.
If I had paid with my own money, I would've been pissed. Caveat emptor.
I took the same course towards the end of 2021. David provided examples, and I found that he hit a good balance of "I will let you figure it out yourself based on these" vs "let me explain to you guys what it's about".
I struggled a bit during the last few days, partially because I was taking the course from the other side of the world (i.e. I had to stay awake all night for all five days of the course).
However, most of us at least managed the "Function" part. The LLVM and WASM parts I just watched David do, and didn't attempt them myself.
I also took the SICP course, which I found to be a better bang for the buck. Either way, I have no regrets about taking David's courses.
I'm sorry, I know this is a dumb and off-topic comment more appropriate for Reddit, but I couldn't resist. If you check my comment history, you'll see I don't usually do this. Please forgive.
This is amazing! After playing my first game, I feel immediately inspired to think about the relationships among the tiles / positions. Maybe I'll try to write some kind of solver as a learning exercise.
I noticed that there is an option to download the puzzle as JSON, and I tried it. Can you tell us what the schema is? I see there's a key called `tiles` that has a value of an object mapping ints to other ints. What do those ints represent? I figure since there are 36 tiles, the key int is the tile number, but what about the value? I think this info would be valuable for programmatically solving these puzzles.
Once again, great work! I have a feeling this will be a big hit, possibly of Wordle-like proportions.
I think the downloaded version of Penrose puzzles does not currently contain enough info to run a solver on it, because the grid is different every time and this info is not present in the downloaded file.
But I can tell you how it works for other grids =).
The contents of the `tiles` key are an array of tile shapes. Possible directions where there are connections are binary-encoded. For example, on a square grid 1 means right, 2 means up, 4 means left and 8 means down. So a 3 means a corner piece connecting up and right, a 5 is a horizontal straight piece. A 0 means an empty tile; 15 in this case is a fully connected + tile. It's similar for other grids.
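Based on that description, a tiny decoding helper might look like this in Python (square grid only, using exactly the bit values given above; tile indexing and rotation state aren't handled here):

```python
# Bit values for the square grid, as described above:
# 1 = right, 2 = up, 4 = left, 8 = down.
DIRECTIONS = {1: "right", 2: "up", 4: "left", 8: "down"}

def connections(tile: int) -> list[str]:
    """Decode a square-grid tile value into the directions it connects."""
    return [name for bit, name in DIRECTIONS.items() if tile & bit]

print(connections(3))   # ['right', 'up']   -> corner piece
print(connections(5))   # ['right', 'left'] -> horizontal straight
print(connections(15))  # all four          -> fully connected "+" tile
print(connections(0))   # []                -> empty tile
```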
How the tiles are numbered depends on grid implementation. In a square 5x5 puzzle the indices would go like this:
It's so cool to receive a response from the creator of Hexapipes! Thank you for the reply!
That makes sense re: the Penrose grid being different and not having enough data in the numbers to make sense of yet.
Based on your help, I've decided to start with the square grid, and I've gotten as far as a simple program that can take the JSON and render a puzzle as a png image. Now to draw the rest of the owl...
Michael W. Lucas is such a good author, and I just wanted to give a shout-out: "TLS Mastery" opened my eyes to the world of TLS / SSL certs in an understandable way that used interactive examples on Linux.
I've only watched Tim Cook's introduction, but it sounds like his voice is being shifted (autotuned) to sound pitch-perfect. It's pretty uncanny valley and off-putting imho.
The whole presentation evokes the unsettling sensation of the uncanny valley for me. The presenters appear eerily humanoid in their demeanor, posture and gestures. The peculiar emphasis they use on certain words adds to this feeling too. As I said in another comment, their outfits also don't fit with what one would expect in a typical setting; they feel strangely "off" and out of place. Also... the disorienting special effects and the ambiguous nature of the campus visuals (real or simulated... or both? I can't tell) are just... a lot. I know Apple is going for a certain look with these presentations, but for me it's crossed a line. Kind of like watching one of those weird, overly produced Netflix shows, where the lighting/coloring/etc. is all wrong and fake-feeling.