The original TeX fonts stored their metrics in TFM (short for TeX font metrics) files, which contain small bytecode programs for calculating ligatures and kerning between characters. I learned about that when I tried reading the files myself.
From what I can tell, modern OpenType fonts just have tables to accomplish something similar now, in the form of the GSUB and GPOS tables?
Oh yes, thank you for the link to that! It looks like that is an instruction set for representing the font glyphs themselves? I was talking about the instruction set in TFM, which represents meta-information like ligatures and kerning between glyphs, not the actual glyphs. The glyphs for the original TeX fonts are described using Metafont, which is an interpreted language.
Glyphs are generally defined as pure outlines (the "glyf" table [1]), and the instruction set is an optional system for things like grid fitting. Ligatures, kerning, etc. are ordinary data tables.
My source was also the Lawful Masses stream, but it sounded like the original excuse was made in April, and it took until November for him to actually produce the death certificate, which revealed that he had lied about when his grandfather had died.
That point is made in the article: he could have simply said his grandfather died that week, and it would have been accepted. Instead he doubled down and went on the attack, most likely because it either was a lie when he said it or was not the actual reason he missed the proceedings.
2 days ago (12/27), "Web Hosting Service" was the most commonly visited page, with <0.1% mobile traffic (compared to >70% for the rest of the top 10). Did something happen that would cause this? Or is a bot farm visiting Wikipedia? Odd.
Edit: Looks like there were also around 3 million "automated" views of the page, based on user agent. Maybe some bot script went awry but used non-bot user agents half the time?
I think by "each system call" she meant it like "every time it calls read()", since it would be read() that was using the AVX registers. Since the example program just calls read() over and over, this could add a significant amount of overhead.
Tip: Wappalyzer (https://www.wappalyzer.com/) can automatically do this snooping for you. I use it all the time to answer the question “What is this site running on?”
This was such a wonderful read! I've been getting into Rust recently, and the sections on dealing with challenges that are specific to Rust were particularly useful. The way they created a new trait to turn `Fn(&str) -> Result<(&str, T), &str>` into `Parser<T>` was insightful, and the discussion of how they dealt with the growing sizes of types was something that I can imagine myself running into in the future.
Most importantly though, when they started writing `and_then`, my eyes lit up and I said "It's a Monad!" I think this is the first time I've really identified a Monad out in the wild, so I enjoyed that immensely.
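For anyone curious, here is a minimal sketch of the idea (my own simplified version, not the article's actual code): a parser is just any `Fn(&str) -> Result<(&str, T), &str>`, and `and_then` is exactly the monadic bind.

```rust
// On success a parser returns (remaining input, parsed value);
// on failure it returns the input where it got stuck.
type ParseResult<'a, T> = Result<(&'a str, T), &'a str>;

// Monadic bind for parsers: run `parser`, feed the parsed value into
// `f` to build the next parser, then run that on the remaining input.
fn and_then<'a, T, U, P, F, Q>(parser: P, f: F) -> impl Fn(&'a str) -> ParseResult<'a, U>
where
    P: Fn(&'a str) -> ParseResult<'a, T>,
    F: Fn(T) -> Q,
    Q: Fn(&'a str) -> ParseResult<'a, U>,
{
    move |input| {
        let (rest, value) = parser(input)?;
        f(value)(rest)
    }
}

// Consume any single character.
fn any_char<'a>(input: &'a str) -> ParseResult<'a, char> {
    let mut chars = input.chars();
    match chars.next() {
        Some(c) => Ok((chars.as_str(), c)),
        None => Err(input),
    }
}

// A parser built with and_then: read one char, then require the next
// char to be the same one (matches inputs like "aa").
fn doubled<'a>() -> impl Fn(&'a str) -> ParseResult<'a, char> {
    and_then(any_char, |c| {
        move |input: &'a str| {
            let mut chars = input.chars();
            match chars.next() {
                Some(c2) if c2 == c => Ok((chars.as_str(), c)),
                _ => Err(input),
            }
        }
    })
}

fn main() {
    assert_eq!(doubled()("aab"), Ok(("b", 'a')));
    assert!(doubled()("ab").is_err());
    println!("ok");
}
```

The second parser here depends on the *value* produced by the first, which is exactly what makes this bind rather than mere sequencing.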
I think you should be able to put those settings into your X11 configuration so you don't have to re-configure every time you boot, unless xinput on Ubuntu is different from Arch: https://wiki.archlinux.org/index.php/Libinput#Common_options (Although debugging xorg config files is not a terribly fun exercise :/)
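For reference, a typical snippet (adapted from that wiki page; the exact options depend on your device, and the filename is just a convention) would go in something like /etc/X11/xorg.conf.d/30-touchpad.conf:

```
Section "InputClass"
    Identifier "touchpad"
    Driver "libinput"
    MatchIsTouchpad "on"
    Option "Tapping" "on"
    Option "NaturalScrolling" "true"
EndSection
```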
Flow actually explicitly supports this[1]: you put the Flow type annotations inside /* */ comments. We use this in Aphrodite's code base[2] and it works pretty well.
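For anyone who hasn't seen it, the comment syntax looks like this (a toy example, not from Aphrodite): Flow reads the annotations inside the /* */ comments, while plain JavaScript engines ignore them, so no compile step is needed.

```javascript
// Flow type annotations hidden in comments: this file is valid
// plain JavaScript as-is, but Flow still type-checks it.
function add(a /*: number */, b /*: number */) /*: number */ {
  return a + b;
}

console.log(add(2, 3)); // 5
```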
Woah, awesome. I've used the http://dnstunnel.de/ tunneling scripts before, and they're a bit of a pain to set up (mostly because of the Perl library requirements).
Documentation for the TFM format here: https://tug.org/TUGboat/Articles/tb02-1/tb02fuchstfm.pdf (search for lig/kern)
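From my reading of that doc, each lig/kern program step is four bytes: skip_byte, next_char, op_byte, remainder. A rough decoder sketch of my interpretation (double-check it against the spec before relying on it):

```rust
// Decode one 4-byte lig/kern step from a TFM file into a description.
// My reading of the format: op_byte < 128 means "substitute ligature
// glyph `remainder`", otherwise it indexes a separate kern table.
fn decode_lig_kern(step: [u8; 4]) -> String {
    let [skip_byte, next_char, op_byte, remainder] = step;
    if op_byte < 128 {
        format!("if next char == {next_char}: ligature -> glyph {remainder} (skip {skip_byte})")
    } else {
        // Kern amounts live in a separate table of fixword values.
        let kern_index = 256 * (op_byte as u16 - 128) + remainder as u16;
        format!("if next char == {next_char}: kern by kern[{kern_index}] (skip {skip_byte})")
    }
}

fn main() {
    println!("{}", decode_lig_kern([0, b'i', 0, 10]));
    println!("{}", decode_lig_kern([0, b'V', 129, 3]));
}
```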