During this time I got really excited about succinct data structures and wrote a Rust library implementing many bitvector types and a wavelet matrix. [2]
My interest came from a data visualization perspective -- I was curious if space-efficient data structures could fundamentally improve the interactive exploration of large datasets on the client side. Happy to chat about that if anyone's curious.
[1] Paper: https://archive.yuri.is/pdfing/weighted_range_quantile_queri... though it's pretty hard to understand without some background context. I've been meaning to write a blog post explaining the core contribution, which is a simple tweak to one of Navarro's textbook data structures.
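The bitvector types mentioned above are built around rank/select queries. As a rough illustration only (these function names are my own, not the library's API), here is the naive version of those queries; a real succinct structure answers both in O(1) time using small auxiliary tables on top of the raw bits:

```rust
// Naive rank/select on a plain bitvector, for illustration only.
// Succinct implementations answer these in O(1) with n + o(n) bits.

/// rank1(i): number of 1-bits strictly before position i.
fn rank1(bits: &[bool], i: usize) -> usize {
    bits[..i].iter().filter(|&&b| b).count()
}

/// select1(k): position of the k-th 1-bit (0-indexed), if present.
fn select1(bits: &[bool], k: usize) -> Option<usize> {
    bits.iter()
        .enumerate()
        .filter(|&(_, &b)| b)
        .map(|(pos, _)| pos)
        .nth(k)
}

fn main() {
    let bits = [true, false, true, true, false, true];
    // ones before position 4 sit at indices 0, 2, 3
    assert_eq!(rank1(&bits, 4), 3);
    // the 2nd one-bit (0-indexed) is at index 3
    assert_eq!(select1(&bits, 2), Some(3));
    println!("ok");
}
```

A wavelet matrix generalizes these two queries from bits to arbitrary symbols, which is what makes range quantile queries possible.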
Many dictionaries now list one common use of “literally” as meaning “figuratively, with emphasis”. So literally officially sometimes now literally means figuratively.
I suspect some people are literally having conniption fits about this…
I’m sorry, but your comment mixes two different types of dictionaries. You talk about “official” meanings which would be a prescriptive dictionary telling you the way you are allowed to use a word. But the dictionaries that include “figuratively” in their definitions are clearly descriptive, presenting all the ways words are commonly used.
You can’t take a descriptive dictionary and then claim it is prescriptive.
There are no prescriptive dictionaries, at least not correct ones, for living languages.
IIRC both the OED and CED list figurative uses for the word; do you know of any publications considered more authoritative than those for English? Webster too, for those who prefer simplified English.
They have the Académie Française, which intends to control the language to an extent, in recent times focussing a lot on resisting the encroachment of English words and phrases, but IIRC their recommendations don't carry as much weight as many think and are often ignored even by government departments and other official French bodies.
The Académie does publish a dictionary every few decades though; there was a new edition recently. So there is a prescriptive dictionary for French, even though it carries little weight in reality.
French is the only living language to attempt it to this extent, though the existence of one is enough to make my “there are none for living languages” point incorrect. It is difficult to pin a language down until no one really speaks it day-to-day (so it doesn't evolve at the rates commonly used languages do).
Very few of those have official force or cover much more than a subset of language properties (e.g. spelling rules), but definitely more than the "none" of my original assertion.
Isn't this more of a cultural thing, that Germans seem to agree that it is authoritative and use it as a reference?
I'm not sure what would even make a dictionary prescriptive other than an explicit declaration that it is so or, ridiculously, a law declaring the same.
I'm sorry, can you point to such a prescriptive dictionary? People can talk however they please, and dictionaries are tasked with keeping up with the vernacular.
The "literally" ship sailed centuries ago. Sorry, but that battle has been lost. Even so-called "prescriptive" dictionaries would be categorically incorrect if they ignore nearly three centuries of common vernacular.
It never has, it always will. We've already lost a host of words that meant "I'm not exaggerating, I actually mean it": "really", "very", etc. I'm going to keep up the fight.
Language is defined by its speakers, as basically a "vote". I'm going to keep voting for "literally" meaning "this actually happened" as long as it's practical, because 1) there are dozens of other ways to emphasize something, and 2) we need some way to say "this is not an exaggeration".
Why a quantum leap isn’t the length of an Ångstrom will always sadden me. I’m sure there are other scientific concepts you can use to describe a Great Leap Forward…
Being just a single file with a simple build command it seemed like a minimal advancement on the state of the art, so I quickly decided to publish, and did not appropriately credit the original as I should have. I hope you can accept my apology – this was an honest mistake.
No problem Yuri, thanks for the credit and the apology! I do recommend writing articles based on doing the thing from scratch yourself though as you can write something more nuanced and interesting that way.
> ... another nice WASM-based browser SQLite user interface.
Thank you for pointing that one out. Every conceptually similar project is a great source of ideas. SQLite's fiddle app is literally less than 2 weeks old, so it still has lots of room left for feature creep ;).
That's really interesting! Could you say more about the job-like aspects of having users? Do you have advice on infrastructure that could be built to make the job easier?
I'm currently working on my first-ever side project with user accounts, and now I'm wondering what I'm in for. :-)
For me it's about the moral responsibility. If people are trusting your site with their data, you have an obligation to keep it running, and to keep it secure. This is a big responsibility! Especially since, over the long term, the vast majority of projects eventually cease to exist.
In addition to user PII responsibilities, you may also be responsible for user-generated content, depending on where you live. Both real users and bots will inevitably submit nefarious material to your servers.
The easy solution there is simply to ban European users, if it's just a hobby project and you're concerned about that. Probably not the solution that GDPR would prefer.
That's not as easy a solution as it appears - the GDPR isn't the only piece of personal data legislation in the world. If your strategy is to keep track of all the places that put responsibilities on you for collecting personal data and reject users from those locations, then you need to be looking at every state in the USA (Californian citizens have a constitutional right to privacy), and many countries across the world have various data protection laws.
Thanks! I've used it when the data is itself binned frequency data, which is one way to reduce data volume (e.g. binning 2 weeks of minutely time series data into (hour, percentile) buckets and plotting them as weighted points).
When the number of bins in the data is not an exact integer multiple of the number of bins in the histogram, adjacent data bins can get mapped to the same histogram bin, resulting in e.g. 2x the data volume in some rows/columns of the histogram.
When a small loss of fidelity is acceptable, the two solutions I've used are: (1) render the histogram at an exact factor of the number of data bins and then set the canvas dimensions to the desired size, relying on the browser to downsample the resulting image; or (2) use `max` as the reduceOp and render the histogram directly at the intended size.
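To make the aliasing concrete, here is a minimal sketch (the function name and the floor-based mapping are my own assumptions, not any particular renderer's code): mapping 5 data bins onto 3 histogram bins with `floor(i * M / N)` gives some histogram bins double the volume, while 6 onto 3 divides evenly.

```rust
// Sketch of the binning artifact described above: mapping N data bins
// onto M histogram bins via floor(i * M / N) doubles up some histogram
// bins whenever N is not an exact integer multiple of M.

fn histogram_counts(n_data_bins: usize, n_hist_bins: usize) -> Vec<usize> {
    let mut counts = vec![0usize; n_hist_bins];
    for i in 0..n_data_bins {
        // each data bin contributes one unit of "data volume"
        counts[i * n_hist_bins / n_data_bins] += 1;
    }
    counts
}

fn main() {
    // 6 data bins onto 3 histogram bins: even, every bin receives 2
    assert_eq!(histogram_counts(6, 3), vec![2, 2, 2]);
    // 5 data bins onto 3: uneven, so some bins receive 2x the volume
    assert_eq!(histogram_counts(5, 3), vec![2, 2, 1]);
    println!("ok");
}
```

Using `max` as the reduceOp sidesteps this because a doubled-up bin reports the larger of its two inputs instead of their sum.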