> People will pay untold thousands for a Mac, but God forbid when a PC manufacturer charges more than $599 for a laptop.
The article compares the FL12 to laptops in the same price range, including other Framework laptops, and notes that it falls short.
The FL12 has worse performance and battery life than an M1 Air, while costing more than an M4.
The point of the article is that the 12 should either be a lot less expensive or it should be a lot better. It's not whatever nonsense you're dreaming of.
The core philosophy of Framework is repairability and modularity. Yes, you are paying extra for those things, and so people who do not value them should probably not buy Framework. These comments are full of the old cliche of judging a fish in a tree climbing contest.
Repairability and modularity come with tradeoffs. Not everyone is going to value those tradeoffs and therefore shouldn't buy a laptop where those are the priority. But some people do value those things, and telling them to "get a MacBook" is just silly.
You can repair a Mac by handing it (and possibly your wallet) to Apple and letting them replace entire large subsystems to remedy the issue and pair the new parts. A few years back (pre-Apple Silicon) I got a new top case, keyboard, battery, and trackpad because the button in the trackpad had failed. Pretty good deal on a laptop that was nearly 3 years old, in fairness.
To repair (or upgrade) a Framework, you buy the part and install it. That's worth something to me!
Incidentally, I also have a last-gen ThinkPad P14s Gen 5 AMD and it's a flimsy POS. Already needed a new motherboard and battery and spent three weeks sitting at the service center while they rounded up the parts. Wish I'd bought another Framework 13.
It didn't necessarily need to be highly productive, just more productive than cultivating local crops. Even if the corn was not at its most productive, it might still have been worth it (and with no real replacement) as part of the companion garden: getting just a few ears of corn would still have been worth more than unproductive wooden stakes.
> AFAIK one factor that accounts for the amazing longevity of old paintings was the compatibility between their many layers. The canvas, the primer, the paint and the medium were all derived from the same plant: flax.
AFAIK that's largely untrue in the general case:
- Canvas painting only appeared in the 15th century and took a while to spread; paintings before the 17th century (16th in some locations) tend to be on wood panels, e.g. Raphael and da Vinci painted almost exclusively on board.
- Canvas is almost always coated with gesso which at the time would use rabbitskin glue as binder, no flax there.
- While flaxseed was the most common drying oil (in Europe), walnut, poppyseed, and safflower oils were also in use. And additives were usually mixed in to manage the viscosity of the paint, so even in the "best" case the paint would not be just pigments in linseed oil, unless it's a very early oil painting, which wouldn't have been on canvas.
- As for varnishes, not only was flax not the only drying oil for oil varnishes, varnishes could also be "spirit" varnishes (a resin dissolved in a solvent like alcohol or turpentine), and waxes were also sometimes used as or in varnishes.
True… what I said only applies to canvas paintings. I should have made that clear.
The longevity of murals is easy to account for: the paint is applied to wet plaster, in that way becoming part of the wall. That is why the murals of Pompeii survived.
> Canvas is almost always coated with gesso which at the time would use rabbitskin glue as binder, no flax there.
You are right about the rabbit skin glue, but wrong about the gesso. As I recall, traditional gesso is a mix of glue plus a white powder (chalk or gypsum) and is very brittle, generally unsuited to a flexible support such as canvas. A canvas painter would more likely use something flexible like a mix of pigment and rabbit skin glue, or pigment and egg protein, or pigment and oil.
I thank god for modern primers: with them I can prime a canvas in two days, whereas an oil-based primer could take months to dry.
As far as I can tell, the ingredients described by Cennini[1] are close to the "traditional" preparation:
* Gypsum (Hydrated calcium sulfate)
* Zinc white pigment
* Clean tap water or distilled water
* Rabbit skin glue
---
[1] Cennini, Cennino d'Andrea, The Craftsman's Handbook "Il Libro dell'Arte," Daniel V. Thompson, Jr., trans. (New York: Dover Publications, 1960), pp. 69–74.
As far as I understand, this only replaces the retouching phase. A human normally needs to do that in place in order to match color and pattern (and it's already hard enough that way). I don't know for sure, but I would assume the retouching paints can be removed without affecting the isolation layer if the conservator realises they went in the wrong direction earlier and can't easily recover.
That is a bit of a truism, as conservators don't have access to time magic and can only conserve what's left. The question is how far conservation extends.
To my layman's understanding, the current logic of conservation is:
1. Clean: dust, grime, soot, tar, and other environmental deposits don't belong on the painting and have little value.
2. Stabilise: this is mostly for paintings which are actively degrading (e.g. paint is lifting), and can also include repairs to the substrate, e.g. patching up tears in the canvas.
3. Fill in losses.
The last one is the one with some contention, and it's what TFA is about (and only a portion of it, to boot). The idea behind it is (again, to my understanding) that at the end of the day art is for the living to appreciate, and while damage can be of some historical interest, it generally detracts from enjoyment of the piece and is (usually) not part of the original vision.
However it is of course subject to (0): do not damage what remains. The retouching should be well separated from the original material in order to be identifiable and removable without risk of further damage, in case re-conservation is needed or better conservation methods can be used.
And then, if you allow for (3), the question of limits arises: is there a point, and if so where, at which it doesn't make artistic sense to try and replace losses even if somebody's paying for it, and it is better to accept the piece's new state of being as its normal?
I would assume the mask only covers the damaged areas. This means it replaces manual retouching, so you still need to clean the original painting and remove the varnish in order to get at the original colors, and do the usual cleanup (removal of old conservation, stabilisation, infill). The article specifically says that the film is adhered to, and via, a varnish layer.
Makes me wonder how it would handle heavy impasto tho.
Obviously it can't handle heavy impasto. With heavy impasto, people usually literally sculpt the surface with some material to match the rest of the painting and then paint on top of it. Even with minimal impasto, they will often try to mimic the texture by imprinting various tools into the material, mimicking for example the brushes used by the original author. This printing method can't do that at all, and the film might struggle when the painting is too uneven.
Otherwise this might be an interesting technique, if the result can match the color and texture of paint perfectly. I can see it being used for some low-priority paintings. There are many more paintings that need restoration than people with the necessary skills, so this could save some of them by making them more viable to fix. The infilling is usually just a small part of the entire process, but usually the most difficult wrt how skillful the conservator must be: you must be able to match colors and style perfectly, and there are huge differences in how fast the process goes depending on the skill of the painter.
glibc will return memory to the OS just fine, the problem is that its arena design is extremely prone to fragmentation, so you end up with a bunch of arenas which are almost but not quite empty and can't be released, but can’t really be used either.
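If you want to see those arenas for yourself, glibc can dump per-arena statistics. A quick way to trigger that from Python (assuming a glibc-based Linux and that CPython's large allocations go through glibc malloc) is:

```python
import ctypes

# malloc_stats() prints per-arena "system bytes" vs "in use bytes" to
# stderr, which makes the nearly-empty-but-unreleasable arenas visible.
libc = ctypes.CDLL("libc.so.6")
libc.malloc_stats()
```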
In my experience it delays it way too much, causing memory overuse and OOMs.
I have a Python program that allocates 100 GB for some work, free()s it, and then calls a subprocess that takes 100 GB as well. Because the memory use is serial, it should fit in 128 GB just fine. But it gets OOM-killed, because glibc does not turn the free() into an munmap() before the subprocess is launched, so it needs 200 GB total, with 100 GB sitting around pointlessly unused in the Python process.
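A scaled-down sketch of that pattern; churn and big_child_job are made-up stand-ins for the real allocation phase and subprocess, and the explicit malloc_trim(0) via ctypes is one workaround worth trying to force freed pages back to the kernel before the child starts:

```python
import ctypes
import subprocess

def churn(n_chunks=5_000, chunk_size=100_000):
    # Scaled-down stand-in for the real workload: many heap-sized
    # allocations (each below glibc's default mmap threshold), all
    # dropped before the next phase starts.
    blocks = [bytes(chunk_size) for _ in range(n_chunks)]
    del blocks  # everything is free()d, but RSS may stay high

churn()

# Workaround (assumes the allocations went through glibc malloc):
# explicitly ask glibc to hand trimmed heap pages back to the kernel.
libc = ctypes.CDLL("libc.so.6")
libc.malloc_trim(0)

# Hypothetical second memory-hungry phase; without the trim, both
# peaks can end up resident at the same time.
subprocess.run(["./big_child_job"], check=True)
```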
This means that if you use glibc, you have no idea how much memory your system will use and whether your applications will OOM-crash, even if they are carefully designed to avoid it.
I commented there 4 years ago that the glibc settings MALLOC_MMAP_THRESHOLD_ and MALLOC_TRIM_THRESHOLD_ should fix that, but I was wrong: MALLOC_TRIM_THRESHOLD_ is apparently bugged and has no effect in some situations.
So in jemalloc, the settings to control this behaviour seem to actually work, in contrast to glibc malloc.
(I'm happy to be proven wrong here, but so far no combination of settings seems to actually make glibc return memory as described in its docs.)
From this perspective, it is frightening to see the jemalloc repo being archived, because that was my way to make sure stuff doesn't OOM in production all the time.
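For reference, this is roughly how those jemalloc knobs get applied; the library path and big_job.py below are placeholders, and dirty_decay_ms/muzzy_decay_ms are the options that control how long unused pages are kept before being returned to the OS:

```python
import os
import subprocess

# Assumption: jemalloc is installed and preloaded into the child process;
# the .so path and "big_job.py" are placeholders. Setting both decay
# times to 0 asks jemalloc to hand unused pages back to the kernel
# immediately instead of keeping them around.
env = dict(
    os.environ,
    LD_PRELOAD="/usr/lib/x86_64-linux-gnu/libjemalloc.so.2",
    MALLOC_CONF="dirty_decay_ms:0,muzzy_decay_ms:0",
)
subprocess.run(["python3", "big_job.py"], env=env, check=True)
```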
Interesting that one of the factors listed there, the hardcoded page size on arm64, is still an unsolved issue upstream, and that it forces app developers to either ship multiple arm64 Linux binaries or drop support for some platforms.
I wonder if some kind of dynamic page-size (with dynamic ftrace-style binary patching for performance?) would have been that much slower.
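For context, querying the actual page size at runtime is trivial; the issue, as I understand it, is that jemalloc fixes the page size at build time instead of consulting something like this at startup:

```python
import os

# Ask the kernel for the actual page size at runtime. On arm64 Linux this
# can be 4 KiB, 16 KiB, or 64 KiB depending on how the kernel was built,
# which is why a page-size assumption baked in at compile time breaks
# portability across arm64 distros.
print(os.sysconf("SC_PAGE_SIZE"))
```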
Not sure about OP, but I have all manner of blues and jazz recordings unavailable via streaming. There are also lots of obscure Japanese game and rock recordings that aren't on Apple Music or Spotify, though to Spotify's credit they have a lot of game content. Streaming is mostly in service of licenses and margins, which, as a shareholder, makes sense to me.
There's also an intermediate category where an online/streaming version is available, but it's defective – a number of live Bob Dylan (!) albums from the Bootleg Series are missing the track transitions between the individual tracks. It feels like somebody forgot to include the pregaps when ripping the discs for digital distribution, and it's been like that for years…
Even super popular local rock bands from the 80s don't always have their entire catalog available on streaming services, and solo endeavors of their musicians are often nowhere to be seen there.
People seem to assume that any decent creative output always gets carried forward to the next form of media tech. But there are 78s that didn't make it to LP, much less anything after that.
A wide range, actually. It's more about the time period and artists than musical style. If it's earlier than the 90s and/or from an artist who wasn't big on the charts, it gets more likely that they're not available except on used CD.
In that sense, the depth and variety of good music that is available has been shrinking for a long while now. The advent of streaming seems to have made it worse.
By contrast, before I got rid of almost all my vinyl, one particular sub-collection that I had was about 200 12" singles from the London club scene in 1981-1985. Almost none of the tracks ever appeared on CD or were ever released digitally.
All of them were available on YouTube, even the white-label DJ-only releases!
This is the part that tends to have the most mistakes, if used. It's generally better to provide minimal info manually if the CD wasn't identified by its ID.
I can curate my own library of bookmarks within [some other body's music library] without CDs; of course I can.
I can do that with iTunes or Spotify or Tidal or Amazon Music or whatever else.
But none of these bookmarks are necessarily related to my music. They are only just bookmarks that refer to music that might exist within the libraries that these bodies provide.
And while all of these libraries are certainly quite vast, there's a fuckton of (published!) music that these commercial libraries do not provide.