so maybe that's why OP didn't realize that it had already been posted recently. With the older scheme, not only was SEO bad, but it was also really hard to remember which date corresponded to which blog post, and people could brute-force search for my hidden (unpublished) blog posts easily.
The gain is fixed. I think the column variation arises from unstable oscillator frequency and maybe some electrical bug/crosstalk between pixels. Not sure exactly.
In many cases today “gif” is a misnomer anyway and mp4 is a better choice. Not always, though: not everything supports actual video.
But one case I see often: If you’re making a website with an animated gif that’s actually a .gif file, try it as an mp4 - smaller, smoother, proper colors, can still autoplay fine.
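For reference, a minimal conversion command might look something like this (filenames and quality settings are just placeholders, not a recommendation; `yuv420p` plus even dimensions keep most players happy, and `faststart` helps web playback start sooner):

```
ffmpeg -i input.gif \
  -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" \
  -c:v libx264 -crf 23 -pix_fmt yuv420p \
  -movflags +faststart \
  output.mp4
```

Loop behavior then has to be handled on the `<video>` tag side (`autoplay loop muted playsinline` is the usual combination for gif-like behavior).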
I've been thinking of integrating pngquant as an ffmpeg filter; it would make it possible to generate even better palettes. That would get ffmpeg on par with gifski.
Does ffmpeg's gif processing support palette-per-frame yet? Last time I compared them (years ago, maybe not long after that blog post), this was a key benefit of gifski, allowing it to get better results for the same filesize in many cases (not all - particularly small images, where the total size of the palette information can be significant).
I use `split[s0][s1];[s0]palettegen=max_colors=64[p];[s1][p]paletteuse=dither=bayer` personally, limiting the number of colors is a great way to transparently (to a certain point, try with different values) improve compression, as is bayer (ordered) dithering which is almost mandatory to not explode output filesizes.
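For anyone who wants the full command that filter chain slots into, it's roughly this (the fps, scale and filenames here are just illustrative values):

```
ffmpeg -i input.mp4 \
  -vf "fps=15,scale=480:-1:flags=lanczos,split[s0][s1];[s0]palettegen=max_colors=64[p];[s1][p]paletteuse=dither=bayer" \
  output.gif
```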
* x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance and power efficiency
* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's to hoping that they "turn over a new leaf" with the X2.
Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great, as there were some weird regressions, and newer chips have caught up anyway [1].
On the build quality side, basically all the PCs are still lagging behind Apple, e.g. yesterday's rant post about the Framework laptop [2] touched on a lot of important points.
Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.
I have no faith in Qualcomm to even make basic gestures towards the Linux community.
All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon Thinkpad might work, but none of the others really do.
A $400 Arm laptop with good Linux support would be great, but it's never ever going to happen.
Fact is, Linux support has accelerated heavily, both from Qualcomm and from Linaro on their behalf. Anyone who watches the Linux ARM mailing lists can attest to that.
Hardware has already been out for a year. Outside a custom spin by the Ubuntu folks, even last year's notebooks aren't well supported out of the box on Linux. I have a Yoga Slim 7x and I tried the Ubuntu spin out at some point - it required me to first extract the firmware from the Windows partition because Qualcomm had not upstreamed it into linux-firmware. Hard to take Qualcomm seriously when the situation is like this.
Qualcomm _does_ upstream all their firmware, but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load. This is an actual security feature, believe it or not. Besides, chances are it wasn't even Qualcomm's firmware, but rather Cirrus for sound or display firmware, etc.
I get the hate on Qualcomm, but you're really one LLM question away from understanding why they do this. I should know, I was also getting frustrated before I read up on this.
I get where you're coming from, but I think the job of a company pushing a platform is to make it "boring", i.e. it should work out of the box on Debian/Fedora/Arch/Ubuntu. The platform vendor (Qualcomm) is the only one with enough sway to push the different laptop manufacturers to do the right thing. This is the reason why both Intel and Windows push compliance suites which have a long list of requirements before anyone can put the Windows / Intel logo on their device. If Qualcomm is going to let Acer / Lenovo decide whether things work out of the box on Linux, then it's never going to happen.
Can you please let me know if there is an ISO to get any mainstream Linux distro working on this Snapdragon laptop?
ASUS - Vivobook 14 14" FHD+ Laptop - Copilot+ PC - Snapdragon X
It's on sale for $350 at Best Buy, and if I can get Linux working on it, it would definitely be an awesome gift for myself.
Even if there's some progress being made, it's still nearly impossible to install a typical Linux distro on one of these. I've been watching this space since the snapdragon laptops were announced. Tuxedo giving up and canceling their Snapdragon Linux laptop doesn't instill much confidence
That covers the Elite, not the cheaper Snapdragon X laptops such as the ASUS Vivobook 14 (X1407QA).
I've followed that thread for almost a year. It's a maze of hardware issues and poor compatibility.
From your other response:
>but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load.
This makes the install process impossible without an existing Windows install. It's easier to say it doesn't work and move on.
It's going to be significantly easier to buy and run Linux on an x86 laptop.
Not to mention that no out-of-the-box Linux Snapdragon Elite laptop exists. It's a shame, because it would probably be an amazing product.
This sounds a lot like how AMD's approach on Linux had supposedly changed, and yet everyone I know who wants to use their GPU fully used Nvidia. For a decade or more I've heard how AMD has turned over a new leaf and their drivers are so much better. Even geohot was going to undercut Nvidia by just selling tinygrad boxes on AMD.
Then it turned out this was the usual. Nothing had changed. It was just that people online have this desire to express that “the underdog” is actually better. Not clear why because it’s never true.
AMD is still hot garbage on Linux. Geohot primarily sells “green boxes”. And the MI300x didn’t replace H100s en masse.
Maybe it's just that you're mostly viewing this through the LLM lens?
I remember having to fight with fglrx, AMD's proprietary Linux driver, for hours on end. Just to get hardware acceleration for my desktop going! That driver was so unbearable I bought Nvidia just because I wanted their proprietary driver. Cut the fiddling time from many hours to maybe 1 or 2!
Nowadays, I run AMD because their open-source amdgpu driver means I just plonk the card into the system, and that's it. I've had to fiddle with the driver exactly zero times. The last time I used Nvidia is the distant past for me.
So - for me, their drivers are indeed "so much better".
But my usecase is sysadmin work and occasional gaming through Steam / Proton.
I ran LMStudio through ROCm, too, a few times. Worked fine, but I guess that's very much not representative for whatever people do with MI300 / H100.
I've been playing lots of games on an AMD GPU (RX 7600) for about a year, and I can't remember a game that had graphical issues (e.g. driver bugs).
Probably something hasn't run at some point but I can't remember what, more likely to be a Proton "issue". Your main problem will be some configuration of anti-cheat for some games.
My experience has been basically fantastic and no stress. Just check that games aren't installing some Linux build, which is inevitably extremely out of date and probably won't run. Ex: human fall flat (very old, won't run), deus ex mankind divided (can't recall why, but I elected to install the proton version; I think performance was poor or mouse control was funky).
I guess I don't play super-new games so YMMV there. Quick stuff I can recall, NMS, Dark Souls 1&2&3, Sekiro, Deep Rock Galactic, Halo MCC, Snow runner & Expeditions, Eurotruck, RDR1 (afaik 2 runs fine, just not got it yet), hard space ship breaker, vrising, Tombraider remaster (the first one and the new one), pacific drive, factorio, blue prince, ball x pit, dishonored uhhh - basically any kind of "small game" you could think of: exapunks, balatro, slay the spire, gwent rougemage, whatever. I know there were a bunch more I have forgotten that I played this year.
I actually can't think of a game that didn't work... Oh this is on Arch Linux, I imagine Debian etc would have issues with older Mesa, etc.
Works very well for me! YMMV maybe depending on the titles you play, but that would probably be more of a Proton issue than an AMD issue, I'd guess.
I'm not a huge gamer, so take my experience with a grain of salt. But I've racked up almost 300 hours of Witcher3 with the HQ patch on a 4k TV display using my self-compiled Gentoo kernel, and it worked totally fine. A few other games, too. So there's that!
Don’t know what LLM lens is. I had an ATI card. Miserable. Fglrx awful. I’ve tried various AMDs over the last 15 years. All total garbage compared to nvidia. Throughout this period was consistently informed of new OSS drivers blah blah. Linus says “fuck nvidia”. AMD still rubbish.
Finally, now I have 6x4090 on one machine. Just works. 1x5090 on other. Just works. And everyone I know prefers N to A. Drivers proprietary. Result great. GPU responds well.
Well, I don't know why it didn't work out for you. But my AMD experience has improved fundamentally since the fglrx days, to the point where I prefer AMD over Nvidia.
You said you don't know why people say that AMD has improved so much, but it definitely rings true for me.
I said "LLM lens" because you were talking about hardware typically used for number crunching, not graphics displays, like the MI300. So I was suggesting that the difference between what you hear online about the driver and your own experience might result from people like me mostly talking about the 2d / 3d acceleration side of things while the experience for ROCm and stuff is probably another story altogether.
I see. I see. I got tripped up by 'LLM' since I got the GPUs for diffusion models. Anyway, the whole thing sounds like the old days when I had Ubuntu Dapper Drake running flawlessly on my laptop and everyone was telling me Linux wasn't ready: it's an artifact of the hardware and some people have great support and others don't. Glad you do.
Google has previously delivered good Linux support on Arm Chromebooks and is expected to launch unified Android+ChromeOS on Qualcomm X2 Arm devices in 2026.
My personal beef with Thinkpads is the screen. Most of the thinkpads I’ve encountered in my life (usually pretty expensive corporate ones) had shitty FHD screens. I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
FWIW if you buy new from Lenovo, getting a more high-res display has been an option for years.
I'm on the other side where I've been buying Thinkpads partly because of the display. Thinkpads have for a long time been one of the few laptop options on the market where you could get a decent matte non-glare display. I value that, battery life and performance above moar pixels. Sure I want just one step above FHD so I can remote 1080p VMs and view vids in less than fullscreen at native resolution but 4K on a 14" is absolute overkill.
I think most legit motivations for wanting very high-res screens (e.g. photo and video editing, publishing, graphics design) also come with wanting or needing better quality and colors etc too, which makes very-highly-scaled mid-range monitors a pretty niche market.
> I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
Did you make a serious effort while having an extended break from retina screens? I'd think you would get used to it pretty quickly if you allow yourself to readjust. Many people do multi-DPI setups without issues - a 720p and a 4k side-by-side for example. It just takes acclimatizing.
I have a 14” FHD panel (158 dpi) on an old (7 year) laptop and there are more issues with low-resolution icons and padding than with font rendering. I wouldn't mind more, but it's not blurry.
I just learned on Reddit the other day that people replace those screens with third party panels, bought from AliExpress for peanuts. They use panelook.com to find a compatible one.
Old Thinkpads are great! I used to have a Lenovo Thinkpad X1 Carbon Gen 6 with Intel Core i7 8640U, 16 GB of RAM, and 1 TB SSD. I installed Arch Linux on it with Sway.
> x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance
Nodding along with the rest but isn't this backwards? Are M series actually outperforming an Intel i9 P-core or Ryzen 9X in raw single-threaded performance?
Not in raw performance, no, but they're only beat out by i9s and the like, which are very power hungry. If you care even a little bit about performance per watt, the M series are far superior.
Have a look at Geekbench's results.[1] Ignore the top ones, since they're invalid and almost certainly cheated (click to check). The iPads and such lower down are all legit, but the same goes for some of the i9s inbetween.
And honestly, the fact that you have to go up to power hungry desktop processors to even find something to compete with the chip that goes in an (admittedly high-end) iPad, is somewhat embarrassing on its face, and not for Apple.
Yes, the M4 is still outperforming the desktop 9950X in single-threaded performance on several benchmarks like Geekbench and Cinebench 2024 [1]. Compared to the 9955HX, which is the same physical chip as the 9950X but lower clocked for mobile, the difference is slightly larger. But the 16 core 9950X is obviously much better than the base M4 (and even the 16 core M4 Max, which has only 12 P cores and 4 E cores) at multithreaded applications.
However, the M2 in the blog post is from 2022 and isn't quite as blazingly fast in single thread performance.
The closest laptop to MacBook quality is surprisingly the Microsoft Surface Laptop.
As for x86, Zen 6 will be AMD's first major architecture rework since Apple demonstrated what is possible with wide decode. (Well, more accurately, it should be "since the world took notice", because it happened long before the M1.) It still likely won't be close to the M5 or even the M4 in single-threaded performance per watt, but hopefully it will be close.
Honor, strangely enough, doesn't make any effort to really support Linux.
The machine quality is pretty damn good, but Huawei machines are still better - Apple level of quality. And Huawei releases their machines with Linux preinstalled.
The company to watch is Wiko. It's their French spin-off to sidestep their chip ban. They might put out some very nice laptops, but it's a bit TBD.
> Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
Ohh, thanks for that link; I was thinking about updating to the latest on my asusbook s15, but I think I'll stick with the current Ubuntu concept for now... saved me some trouble!
Dealing with Honor support is a pain. They don't understand absolutely anything, and it's impossible to get them out of their script if you have a problem.
I have an Honor 200 Pro, and the software is buggy and constantly replaces user configurations with its defaults every 3 or 4 days.
I would avoid anything Honor in the future at any cost.
> On the build quality side, basically all the PCs are still lagging behind Apple,
This is an oft-repeated meme, but not really true. Thinkpads, high-end lightweight gaming laptops like the Asus G14... There are many x86 laptops with excellent build quality.
I have no insight into the Asahi project, but the LKML link goes to an email from James Calligeros containing code written by Hector Martin and Sven Peter. The code may have been written a long time ago.
Apple does tons of optimizations for every component to improve battery life.
Asahi Linux, which is reverse engineered, doesn't have the resources to figure out each of those tricks, especially for undocumented proprietary hardware, so it's a "death by a thousand cuts" as each of the various components is always drawing a couple of milliwatts more than on macOS.
They do have a loophole; they import them as kits and “build” them at a Magna facility in Arizona (similar to how early Sprinter vans were re-assembled in the US and sold as Freightliners). But, they are FMVSS compliant (besides steering wheel) and have had several NHTSA organized recalls like any other compliant car might.
Tariffs are easy, just pay them. Federal Motor Vehicle Safety Standards are harder... But maybe there's a loophole for commercial transport? or maybe they paid to have the testing done?
Which isn’t even really that prohibitive because Chinese vehicles beat Western pricing by five figures.
Plus, all the sensor equipment is made in China anyway. There’s almost certainly no way to have it manufactured in the US.
On top of that, fleet sales don’t have to deal with the antiquated dealer network laws in the US.
And of course American market car manufacturers refuse to produce vehicles that are like this one: space efficient and reasonably sized, instead opting for gigantic bean shaped SUVs with sloping rear roofs that rob you of cargo space while taking up maximum curb real estate.
Ford E-Transit is an electric van for a lot of money. But it looks like Ford wants to stop making them, and 2 seat models look much easier to find. But you'd be able to fit your board no problem.
Not sure if it's sold in the US (assuming you are from there), but the Kia PV5 is probably your best bet. On top of that it's very reasonably priced (in contrast to the ID buzz)
Your Transamerica Pyramid picture is incredible, among the really cool pictures you have there. Quite cool to photograph for Wikipedia like this; the world needs more people like you!
The fact that so many people use FFmpeg and QEMU suggests that he is quite good at documenting, collaborating, and at least making his code remarkably clean and easy to follow. This already puts him way ahead of the average Silicon Valley senior software engineer that I've worked with. However, he does value independence, so I don't think he would have been happy working at a FAANG-type company for long.
>Fabrice won the International Obfuscated C Code Contest three times and you need a certain mindset to create code like that—which creeps into your other work. So although his implementation of FFmpeg was fast-working, it was not very nice to debug or refactor, especially if you’re not Fabrice
The fact that SVG files can contain scripts was a bit of a mistake. On one hand, the animations and entire interactive demos and even games in a single SVG are cool. But on the other hand, it opens up a serious can of worms of security vulnerabilities. As a result, SVG files are often banned from various image upload tools, they do not unfurl previews, and so on. If you upload an SVG to discord, it just shows the raw code; and don't even think about sharing an SVG image via Facebook Messenger, Wechat, Google Hangouts, or whatever. In 2025, raster formats remain way more accessible and easily shared than SVGs.
This is very sad because SVGs often have way smaller file size, and obviously look much better at various scales. If only there was a widely used vector format that does not have any script support and can be easily shared.
All SVGs should be properly sanitized going into a backend, coming out of it, and when rendered on a page.
Do you allow SVGs to be uploaded anywhere on your site? This is a PSA that you're probably at risk unless you can find the few hundred lines of code doing the sanitization.
Note to Ruby on Rails developers, your active storage uploaded SVGs are not sanitized by default.
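To illustrate why the naive approach needs those "few hundred lines": stripping `<script>` alone isn't enough, since event-handler attributes and `javascript:` URLs also execute. Here's a rough denylist sketch in Python/lxml, deliberately incomplete (the function name and tag/attribute choices are mine, not from any library); an allowlist-based sanitizer or a maintained library is the safer route:

```python
# Naive denylist sanitizer sketch - NOT complete; shown only to illustrate
# how much more than <script> removal is involved.
from lxml import etree

DANGEROUS_TAGS = {"script", "foreignObject"}

def naive_svg_strip(svg_bytes: bytes) -> bytes:
    root = etree.fromstring(svg_bytes)
    for el in list(root.iter()):
        if not isinstance(el.tag, str):  # skip comments / processing instructions
            continue
        if etree.QName(el).localname in DANGEROUS_TAGS:
            parent = el.getparent()
            if parent is not None:
                parent.remove(el)
            continue
        for attr in list(el.attrib):
            name = attr.split("}")[-1].lower()           # drop namespace prefix
            value = el.attrib[attr].strip().lower()
            # event handlers (onload, onclick, ...) and javascript: URLs
            if name.startswith("on") or value.startswith("javascript:"):
                del el.attrib[attr]
    return etree.tostring(root)
```

Even this misses things like `href` references to other SVGs or `url(...)` values in styles, which is exactly the kind of gap an allowlist approach handles more robustly.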
It would be better if they were sanitized by design and could not contain scripts and CSS. For interactive pictures, one could simply use HTML with inline SVG and scripts.
Notably, the sanitization option is risky because one sanitizer's definition of "safe" might not actually be "safe" for all clients and usages.
Plus as soon as you start sanitizing data entered by users, you risk accidentally sanitizing out legitimate customer data (Say you are making a DropBox-like fileshare and a customer's workflow relies on embedding scripts in an SVG file to e.g. make interactive self-contained graphics. Maybe not a great idea, but that is for the customer to decide, and a sanitization script would lose user data. Consider for example that GitHub does not sanitize JavaScript out of HTML files in git repositories.)
At least with external entities you could deny the parser an internet connection and force it to only load external documents from a cache you prepopulated and vetted. Turing completeness is a bullshit idea in document formats.
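On the XML parser side, that hardening looks roughly like this with Python's lxml (a minimal sketch; option names vary between libraries, and the prepopulated-cache idea above would additionally need a custom entity resolver):

```python
# Sketch: parse an untrusted SVG/XML file with external loading disabled.
from lxml import etree

parser = etree.XMLParser(
    resolve_entities=False,  # don't expand entities
    no_network=True,         # never fetch DTDs/entities over the network
    load_dtd=False,          # don't load external DTDs at all
)
tree = etree.parse("untrusted.svg", parser)
```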
Postscript is pretty neat IMHO and it’s Turing complete. I really appreciated my raytraced page finally coming out of that poor HP laser after an hour or so.
With SVGs you can serve them from a different domain. IIUC the issue from TFA was that the SVGs were served from the primary domain; had they been on a different domain, they would have not been allowed to do as much.
IIUC, an untrusted inline SVG is bad. An image tag pointing to an SVG is not.
<img src="untrusted.svg"> <!-- this is ok -->
<svg from untrusted src> <!-- this is not ok -->
I feel like this is common knowledge. Just like you don't inject untrusted HTML into your page. Untrusted HTML also has scripts. You either sanitize it. OR you just don't allow it in the first place. SVG is, at this point, effectively more HTML tags.
Also remember that if the untrusted SVG file is served from the same origin and is missing a `Content-Disposition: attachment` header (or a CSP that disables scripts), an attacker could upload a malicious SVG and send the SVG URL to an unsuspecting user with pretty bad consequences.
That SVG can then do things like history.replaceState() and include <foreignObject> with HTML to change the URL shown to the user away from the SVG source and show any web UI it would like.
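Concretely, the response headers for a user-uploaded SVG might look something like this (illustrative values): the `attachment` disposition stops the browser from rendering it in-page when linked directly, and the CSP blocks script execution if it does get rendered.

```
Content-Type: image/svg+xml
Content-Disposition: attachment; filename="upload.svg"
Content-Security-Policy: script-src 'none'; sandbox
```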
Because displaying user-submitted images is pretty common and doesn't feel like a security footgun, but displaying user-submitted HTML is less common (and will raise more careful security scrutiny).
Would it be possible for messenger apps to simply ignore <script> tags (and accept that this will break a small fraction of SVGs)? Or is that not a sufficient defense?
I looked into it for work at some point as we wanted to support SVG uploads. Stripping <script> is not enough to have an inert file. Scripts can also be attached as attributes. If you want to prevent external resources it gets more complex.
The only reliable solution would be an allowlist of safe elements and attributes, but it would quickly cause compat issues unless you spend time curating the rules. I did not find an existing lib doing it at the time, and it was too much effort to maintain it ourselves.
The solution I ended up implementing was having a sandboxed Chromium instance and communicating with it through the dev tools protocol to load the SVG and rasterize it. This allowed uploading SVG files, but they were then served as rasterized PNGs to other users.
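If you only need the rasterization and not the full DevTools setup, something like the following probably gets most of the way there (flags from memory, so double-check them; the DevTools route still gives you better control over sizing, timeouts, and sandboxing, and the untrusted file should still be opened in an isolated environment):

```
chromium --headless --disable-gpu --window-size=512,512 \
  --screenshot=/tmp/out.png file:///uploads/untrusted.svg
```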
It's definitely a possible solution if you control how the files are displayed. In my case I preferred the files to be safe regardless of the mechanism used to view them (less risk of misconfiguration).
No, svgs can do `onload` and `onerror` and also reference other svgs that can themselves contain those things (base64'd or behind a URI).
But you can use an `img` tag (`<img src="evil.svg">`) and that'll basically Just Work, or use a CSP. I wouldn't rely on sanitizing, but I'd still sanitize.
> But you can use an `img` tag (`<img src="evil.svg">`) and that'll basically Just Work
That doesn't help too much if evil.svg is hosted on the same domain (with default "Content-Type: image/svg+xml" header), because attacker can send a direct link to the file.
IMO, the bigger problem with SVGs as an image format is that different software often renders them (very) differently! It's a class of problem that raster image formats basically don't have.
I would have expected SVGs to be like PDFs and render the same across devices. Is the issue that some renderers don’t implement the full spec, or that some implement parts incorrectly?
They are reasonably consistent because there is a de-facto reference implementation (Adobe Acrobat); if your implementation does not match it exactly, users will think your implementation is broken.
You definitely don't understand PDFs, let alone SVGs.
PDFs can also contain scripts. Many applications have had issues rendering PDFs.
Don't get me wrong, the folks creating the SVG standard should've used their heads. This is like the 5th time (that I am aware of) this type of issue has happened (and at least 3 of them were Adobe). Allowing executable code in an image/page format shouldn't be a thing.
SVG can for example contain text elements rendered with a font. If the font is not available it will render in a different one. The issue can be avoided by turning text elements into paths, but not all SVGs do that.
More like HTML and getting different browsers to render a pixel-perfect identical result (which they don't), including text layout and shaping. Where "different browsers" doesn't mean just Chrome, Firefox, and Safari, but also IE6 and CLI-based browsers like Lynx.
PDFs at least usually embed the used subset of fonts and contain explicit placement of each glyph. Which is also why editing or parsing text in PDFs is problematic. Although PDF also has many variations of the standard and countless Adobe-exclusive extensions.
Even when you have exactly the same font, text shaping is tricky. And with SVG's lack of ability to embed fonts, files which unintentionally reference a system font or a generic font aren't uncommon. And when you don't have the same font, it's very likely that any carefully placed text on top of a diagram will be more or less misplaced, wrap badly, or even completely disappear due to lack of space, because there is zero consistency between the metrics of different fonts.
The situation with the specification is also not great. SVG 1.1 alone defines certain official subsets, but in practice much software picks whatever is most convenient for it.
The SVG 2.0 specification has been in limbo for years, although it seems the relevant working group has recently resumed discussions. Browser vendors are pushing towards synchronizing certain aspects of it with HTML-adjacent standards, which would make fully supporting it outside browsers even more problematic. It's not just polishing little details; many major parts that were in earlier drafts are getting removed, reworked, or put on the backlog.
There are features, like CSS, JavaScript, and external resource access across different security contexts, which are impractical to implement (or which you don't want to implement) outside major web browsers that have a proper sandboxing system (and even that's not enough once uploads get involved).
There are multiple different parties involved, with different priorities and different thresholds for what features are sane to include:
- SVG as a scalable image format for icons and other UI elements in (non-browser-based) GUI frameworks -> anything more complicated than colored shapes/strokes can be problematic
- SVG as a document format for desktop vector graphics editors (mostly Inkscape) -> users expect feature parity with other software like Adobe Illustrator or Affinity Designer
- SVG in browsers -> they get certain SVG features for free by treating it like a weird variation of HTML, because they already have CSS and JavaScript functionality
- SVG as a 2D vector format for CAD and CNC use cases (including vinyl cutters, laser cutters, engravers ...) -> rarely supports anything beyond basic shapes and paths
Besides the obviously problematic features like CSS, JavaScript, and animations, stuff like raster filter effects, clipping, text rendering, and certain resource references are also inconsistently supported.
With Inkscape, unless you explicitly export as plain SVG 1.1-compatible SVG, you will likely get an SVG with some cherry-picked SVG 2 features and a bunch of Inkscape-specific annotations. It tries to implement any extra features in a standard-compatible way, so that in theory, if you ignore all the Inkscape-namespaced properties, you would lose some editing functionality but still get the same rendered result. In practice, some SVG renderers can't even do that, and the SVG 2 specification not being finalized doesn't help. And if you export as plain 1.1 SVG, some features either lack good backwards-compatibility converters or are implemented as JavaScript, making the files incompatible with anything except browsers, including Inkscape itself.
Just recently, GNOME announced work on a new SVG renderer. But everything points to them planning to implement only the things they need for the icons they draw themselves and the official Adwaita theme, and nothing more.
And that's not even considering the madness of the full XML specification/feature set itself. Certain parts of it are just asking for security problems. At least in recent years some XML parsers have started to have safer defaults, disabling or not supporting that nonsense. But when you encounter an SVG using such XML, whose fault is it? The SVG renderer's, for intentionally not enabling insane XML features, or the person's who hand-crafted the SVG using them?
Yeah, I spent a bit of time trying to figure out some masking issues with a file I created in Inkscape but which Chrome would butcher. Turned out to be opacity on a mask layer or something.
It's wild how often we rediscover that executing untrusted code leads to decades of whack-a-mole security. Excel/Word plus macros, HTML plus JavaScript, SVG plus JavaScript, ...
Does it need to be as complicated as a new format? Or would it be enough to not allow any scripting in the provided SVGs (or stripping it out). I can't imagine there are that many SVGs out there which take advantage of the feature.
If only there was a widely used vector format that had script support and also decades of work on maintaining a battle-tested security layer around it with regular updates on a faster release cycle than your browser. That'd be crazy. Sure would suck if we killed it because we didn't want to bother maintaining it anymore.
Uh... Flash was a genuine firehose of security flaws. I mean, yeah, they patched them. So "battle tested security layer" isn't wrong in a technical sense. But, yikes, no.
Artistically, there has been no equivalent to Flash since it died. Nothing else has allowed someone with artistic skills but no programming skills to create animations and games to the same degree and with the same ease.
I'd say Roblox is absolutely filling that market need. And as mentioned elsewhere, the "animations and games" demographic has moved on in the intervening decades to social media, and tools like CapCut make creating online content easier than it ever has been.
Honestly I think a lot of the Flash mania is just middle aged nerds fondly remembering their youth. The actual tool was a flash in the pan, and part of a much more complicated history of online content production. And the world is doing just fine without it.
Sure, but that's because the media and forums change, not so much a point about tool capability. The equivalent of teenaged geeks hacking on flash games today is influencer wannabes editting trends in CapCut. If anything content production is far more accessible now than in the 90's.
Yeah, it's still insane to me that the SVG can contain scripts. Wholly unnecessary; the DOM subtree it defines could be manipulated by external scripts just fine.
Anyway, I just set `svg.disabled` in Firefox. Scary world out there.
Update: this breaks quite a few things. It seems legitimate SVGs are used more often for UI icons than random diagrams and such. I suppose I shouldn't be surprised. I'll have to rethink this.
I gather from the HN discussion that it's not simple to disable scripting in an SVG, in retrospect a tragically missing feature.
I guess the next step is to propose a simple "noscripting" attribute, which if present in the root of the SVG doc inhibits all scripting by conforming renderers. Then the renderer layer at runtime could also take a noscripting option, so the rendering context could force it if appropriate. Surely someone at HN is on this committee, so see what you can do!
Edit: thinking about it a little more - maybe it's best to just require noscripting as a parameter to the rendering function. Then the browsers can have a corresponding checkbox to control SVG scripting and that's it.
Disabling script execution in svgs is very easy, it's just also easy to not realize you're about to embed an svg. `<img src="evil.svg">` will not execute scripts, a bit like your "noscripting" attribute except it's already around and works. Content Security Policy will prevent execution as well, you should be setting one for image endpoints that blocks scripts.
Sanitizing is hard to get right by comparison (svgs can reference other svgs) but it's still a good idea.
I had the impression from elsewhere in this thread that if you load the SVG in some other way, then you are not protected. This makes a no-brainer "don't run these ever" option in the browser seem appealing.
> This makes a no-brainer "don't run these ever" option in the browser seem appealing.
Firefox has this: svg.disabled in about:config. It doesn't seem to be properly documented, and might cause other problems for the developer (I found it accidentally, and a more deliberate search turns up mainly bug tracker entries.)
That's apparently how 4chan got hacked a while back. They were letting users upload PDFs and were using ghostscript to generate thumbnails. From what I understand, the hackers uploaded a PDF which contained PostScript which exploited a ghostscript bug.
Yes but the primary issue was that 4chan was using over a decade old version of the library that contained a vulnerability first disclosed in 2012: https://nvd.nist.gov/vuln/detail/CVE-2012-4405
In one of my penetration testing training classes, in one of the lessons, we generated a malicious PDF file that would give us a shell when the victim opened it in Adobe.
Granted, it relied on a specific bug in the JavaScript engine of Adobe Reader, so unless they're using a version that's 15 years old, it wouldn't work today, but you can't be too cautious. 0-days can always exist.
True, I just figured that once you handle a PDF with so much care, as if it were poisoned, it's perhaps better to send this poison to someone else to handle.
Really awesome design! It would be wise to replace some of those 3D printed parts with CNC parts, especially for places where a lot of strength is required (eyelets for those Peak Design anchors) or precision is required (lens mount). I myself have 3D printed some parts for my line scan camera too, so I can totally understand.