There's actually a library called ast-grep which does something very similar to what you're describing. They have an example in their introduction which performs a find and replace operation on a JS AST using a pattern:
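From memory (so treat the exact flags and pattern as an approximation and check their docs), the command-line form of it looks something like this, rewriting `$A && $A()` matches into optional chaining:

ast-grep --pattern '$A && $A()' --rewrite '$A?.()' --lang ts src/

`$A` is a metavariable, so the same rewrite applies whatever expression it matched.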
Yes, as you pointed out, not only can bundlers tree shake namespace imports, but they're literally used in the esbuild documentation to demonstrate the concept of tree shaking.
The issue you linked to is referring to the case in which you import a namespace object and then re-export it. Bundlers like webpack and rollup (which vite uses in production) can tree shake this pattern as well, but esbuild struggles with it.
If you're using esbuild, then instead of this:
import * as someLibrary from "some-library"
someLibrary.someFunction()
// re-exporting the whole namespace object is the pattern esbuild can't tree shake
export { someLibrary }
You can still do this:
import * as someLibrary from "some-library"
someLibrary.someFunction()
// re-export the named exports individually instead of the namespace object
export * from "some-library"
// and re-export the default export under an explicit name
export { default as someLibraryDefault } from "some-library"
Somewhat related: you technically can access some type metadata in TypeScript at runtime using the `emitDecoratorMetadata` and `experimentalDecorators` tsconfig options, along with Microsoft's `reflect-metadata` polyfill package. There was a trend at one point where people were writing database ORMs using decorator metadata (e.g. Typegoose, TypeORM, and MikroORM).
This of course requires your build tool to actually understand the TS type system, which is why it's not supported in tools like esbuild and tsx (which uses esbuild under the hood).
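A minimal sketch of what that looks like (the decorator and class are made up, and it assumes `experimentalDecorators` and `emitDecoratorMetadata` are enabled and `reflect-metadata` is installed):

import "reflect-metadata";

// A legacy (experimental) property decorator. With emitDecoratorMetadata enabled,
// the compiler emits the declared type of each decorated member as "design:type" metadata.
function LogType(target: any, propertyKey: string) {
  const type = Reflect.getMetadata("design:type", target, propertyKey);
  console.log(`${propertyKey} was declared as ${type.name}`);
}

class User {
  @LogType
  name!: string; // logs "name was declared as String" when the class is evaluated

  @LogType
  createdAt!: Date; // logs "createdAt was declared as Date"
}

The decorator-based ORMs lean on essentially this kind of `design:type` lookup to infer column types from class declarations.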
I'm talking about, in place of a fetch call, simply importing a JSON response from an endpoint, thereby bypassing the need to call fetch; you'd get the response as if it were any other import.
It certainly won't replace all GET calls, but I can think of quite a few first-load requests that could simply become import statements once this happens.
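As a sketch of the idea (the URL is made up, and it assumes a runtime that can import JSON over HTTP with import attributes):

// Instead of:
//   const config = await fetch("https://api.example.com/config.json").then(r => r.json());
// the first-load request becomes a static import:
import config from "https://api.example.com/config.json" with { type: "json" };

console.log(config);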
const supportsRequestStreams = (() => {
  let duplexAccessed = false;

  // Browsers that support request streams read the duplex option (so the getter
  // fires) and leave Content-Type unset; browsers that don't will stringify the
  // ReadableStream body and set Content-Type to 'text/plain'.
  const hasContentType = new Request('http://localhost', {
    body: new ReadableStream(),
    method: 'POST',
    get duplex() {
      duplexAccessed = true;
      return 'half';
    },
  }).headers.has('Content-Type');

  return duplexAccessed && !hasContentType;
})();
Safari doesn't appear to support the duplex option (the duplex getter is never triggered), and Firefox can't handle a stream being used as the body of a Request object at all: it converts the body to a string and sets the Content-Type header to 'text/plain'.
Oops. Chrome only! I stand very much corrected. Perhaps I should do less late night development.
It seems my original statement, that download but not upload is well supported, was unfortunately correct after all. I had thought readable/transform streams were all that was needed, but as you noted I'd overlooked the lack of duplex option support in Safari/Firefox[0][1]. This is definitely not wide support! I had way too much coffee.
Thank you for bringing this to my attention! After further investigation, I ran into the same problem you did: Firefox failed for me exactly as you noted. Interestingly, Safari fails silently if you use a TransformStream with file.stream().pipeThrough([your transform stream here]), but it fails with a message noting the lack of support if you instead pipe into the writable side of a transform stream with file.stream().pipeTo([your transform stream's writable side here]).
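For reference, a minimal sketch of the two patterns (the file input and CompressionStream are just stand-ins for whatever transform is applied before upload):

// `file` comes from e.g. an <input type="file">
const file = (document.querySelector("input[type=file]") as HTMLInputElement).files![0];

// pipeThrough form: the one that reportedly fails silently in Safari
const bodyA = file.stream().pipeThrough(new CompressionStream("gzip"));

// pipeTo form: the one that reportedly fails with an explicit "not supported" error
const { readable: bodyB, writable } = new CompressionStream("gzip");
file.stream().pipeTo(writable);

// either way, bodyA / bodyB is what would end up as the fetch() body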
I came across the article you referenced but of course didn't read it all the way through. It's disappointing that it's from 2020 and no progress has been made since. Poking around caniuse, it looks like Safari and Firefox have patchy support for similar behavior in web workers, either partial or behind flags. So I suppose there's hope, but I'm sorry if I got anyone's hopes up too far :(
Enums and parameter properties can be enabled with the --experimental-transform-types CLI option.
Not being able to import TypeScript files without including the .ts extension is definitely annoying. The rewriteRelativeImportExtensions tsconfig option added in TS 5.7 made it much more bearable, though. When you enable that option, not only does the TS compiler stop complaining when you specify the '.ts' extension in import statements (just as the allowImportingTsExtensions option has always allowed), it also rewrites the paths when you compile the files, so the build artifacts end up with the correct .js extension: https://www.typescriptlang.org/docs/handbook/release-notes/t...
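To sketch it (greet.ts is a made-up module): with that option enabled, a relative import written with the real extension,

import { greet } from "./greet.ts";

is emitted by tsc with the extension the compiled file will actually have:

import { greet } from "./greet.js";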
In Node 22.7 and above you can enable features like enums and parameter properties with the --experimental-transform-types CLI option (not to be confused with the old --experimental-strip-types option).
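For example, a file like this (names are made up) needs real transformation rather than plain type stripping, because enums and parameter properties both generate runtime code:

// point.ts
enum Color {
  Red,
  Green,
}

class Point {
  // parameter properties both declare and assign the fields
  constructor(public x: number, public y: number) {}
}

console.log(Color.Green, new Point(1, 2).x);

It then runs with `node --experimental-transform-types point.ts`.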
Node does have a permission system, but it's opt-in. Many runtimes/interpreters either have no sandbox at all or make theirs opt-in, which is why Deno's sandbox is an upgrade, even if it's not as hardened as iptables or Linux namespaces.
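For comparison, Node's opt-in model looks roughly like this (the path is made up, and the flag names have been shifting as the feature stabilizes, so check your Node version's docs):

node --experimental-permission --allow-fs-read=/app/data server.js

Reads outside /app/data are then rejected at runtime instead of being silently allowed.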
> you can't just allow exact what you need in that category? You have to allow the entire category and then deny everything you don't want/need?
No, you can allow access to specific domains, IP addresses, filesystem paths, environment variables, etc., while denying everything else by default. You can, for instance, allow access to only a specific IP (e.g. `deno run --allow-net='127.0.0.1' main.ts`) while implicitly blocking every other IP.
What the commenter is complaining about is the fact that Deno doesn't check which IP address a domain name actually resolves to using DNS resolution. So if you explicitly deny '1.1.1.1', and the script you're running fetches from a domain with an A record pointing to '1.1.1.1', Deno will allow it.
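To make that concrete (the domain is made up, and this assumes the --deny-net flag available in newer Deno releases):

deno run --allow-net --deny-net=1.1.1.1 main.ts

A fetch to "https://some-domain.example/" inside main.ts still goes through even if that domain's A record points at 1.1.1.1, because the deny rule is matched against the hostname as written rather than the resolved address.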
In practice, I usually use allow lists rather than deny lists, because I very rarely have an exhaustive list on hand of every IP address or domain I'm expecting a rogue script to attempt to access.