
There's actually a library called ast-grep which does something very similar to what you're describing. They have an example in their introduction which performs a find and replace operation on a JS AST using a pattern:

  ast-grep --pattern 'var code = $PAT' --rewrite 'let code = $PAT' --lang js

https://ast-grep.github.io/guide/introduction.html


For anyone looking for context on smooshgate: https://developer.chrome.com/blog/smooshgate


Alternatively, with namespace imports in JS you can write [1]:

  import * as someLibrary from "some-library"
  someLibrary.someFunction()
This works pretty well with IDE autocomplete, in my experience.

[1] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...


This is a non-starter for anything you want to publish online, as it breaks tree shaking, which causes size bloat and therefore slow loading.


I don't think this is true. The esbuild docs use `import * as lib from './lib.js'` in their example of tree shaking.

https://esbuild.github.io/api/#tree-shaking

There are some associated issues, but they may be specific to esbuild.

https://github.com/evanw/esbuild/issues/1420


Yes, as you pointed out, not only can bundlers tree shake namespace imports, but they're literally used in the esbuild documentation to demonstrate the concept of tree shaking.

The issue you linked to is referring to the case in which you import a namespace object and then re-export it. Bundlers like webpack and rollup (which vite uses in production) can tree shake this pattern as well, but esbuild struggles with it.

If you're using esbuild, then instead of this:

  import * as someLibrary from "some-library"
  someLibrary.someFunction()
  export { someLibrary }
you can do this:

  import * as someLibrary from "some-library"
  someLibrary.someFunction()
  export * from "some-library"
  export { default as someLibraryDefault } from "some-library"
  
Tree shaking works as expected for downstream packages using esbuild in the second case, which someone else in the linked issue pointed out: https://github.com/evanw/esbuild/issues/1420#issuecomment-96...


Somewhat related: you technically can access some type metadata in TypeScript at runtime using the `emitDecoratorMetadata` and `experimentalDecorators` tsconfig options, along with Microsoft's `reflect-metadata` polyfill package. There was a trend at one point where people were writing database ORMs using decorator metadata (e.g. Typegoose, TypeORM, and MikroORM).

This of course requires your build tool to actually understand the TS type system, which is why it's not supported in tools like esbuild and tsx (which uses esbuild under the hood).
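
For anyone curious, here's a minimal sketch of what reading that metadata looks like. It assumes experimentalDecorators and emitDecoratorMetadata are enabled in tsconfig and reflect-metadata is installed; the Column decorator and User class below are made up for illustration:

  import 'reflect-metadata'

  // A property decorator that reads the compiler-emitted design:type metadata
  function Column(target: object, propertyKey: string) {
    const type = Reflect.getMetadata('design:type', target, propertyKey)
    console.log(`${propertyKey} is a ${type.name}`) // "name is a String", "age is a Number"
  }

  class User {
    @Column
    name!: string

    @Column
    age!: number
  }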


> I do wonder if this makes the importable gets (via type: json) a reality like assert was going to.

Yes, the JSON modules proposal is finished.

https://github.com/tc39/proposal-json-modules

https://caniuse.com/mdn-javascript_statements_import_import_...
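
For reference, the final syntax uses an import attribute (the file name and field below are just illustrative):

  import config from './config.json' with { type: 'json' }
  console.log(config.apiUrl)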


An entire class of fetch requests will go away with importable GETs. I am excited for this.


In Node you could always require("food.json")


Not what I am talking about though.

I'm talking about replacing a fetch call: you could simply import a JSON response from an endpoint, thereby bypassing the need to call fetch, and you'd get the response as if it were imported.

It certainly won't replace all GET calls, but I can think of quite a few first-load ones that could simply become import statements once this happens.
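
As a rough sketch of what that could look like (the URL and render function here are made up, and the endpoint would need to serve CORS-friendly application/json):

  import initialData from 'https://api.example.com/bootstrap.json' with { type: 'json' }

  render(initialData)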


Ohh right. That makes sense.


Which browsers have you tested this in? I ran the feature detection script from the Chrome docs, and neither Safari nor Firefox seems to support fetch upload streaming: https://developer.chrome.com/docs/capabilities/web-apis/fetc...

  const supportsRequestStreams = (() => {
    let duplexAccessed = false;
  
    const hasContentType = new Request('http://localhost', {
      body: new ReadableStream(),
      method: 'POST',
      get duplex() {
        duplexAccessed = true;
        return 'half';
      },
    }).headers.has('Content-Type');
  
    return duplexAccessed && !hasContentType;
  })();
Safari doesn't appear to support the duplex option (the duplex getter is never triggered), and Firefox can't even handle a stream being used as the body of a Request object: it ends up converting the body to a string and then setting the Content-Type header to 'text/plain'.
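
For context, this is roughly what the feature being detected looks like when it is supported (Chrome only for now; the endpoint is a placeholder):

  // Stream the request body instead of buffering it up front
  const body = new ReadableStream({
    start(controller) {
      controller.enqueue(new TextEncoder().encode('chunk one'))
      controller.close()
    },
  })

  await fetch('https://example.com/upload', {
    method: 'POST',
    body,
    duplex: 'half', // required whenever the body is a stream
  })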


Oops. Chrome only! I stand very much corrected. Perhaps I should do less late night development.

It seems my original statement that download, but not upload, is well supported was unfortunately correct after all. I had thought that readable/transform streams were all that was needed, but as you noted it seems I've overlooked the important lack of duplex option support in Safari/Firefox[0][1]. This is definitely not wide support! I had way too much coffee.

Thank you for bringing this to my attention! After further investigation, I ran into the same problems you did: Firefox failed for me exactly as you noted. Interestingly, Safari fails silently if you use a TransformStream with file.stream().pipeThrough([your transform stream here]), but it fails with a message noting the lack of support if you instead pipe to a writable stream with file.stream().pipeTo([your writable stream here]).

I came across the article you referenced but of course didn't completely read it. It's disappointing that it's from 2020 and no progress has been made on this. Poking around caniuse, it looks like Safari and Firefox have patchy support for similar behavior in web workers, either via partial support or behind flags. So I suppose there's hope, but I'm sorry if I got anyone's hopes up too far :(

[0] https://caniuse.com/mdn-api_fetch_init_duplex_parameter [1] https://caniuse.com/mdn-api_request_duplex


Enums and parameter properties can be enabled with the --experimental-transform-types CLI option.

Not being able to import TypeScript files without including the ts extension is definitely annoying. The rewriteRelativeImportExtensions tsconfig option added in TS 5.7 made it much more bearable, though. When you enable that option, not only does the TS compiler stop complaining when you specify the '.ts' extension in import statements (just as the allowImportingTsExtensions option has always allowed), but it also rewrites the paths when you compile the files, so that the build artifacts have the correct '.js' extension: https://www.typescriptlang.org/docs/handbook/release-notes/t...
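
A minimal sketch of how the pieces fit together (the helper module is made up):

  // tsconfig.json
  {
    "compilerOptions": {
      "rewriteRelativeImportExtensions": true
    }
  }

  // src/main.ts
  import { helper } from './helper.ts' // emitted as './helper.js' in the compiled output
  helper()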


In Node 22.7 and above you can enable features like enums and parameter properties with the --experimental-transform-types CLI option (not to be confused with the old --experimental-strip-types option).
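
For example (the file is just for illustration), this only works with the transform flag, since enums require actual code generation rather than just type stripping:

  // main.ts
  enum Color {
    Red,
    Green,
  }
  console.log(Color.Green) // prints 1

  // run with: node --experimental-transform-types main.ts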


Excellent update! Thanks!


Node does have a permissions system, but it's opt-in. Many runtimes/interpreters either have no sandbox at all or make it opt-in, which is why Deno's sandbox is an upgrade, even if it's not as hardened as iptables or Linux namespaces.
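
As a rough example (the flag spelling has changed across versions — older releases use --experimental-permission — and the paths here are placeholders):

  # Deny everything by default, then grant read access to a single directory
  node --permission --allow-fs-read=/app/data app.js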


> you can't just allow exact what you need in that category? You have to allow the entire category and then deny everything you don't want/need?

No, you can allow access to specific domains, IP addresses, filesystem paths, environment variables, etc, while denying everything else by default. You can for instance allow access to only a specific IP (e.g. `deno run --allow-net='127.0.0.1' main.ts`), while implicitly blocking every other IP.

What the commenter is complaining about is the fact that Deno doesn't check which IP address a domain name actually resolves to using DNS resolution. So if you explicitly deny '1.1.1.1', and the script you're running fetches from a domain with an A record pointing to '1.1.1.1', Deno will allow it.
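
To make that concrete, a sketch of the deny-list gap (the script name is hypothetical):

  # 1.1.1.1 is denied as a literal IP, but a fetch to a domain whose
  # A record points at 1.1.1.1 still succeeds, because Deno doesn't
  # resolve the domain before checking the deny list
  deno run --allow-net --deny-net=1.1.1.1 main.ts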

In practice, I usually use allow lists rather than deny lists, because I very rarely have an exhaustive list on hand of every IP address or domain I'm expecting a rogue script to attempt to access.


Yeah, that was my point, default deny vs default allow.

If you can default deny, then you're good. Otherwise it's kind of a junior sysadmin mistake, I would say.


There are use cases like SSRF where I want to allow any IP except for my internal network. They promise they can do that, but they can't.

