I was waiting for someone to point out that what you get when you visit a URL is not guaranteed to be the same on a subsequent visit.
I don't see how URL-based package management is safer when a package host can run a server that sends a special payload to requests matching certain IPs, headers, cookies, or referrers.
Until there are firm guarantees about what you get from a URL, a trusted third party is needed, even if only as an option.
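For what it's worth, the kind of firm guarantee I have in mind is pinning a hash of the content: fetch the module, compute its SHA-256, and refuse to use it if it doesn't match what you recorded the first time. A rough sketch in TypeScript, where both the URL and the expected digest are made-up placeholders:

```ts
// Sketch: pin a hash of the module you expect and refuse anything else.
// Both the URL and the expected digest below are made-up placeholders.
const url = "https://example.com/mod.ts";
const expected =
  "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08";

const res = await fetch(url);
const body = new Uint8Array(await res.arrayBuffer());

// SHA-256 the exact bytes we received and hex-encode the digest.
const digest = await crypto.subtle.digest("SHA-256", body);
const actual = Array.from(new Uint8Array(digest))
  .map((b) => b.toString(16).padStart(2, "0"))
  .join("");

if (actual !== expected) {
  throw new Error(`integrity check failed for ${url}: got ${actual}`);
}

// Only now treat the bytes as trusted source text.
console.log(new TextDecoder().decode(body));
```

As I understand it, Deno's lock files record hashes along these lines, but you have to opt into using them.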
But if you visit https://deno.land/std/examples/welcome.ts in your browser you get back HTML, not raw code.
Anyone know how this works? Is deno.land a special case, or is there some Accept-header cleverness going on?
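Guessing here, but the Accept-header approach would look something like this: browsers send `Accept: text/html`, while a module fetcher typically doesn't, so the server can branch on that. A minimal sketch of the idea, not deno.land's actual implementation:

```ts
// Sketch of the "Accept header cleverness" idea: serve an HTML page to
// browsers and raw source to everything else. Not deno.land's real code.
const source = 'console.log("Welcome to Deno!");\n'; // stand-in module body

Deno.serve((req: Request): Response => {
  const accept = req.headers.get("accept") ?? "";

  // Browsers advertise text/html; a module fetcher typically doesn't.
  if (accept.includes("text/html")) {
    const page = `<!doctype html><html><body><pre>${source}</pre></body></html>`;
    return new Response(page, {
      headers: { "content-type": "text/html; charset=utf-8" },
    });
  }

  // Anything else (e.g. `deno run https://...`) gets the raw TypeScript.
  return new Response(source, {
    headers: { "content-type": "application/typescript" },
  });
});
```

You could test the theory with curl by requesting the same URL with and without `-H "Accept: text/html"` and comparing what comes back.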