Sadly, I'm unable to find specific documents confirming it. AFAIK, Poland agreed to biometric data sharing with the US Office of Biometric Identity Management in exchange for loosening travel requirements. That said, the US seems to be pushing[0] for more such agreements.
> The archive is for everyone, and we welcome all inquiries. However, we prioritize requests that support gaming culture, gaming history, and the games industry. /../ While the archive is not open to the public, we hope /../
The archive is for everyone, but it's only for these groups of people, and it's also not open to the public... Yikes.
I'd much rather support initiatives that actually make the games and software required to run them open to the public, like GOG.com and Internet Archive. This feels like a one-way transaction - society puts games in, society gets nothing back.
This is how most archives work. You can't just have a stroll around for the craic. And there's no point really, because it's not a museum – most people would be bored quite fast, unless you have a specific reason.
Exactly, and you shouldn't have to visit the archive to play its games in the first place. That's why I mentioned IA and GOG.com in particular - both let you download games remotely.
An archive of physical media serves a very different purpose from a bunch of computers loaded with the games from those media that are available to be played. It's kind of like a film vault that stores original movie film, vs. a place like YouTube that lets you play copies of those movies. And playing the game is not the same as examining and handling the original media (CD/tape/cartridge/manual/inserts/box).
Sure, archives often permit you to actually view their original media in person, but that's not always part of their mission. Sometimes the best they'll do is give you copies for a fee. Other times they may lend their original media (or sometimes copies) to qualified entities (spoiler alert: not everybody qualifies). There really is no single "right" way for this to work.
It's a video game archive. Due to the young age of the medium, virtually no video game is in the public domain, so creating digital backups, let alone making them available, is a pretty good way to find yourself in court with the rightsholders.
I don't think anyone would ever get in trouble for creating digital backups, but yeah: making video games available to the public without permission from the copyright owner is a good way to get, at the very least, cease-and-desisted.
It is. I have very little respect for artists with sentimentality over such trivial bullshit. Speaking as someone who makes games. The jewel case doesn't matter.
It detracts from the thing-itself, like a showroom car that travels everywhere in a hermetically sealed container. That's not a car anymore, it's waste. Just because it gets driven 5 miles a year doesn't change shit. If someone's spending money to preserve my games, I'd rather it just be a tarball in a well-maintained magnetic tape vault, available on-line, than some aristocratic funko pop collection for a tiny number of people to pog at in person.
You may not have experienced gaming in the 1980s and 1990s. Games back then were more than just a stream of bits. They had physical maps (sometimes on cloth, not paper). They had actual manuals (and many of them were art in themselves). Disks came in strange, unusual colors; later, CDs had shapes, or hidden audio tracks. I vividly remember a copy-protection scheme of black, slightly shiny ink on black paper to prevent photocopying (to read one of the 400 random codes, you had to tilt the page just right).
That needs to be preserved. Just making a digital copy isn't enough.
You correctly intuit that I am young, but I do have a lot of experience with that era of games. They could be reliably played on any shitbox I could commandeer in high school (early 2010s.)
Thus I've cut through a laundry list of silver and golden age cRPGs (and other genres, of course) with cute copy protection and complex manuals. Oftentimes the relevant information was included as a text file (maybe not in the same format it came in); sometimes the copy protection was patched right out of the game (the Scene in the 20th century was just that crazy). I never particularly cared. I was there for the game, not the auxiliary accoutrements. To some degree I understand the sentimentality and attachment if you grew up with the tactile materials accompanying the game. That being said, I don't feel like my experience of Ultima 3 was lessened because I lacked the cloth map any more than I feel like my experience of Wasteland was ruined because I could ctrl+f the copy protection challenges.
The issue here is that a picture of a book is not a book, a copy of a game is the same game. Barring people with excellent and well-adjusted monitors looking at uncompressed images, the pics we see are (potentially excellent, but still) approximations of the original.
With software the notion of an original is meaningless though.
> The issue here is that a picture of a book is not a book, a copy of a game is the same game.
That... depends. A lot of older games shipped with physical artifacts which were an important part of the game: manuals, code wheels, custom controllers, "feelies" in Infocom games, etc. You can't easily make a copy of those. (And preserving them isn't just a matter of throwing a copy on a hard disk.)
A picture of every page of a book combined in a collection is a copy of that book, and for all intents and purposes is the same book. This is because for the overwhelming majority of books, the text is what matters. Gravity's Rainbow is still Gravity's Rainbow whether it's in a PDF or the very first printed copy. To extend that to games, it doesn't matter if you're playing the MS-DOS version of X-COM or OpenXcom.
And for that matter, if you're concerned about minute granular details of visual media being an integral component of the meaning and essence of that media, you should be far more concerned with whether or not the cultural context is one that's even accessible to you in a meaningful capacity, because most are not. The whimsy of the Mona Lisa in person isn't actually all that deep.
> To extend that to games, it doesn't matter if you're playing the MS-DOS version of X-COM or OpenXcom.
Oh boy, are you mistaken. The 1990s version of X-COM was a mess. If it wasn't enough that it was hard as nails, it was also buggy beyond belief. To many of us, X-COM (and to a lesser extent the reskin 'Terror from the Deep') was what started the 'compulsively save the game after every move' trend. OpenXcom in comparison is a lot more forgiving.
> The whimsy of the Mona Lisa in person isn't actually all that deep.
That is in large part because you never get to experience the original the way you experience the copies. You always have a thousand tourists around you, all loud, all pushing to get a glimpse of the real painting once in their lives. They do not understand that art needs time to work; impressions do not come between a hotdog and a trip to a café - that is why in most galleries you see little benches that invite you to sit down and immerse yourself in a picture.
Similarly ... just giving the games to anyone without context will lead to people pushing and prodding, getting bored or frustrated easily, and eventually losing interest. For those people, it doesn't matter if you give them a masterpiece like Ultima III or junk. They consume, they move on. Abandonware sites exist for them. A scholarly archive of gaming history, not generally available to the public, still is useful.
For many books that's true, for the book in this example (The Book of Kells) it isn't. Having seen it in pictures and then in person, the difference is notable, partly because of the difference between a camera and a human eye, and partly because what you get in person isn't a 2D map of a 3D object.
GOG makes games available for purchase, but on multiple occasions they've sold games where functionality has been stripped out, or they sell something that straight up doesn't work.
This isn’t how most serious archives work. Archiving media is sensitive, careful work which takes time and space. Providing tours is at the very least expensive, and at worst a serious risk to the collection.
So what is the point of spending money and effort to preserve video games if nobody can play them? Seems like a waste of time and resources, especially for something so trivial as games.
See, here's the thing: Archives do not have pedestals - they use archival storage. This may come in different forms, but to the uninitiated eye it often looks like narrow pathways between storage shelves lined with labeled cardboard boxes, which may contain objects wrapped in acid-free paper, sometimes with foam inlays.
Archives exist to preserve what is there, not to show it off. Sometimes that's for future scientific research. And sometimes they may participate in museum work as well, lending out objects.
My misunderstanding, sorry - I was imagining this as an online archive that was closed to the public, hence the apparent absurdity of it. The only thing wrong that day was my brain.
By this logic one could simply download Protonmail's TLS certificate instead of trusting a CA and access the service via clearnet. Fully decentralized. Discovery, once again, left as an exercise for the reader.
Fun fact: if you've ever had bash (or another shell) complain that a file doesn't exist, even though it's on $PATH, check if it's been cached by `hash`. If the file is moved elsewhere on $PATH and bash has the old path cached, you will get an ENOENT. The entire cache can be invalidated with `hash -r`.
It's just how bash works. If there's an entry in the session cache, it uses it. Since executable paths only get cached when you run a command successfully, this only happens when it gets moved from one directory in your PATH to another after you run it once, which isn't that common.
Setting `PATH` or calling `hash -r` will clear the session cache; alternatively, `set +h` disables command hashing altogether.
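For the curious, the behaviour is easy to reproduce in a throwaway bash session (the paths and the `mytool` name here are made up for the demo):

```shell
# Create a dummy executable in the first of two PATH directories.
mkdir -p /tmp/bin1 /tmp/bin2
printf '#!/bin/sh\necho hello\n' > /tmp/bin1/mytool
chmod +x /tmp/bin1/mytool
PATH=/tmp/bin1:/tmp/bin2:$PATH

mytool               # runs fine; bash caches /tmp/bin1/mytool
mv /tmp/bin1/mytool /tmp/bin2/mytool
mytool || true       # typically fails: bash tries the stale cached path (ENOENT)
hash -r              # invalidate the session cache
mytool               # works again, resolved fresh from /tmp/bin2
```

The middle invocation is the failure mode described above; after `hash -r`, `command -v mytool` points at the new location.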
> this only happens when it gets moved from one directory in your PATH to another after you run it once
It also happens when you have two executables in different directories and then you delete the one with the higher priority. Happens regularly for me after I uninstall a Linux Homebrew package.
Sure, but not invalidating on ENOENT suggests they're just being lazy. Not to mention that they do have the tools (e.g. inotify watches) to proactively remove stale entries based on filesystem changes. Of course, I'd be careful with the proactive approach, as it's really easy to make things worse (e.g. 100 bash instances all watching the same PATH directories might get expensive, or forgetting to only do this in interactive mode connected to a TTY).
I think bash has an alias “rehash” that does the same as hash -r too. But zsh doesn’t have it, so “hash -r” has entered my muscle memory, as it works in both shells.
Bah, you’re right! I got it backwards, it’s zsh that has rehash, bash does not. And hash -r works in both.
I guess I’ve been using zsh longer than I thought, because I learned about rehash first, then made the switch to hash -r later. I started using zsh 14 years ago, and bash 20+ years ago, so my brain assumed “I learned about rehash first” must have been back when I was using bash. zsh is still “that new thing” in my head.
the odd thing is, at some point I ended up with `hash -R` as muscle memory that I always type before I correct it to a lower case r, and I'm not sure why, I can't remember any shell that uses `-R`.
Unsure in which situation, but I've had situations where a script didn't have the right shebang, and as such I had to resort to `alias --save hash='#'` to make sure the script worked.
If you want to be compatible across all shells, use `command -v`. POSIX mandates that it exists and has that return-code behaviour, whereas it doesn't mandate the `hash`, `which`, or `where` commands.
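A minimal sketch of that as a POSIX-sh dependency check (`require` is just a name I picked, not a standard helper):

```shell
# Fail early if a needed command isn't in PATH (POSIX sh, no bashisms).
require() {
    command -v "$1" >/dev/null 2>&1 || {
        echo "error: $1 not found in PATH" >&2
        return 1
    }
}

require sh && echo "sh is available"
```

A script would typically call `require java || exit 1` (or whichever tool it needs) near the top, before doing any real work.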
...and of course, if you're going to run the command anyway, and you know an invocation that does nothing and always exits with success, you can do that too. I like running "--version" or equivalent in CI systems, because it has the side effect of printing what actual versions were in use during the run.
Yeah, if you're targeting POSIX shells, then `command -v` may be more reliable.
If you're targeting bash, then `hash` is a builtin, so maybe slightly quicker (not that it's likely to be an issue), and it caches the location of `java` or whatever you're looking for, so possibly marginally quicker when you do want to run the command.
Whilst running "java -version" may be useful in some scripts (mine often route the output through a debug function, so it only runs when I set LOG_LEVEL to a suitable value, and it writes output to a file and STDERR), you run into the issue of "polluting" STDOUT, which means you're not going to be using your script in a pipeline without some tinkering (ironically, you're putting the failure message on STDERR when you probably don't care, as the script is exiting and hopefully breaking the pipeline). Also, it can take some research to figure out which invocation to use for a specific command, whereas the `hash` version can be used with little thought.
By the way, I don't believe that ">&2" is POSIX compliant, but that's trivial to fix.
The redirection operator:
[n]>&word
shall duplicate one output file descriptor from another, or shall close one. If word evaluates to one or more digits, the file descriptor denoted by n, or standard output if n is not specified, shall be made to be a copy of the file descriptor denoted by word
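A quick sh illustration of the operator the spec describes (the `msg` function is just a stand-in):

```shell
# A function that writes to both streams: fd 1 by default, fd 2 via >&2.
msg() { echo "out"; echo "err" >&2; }

# Command substitution captures only stdout; stderr passes through (here discarded).
only_stdout=$(msg 2>/dev/null)

# 2>&1 makes fd 2 a copy of fd 1, so both streams are captured together.
both=$(msg 2>&1)
```

Both forms are plain POSIX redirection, so `>&2` itself is fine in any compliant shell.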
It's also inferior because the filter lists for requests must be hardcoded and can only be changed through extension updates, which Google (or whoever owns the browser's extension store) can delay or block at their discretion.
This also means users can't install their own filters - a capability that was widely used when YouTube began aggressively bypassing adblockers.
>It's also inferior because the filter lists for requests must be hardcoded and can only be changed through extension updates, which Google (or whoever owns the browser's extension store) can delay or block at their discretion.
This thread is about safari, and its declarative ad blocking API doesn't have this issue.
In the case of Android, εxodus has one[1], though I couldn't find the malware library listed in TFA. Aurora Store[2], a FOSS Google Play Store client, also integrates it.
That seems to be looking at tracking and data collection libraries, though, for things like advertising and crash reporting. I don't see any mention of the kind of 'network sharing' libraries that this article is about. Have I missed it?
> If a wifi password is required to make full use of the device, I will return it.
This is one of my favourite uses of OpenWRT, or any other firmware that gives you proper control over the router - for WiFi-networked IoT devices, I set up a separate wireless network with no WAN/LAN access and client isolation. I can connect to the device, but it can't connect to WAN, any other devices on the IoT network, or my LAN.
Of course this won't work for cloud-tethered devices, but many will expose their functionality directly over network.
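For the curious, a sketch of what that looks like with OpenWrt's UCI (the SSID, interface name, key, and addressing here are assumptions; adapt them to your own setup):

```shell
# Isolated IoT SSID: clients can reach the router (DHCP/DNS) and nothing else.
uci set wireless.iot=wifi-iface
uci set wireless.iot.device='radio0'
uci set wireless.iot.mode='ap'
uci set wireless.iot.ssid='iot'
uci set wireless.iot.encryption='psk2'
uci set wireless.iot.key='change-me'
uci set wireless.iot.network='iot'
uci set wireless.iot.isolate='1'            # clients can't talk to each other

uci set network.iot=interface
uci set network.iot.proto='static'
uci set network.iot.ipaddr='192.168.50.1'
uci set network.iot.netmask='255.255.255.0'

uci add firewall zone
uci set firewall.@zone[-1].name='iot'
uci set firewall.@zone[-1].network='iot'
uci set firewall.@zone[-1].input='ACCEPT'   # router itself stays reachable
uci set firewall.@zone[-1].output='ACCEPT'
uci set firewall.@zone[-1].forward='REJECT' # no forwarding to wan or lan

uci commit && reload_config
```

The key part is the firewall zone with no forwarding rules to `wan` or `lan`: the devices get an IP and can be reached from your side, but can't initiate anything outward.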
It gets even worse: looks like their driver makes calls to Google Analytics.[1] I'd stay away. The README doesn't even mention it, and the promise that "your personal data will never be shared, sold, or distributed in any form" certainly sounds misleading when you consider this.
This piqued my interest, but I couldn't find anything. Do you know where I could find more information about it?