whiskeykilo's comments

Even more ominous that it's free


It's slow going but we'll get there eventually. NIST made a recent revision to SP 800-53 that includes responsible disclosure as a recommendation: https://csrc.nist.gov/publications/detail/sp/800-53b/final


Thanks very much, I hadn’t seen this and it’s good news!


Signal cares so much about UX that they're starting to agitate security/privacy diehards. They've implemented features like stickers and message reactions that the diehard community calls "useless". And they recently implemented PINs with the intention of storing your encrypted profile and settings for easy recovery on a new device.

I applaud the direction they've taken. These are the kinds of features that will acquire and retain a broader user base.


I recently stopped using Signal over the PIN thing. I do not want them to store my data on their servers, but they won't let me opt out.


Not sure what you’re looking to get into specifically, but here are some resources you can look at, in no particular order:

Hack the Box
Try Hack Me
Pentester Lab
Hacker101
Portswigger Web Security Academy
Linux Academy
Bugcrowd University
Hacksplaining
Cybrary
Malware Unicorn
bugbountyguide.com
CTFtime
root-me.org
MalwareTech
Nahamsec
The Cyber Mentor


Exactly. Most people didn't even get picked for payment. I don't want a shitty monitoring service; I want to see Equifax bleed.


> I want to see Equifax bleed

This seems vindictive. Their behaviour should be corrected and other people deterred. We shouldn't be calling for blood!


What’s wrong with a corporate death sentence in the event of such egregious mistakes? There must be a severe enough punishment in order for a company’s risk department to take things seriously.


Exactly. There's no right and wrong for companies if there's never any _real_ risk.

If people only had to pay a small fine for their misdeeds and go on with their lives, we'd see people hijacking sports cars every day.


It is vindictive. When non-vindictive measures fail, the measure of last resort sadly tends to be the vindictive one. I don't know if this was part of your point (I imagine it was), but the vengeful carcerality of vindictive justice is independent of the meaningful structural change that would prevent bad actors from repeating similar actions in the future. Everyone performs their role, feigning outrage, repentance, regret and reconciliation, until the same thing happens all over again.

I would be surprised if that did not happen here. The collusion between credit bureaus, banks and lenders is well known for the problems it creates for consumers, but without consequences for the harm done to them, business will continue as usual. Will it be savvy fraudsters who put backpressure on the crumbling ability of credit bureaus to prescreen for credit-worthiness? Will upstarts see high-margin, low-fitness incumbents ripe for disruption? Will corporate raiders see ossifying remains just ripe enough to be scavenged? Even if all the answers to these questions were in the affirmative, I don't know if any of that would lead to immediate change. But I'd be surprised if business continued as usual forever. The bar for a step-function improvement is quite low, if you could somehow retrofit a better way to pull credit into legacy underwriting processes.

Source: Early stage engineer and prior executive at $PREVIOUS_FIRMS that included two growth stage consumer lending startups.


They pioneered the knowledge-based authentication approach. It was never a good one, and it's utterly broken now that they've leaked everyone's information.

At the very least, Equifax should be destroyed. And society would benefit more if it were destroyed in such a way as to serve as an example for future organizations working in the space.


Don't worry, it's only metaphorical blood. Equifax is just a concept, and can't be physically hurt.


Their business model shouldn’t exist. That is precisely why we should punish them.


Do you think that Equifax provides the general public with a useful service?


I do. They allow people who are lending money or extending credit to know what kind of risk they are likely taking on. They also let you know how risky you appear to be when you ask someone for a loan or credit. This allows people to reduce their risk, and it allows people to understand and manage the risk that they present.


Each party to a transaction has a different perspective. Speaking about the benefits to one party while ignoring the harm to others is nonsensical. Comprehensive surveillance databases obviously benefit lenders, or they wouldn't pay to create them. But general society is harmed by creating surveillance records that would make even the most staunch Stasi agent blush.


> Speaking about the benefits to one party while ignoring the harm to others is nonsensical.

But I did mention both parties. When I want credit, it lets me understand my own risk profile and manage it:

> They also let you know how risky you appear to be when you ask someone for a loan or credit.

> But general society is harmed by creating surveillance records that would make even the most staunch Stasi agent blush.

I think bad lending and borrowing is a major risk to society.


Describing how one party can better conform to the other party's requirements is not an honest account of both perspectives.

Bad lending is better attributed to a mistaken belief that borrowers can be perfectly modeled to reduce variance. The economic crunch we're currently facing has been directly caused by cheap credit based on such assumptions, whose risks, once again, turn out to be suddenly correlated.


It's a free market. Don't like the payout? Don't submit the bug. Someone else probably will anyway


As someone who grew up on Windows, this drove me absolutely nuts when I first started working on a Mac daily. It could be worded much better.

The way Finder sets the default view is equally unintuitive.


Being able to associate a particular file with an application, instead of all files of a given type, might be a handy feature.


However useful, I have a feeling that's more likely to be the exception than the rule, and it should be prioritized accordingly.


I don't see how Apple could've made it any easier.

1) Select any file.

2) Change the app in the "Open with" dropdown.

3) If you want it to apply to all files, click the "Change all" button.


The fact that the upstream HN comment has to explain two identical-sounding options that do different things should help you see what's wrong with the current impl. Your steps don't work from the "Open With -> Options" context menu, as they point out, but only from the "Get Info -> Open With" options. Not the best UX.

For example, I can imagine two radio buttons in "Open With" instead: "Always for this file" and "Always for this extension". All in one place.


Doesn't work when Xcode is the default app. I use macdown for mathjax files for math.stackexchange.com, and I have to use "open with" to choose macdown every time.


In the old days, Mac OS files had two properties: file type and creator type. File type could be something like "TEXT", and creator type was a unique code for the application that created it (they were registered with Apple, I believe). All this was purely internal; users never saw such things. Mac file names normally didn't have an "extension", it was just "1995 Report" or something.

As a result it behaved like that:

- When you opened a file, it opened in the program that created it.

- When there was no such program, you could still open it in another program that claimed to understand these files. I don't remember exactly how it was implemented; I think there was a dialog asking if one would like to use another program, with some choice offered (probably the first one that fit).

This was rather handy with formats like "Encapsulated Postscript" in desktop publishing: this format supported both vector and raster images, but one normally used different applications to manipulate them.

This stayed for some time after the move to Mac OS X, but now I believe these things are gone or not used at all.
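
If it helps, here's a toy Python model of that lookup order. The codes, app names and registries are made up, and none of this uses real Mac OS APIs; it just sketches the behaviour described above:

    from dataclasses import dataclass

    @dataclass
    class ClassicFile:
        name: str       # e.g. "1995 Report" -- no extension needed
        file_type: str  # four-character type code, e.g. "TEXT" or "EPSF"
        creator: str    # four-character code of the creating application

    # Hypothetical registries (in reality these codes were registered with Apple)
    APPS_BY_CREATOR = {"MSWD": "Word", "ART5": "Illustrator"}
    APPS_BY_TYPE = {"TEXT": ["SimpleText", "Word"], "EPSF": ["Illustrator", "Photoshop"]}

    def app_to_open(f: ClassicFile) -> str:
        # 1) Prefer the application that created the file.
        if f.creator in APPS_BY_CREATOR:
            return APPS_BY_CREATOR[f.creator]
        # 2) Otherwise fall back to an application that claims this file type
        #    (classic Mac OS offered a choice; here we just take the first match).
        candidates = APPS_BY_TYPE.get(f.file_type, [])
        return candidates[0] if candidates else "no application found"

    print(app_to_open(ClassicFile("1995 Report", "TEXT", "MSWD")))  # -> Word
    print(app_to_open(ClassicFile("Diagram", "EPSF", "????")))      # -> Illustrator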


The concept of file extensions on Windows is equally unintuitive (why is the type of a file dictated by what comes after a dot?). It's just a question of what you're used to.


File extensions are the de facto type indicator on all platforms, not just Windows. Even when the FS has some other way to specify the type (e.g. a separate MIME type metadata field), the extension still normally sets the default value for that.

This is orthogonal to all that, though. It's about the UX to associate an app with a file type, regardless of how that file type is determined.


doesn't Linux use libmagic?


What you're referring to as Linux is actually...

GNU/Linux doesn't automatically choose which program to use for anything (you can't just run ./file.png the way you can in Windows). In my experience, file managers tend to use file extensions to choose which application to use, and that application might then use libmagic and ignore the file extension (e.g. giving a PNG file a .JPG extension will make the file manager think it's a JPEG and therefore open it with an image viewer, but the image viewer program will use the headers to recognise it as a PNG).
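
Rough Python sketch of that difference (the file name is hypothetical, and the signature check is a tiny stand-in for what libmagic actually does):

    import mimetypes

    PNG_MAGIC = b"\x89PNG\r\n\x1a\n"   # first 8 bytes of every PNG file

    def guess_by_extension(path):
        # What a file manager typically does: look only at the name.
        guessed, _ = mimetypes.guess_type(path)
        return guessed or "unknown"

    def guess_by_header(path):
        # What the opening application may do: read the first bytes
        # and match them against known signatures.
        with open(path, "rb") as f:
            header = f.read(8)
        if header == PNG_MAGIC:
            return "image/png"
        if header[:3] == b"\xff\xd8\xff":
            return "image/jpeg"
        return "unknown"

    path = "mislabeled.jpg"  # hypothetical: PNG data saved with a .jpg name
    print("by extension:", guess_by_extension(path))  # image/jpeg
    print("by header:   ", guess_by_header(path))     # image/png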


xdg-open ./file.png


I've never heard of this. Is this exclusively a Digital Ocean issue?


No, I run into this with my Linode as well. Basically any of the large VPS providers, and some of the smallest, are well known to other services for being used to automate scraping and other things. LinkedIn is a great example of one that (used to, anyway; I haven't tried in a while) completely blocks any IP known to be from a VPS provider.


Nope, this is pretty common. I found out the hard way that Delta doesn’t allow access to their servers from my cloud hosted VPN, which is shitty considering airports are pretty VPN-heavy locations for me. They don’t seem interested in reconsidering this stance either.


I set one up using a free GCP instance and it's been working great so far. Would definitely handle your described usage and save you $5 a month


Aren’t static IPs now excluded from the GCP free tier?

https://github.com/rajannpatel/Pi-Hole-PiVPN-on-Google-Compu...

(I run OpenVPN and PiHole from a GCP micro instance)


Not quite yet. But also, like another user said, there's DuckDNS.

"Note: Starting January 1st, 2020, GCP will charge for VM instance external IP addresses. However, under the Free Tier, in-use external IP addresses will be free until you have used a number of hours equal to the total hours in the current month. Free Tier for in-use external IP addresses apply to all instance types (not just f1.micro instances)."


A VPN server doesn't strictly need a static IP; you can use dynamic DNS.


Can you explain, or provide a link?

I’m a networking novice, but in the .ovpn profiles I provide, the IP is hard-coded.


Instead of an IP, you can use a domain. Then you can use Dynamic DNS to keep that domain pointed at your current IP (essentially, you run a small program on the same computer as the VPN server, that updates the DNS provider every time the IP changes).
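
A minimal sketch of that small program, in Python. It assumes api.ipify.org for the public-IP lookup and a DuckDNS-style update URL (check your own provider's docs); the domain, token and state-file path are placeholders:

    import urllib.request

    DOMAIN = "my-vpn"            # hypothetical DuckDNS subdomain
    TOKEN = "your-token-here"    # hypothetical provider token
    STATE_FILE = "/var/tmp/last_ip"

    def current_public_ip():
        with urllib.request.urlopen("https://api.ipify.org") as resp:
            return resp.read().decode().strip()

    def update_dns(ip):
        url = "https://www.duckdns.org/update?domains=%s&token=%s&ip=%s" % (DOMAIN, TOKEN, ip)
        with urllib.request.urlopen(url) as resp:
            print("provider replied:", resp.read().decode().strip())

    def main():
        ip = current_public_ip()
        try:
            last = open(STATE_FILE).read().strip()
        except FileNotFoundError:
            last = ""
        if ip != last:               # only hit the provider when the IP changed
            update_dns(ip)
            with open(STATE_FILE, "w") as f:
                f.write(ip)

    if __name__ == "__main__":
        main()

Run it from cron (or a systemd timer) on the same machine as the VPN server, and point the "remote" line in your .ovpn profile at the hostname (e.g. remote my-vpn.duckdns.org 1194) instead of a hard-coded IP.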


Isn't there a 1 GB limit on network traffic though?


That's true; I forget about that because I use it so sparingly.

"1 GB network egress from North America to all region destinations per month (excluding China and Australia)"


I suspect they're just trying to expand US cyber capabilities and recruiting.

“If I go to the next capture-the-flag contest and I see some college students using Ghidra, I will be really excited” - Rob Joyce, senior cybersecurity adviser at NSA

Source: https://www.cyberscoop.com/ghidra-nsa-tool-public/

