I don’t disagree with what you’re saying about software having ongoing vulnerability issues. But that’s exactly the problem with communications-centric solutions that don’t offer strong data security protections such as end-to-end encryption: you’re always one CVE away from having your company’s data exposed. And in this specific case, there is a MAJOR difference between Chrome getting owned and a compromise of the program that hosts all of a company’s internal communications concerning, among other things, known vulnerabilities in their software.
Think about that for a second. If someone finds a vulnerability in JIRA, they don’t just find a vulnerability in that software: they’ve got access to support tickets, issue tracking, etc about lots of vulnerabilities in lots of software. That’s a big deal.
The fact that the US government had to step in and say PLEASE TAKE THIS SERIOUSLY, rather than Atlassian going into a Code Red situation, shows that they just don’t take the level of responsibility they’ve been given as seriously as is required for what they’re doing. This isn’t just some lousy app having a CVE. This is the keys to the kingdom for a lot of very critical software. This is systemic risk. The problem isn’t the code, it’s the culture.
If you “work in the world of business software” and you think that’s a “complete bullshit statement,” I really hope you don’t work on anything for which such systemic risk is possible. Because, to turn your statement back on you, that’s a complete bullshit way to treat the responsibility you have for the data with which you’ve been trusted. Go build a social media app or an online shopping site or something, and stay out of critical systems that can create cascading vulnerabilities.
For people on the Atlassian Cloud offering this is not a problem; it has already been fixed. The only people impacted are those who host Confluence themselves, and there isn't much Atlassian can do beyond telling them to update the software.
If you are running someone else's software and it is exposed to the internet, it's your responsibility to keep it updated. If you want your service to be available from the web with good security, use the cloud offering; that's what it's for.
And if you don't want to use external software to manage your internal knowledge, then create some shared Windows folders that nobody uses and that quickly become a mess. What alternative do you propose?
> and stay out of critical systems that can create cascading vulnerabilities
Oh, you mean like Microsoft and Google in the examples I gave?
Why are you moving the goalposts? Your original statement was that this issue is evidence that "everyone is now aware of the awful engineering practices that underpin their products." My clear argument is that this is not the case, as lots of companies with systemically critical software also have bugs of this magnitude, or worse.
Much as I hate everything Atlassian touches (although Portfolio was pretty useful as a stand-alone tool if you kept it far away from JIRA), it's not like MS Office or SharePoint have never had vulnerabilities.
> don’t offer strong data security protections such as end-to-end encryption
Imagine being so naive as to think that documentation would be improved by e2e encryption. As if it weren't hard enough convincing developers to write things down, now we'd also need to explicitly re-share those things with every new person who joins the team?
"Sorry, we can't fix your bug for 6 weeks, Bob's on paternity and the fix is documented on his page".
> MAJOR difference between Chrome getting owned, and the program that hosts all of a company’s internal communications
There's at best zero difference between these things. Pop Chrome, harvest tokens, access Jira. Think about that for a second. What critical company information is not accessible to someone who has arbitrary code execution in your browser?
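To make the "harvest tokens, access Jira" step concrete: a minimal sketch, assuming the attacker has lifted a session cookie from a compromised browser. The host and cookie value are made up for illustration, and nothing is actually sent over the network; the point is just that the cookie alone is the credential.

```python
# Hypothetical sketch: a session cookie harvested from a compromised browser
# is all that's needed to query Jira's REST API as the victim. The cookie
# value and hostname are invented; we only build the request and inspect it,
# we never send it.
import urllib.request

STOLEN_COOKIE = "JSESSIONID=deadbeef1234"  # hypothetical value lifted from the browser

req = urllib.request.Request(
    "https://jira.example.com/rest/api/2/search?jql=project=SEC",
    headers={"Cookie": STOLEN_COOKIE},
)

# The prepared request already carries the victim's session; actually sending
# it would return their issue data, no password or MFA prompt involved.
print(req.get_header("Cookie"))
```

The design point: server-side the request is indistinguishable from the legitimate user's browser traffic, which is why browser compromise and Jira compromise collapse into the same thing.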
Oh, serendipitously, the article currently directly below this one on HN is "Apple iMessage Zero-Click Hacks". So add Apple to the list of companies on your "awful engineering practices" list.
Eh? If there were a remote root exploit in Chrome, ALL your data would be similarly 0wned, exfiltrated, and for sale to your enemies. EVERY computer you have used Chrome on is now suspect, and every website each of those computers has logged into will have its data stolen and sold on hackers.ru and .ch. All my employer's business data being stolen is one thing, but my online banking credentials are of particular personal interest to me in staying confidential.