Hacker News
Developer's Guide to SaaS Compliance (courier.com)
181 points by serverlessmom on June 13, 2022 | 37 comments



I worked in this industry for a while (am still adjacent to it). This is a well-written guide, as far as I can tell.

The thing it doesn't mention is that very, very few companies actually care about complying with data security standards for the sake of keeping PII and sensitive data safe. They are more than happy to do the absolute minimum to pass an audit, and the absolute minimum is shockingly little.

What they really care about — and what lights a fire under them in the way that basic ethics and common sense apparently do not — is passing vendor security reviews.

Shout out to companies with very strict assessments, who actually pay attention and weed out companies with bad practices.


> Shout out to companies with very strict assessments, who actually pay attention and weed out companies with bad practices.

We do B2B software. All of our customers (small community banks) have been incredible hard-asses regarding PII visibility. As they should be.

Even in cases where our customers use a "cloud" solution, it's with some niche 3rd party vendor that would never grant access to someone outside the secret club. Every one of our product installations is effectively an "on-prem" deal.

All of the logging information that we get to see is redacted for PII by the customer's instance before it leaves their secure context. This is a zero-tolerance policy too. We anonymize even the most generic facts and ensure our hashes are salted as specifically as feasible. There are some cases where this burns us (e.g. knowing that the SSN contained a non-numeric character could be very helpful at troubleshooting time), but on the other hand everyone sleeps better at night knowing that PII is not leaking out into 3rd party buckets arbitrarily.
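A minimal sketch of what per-tenant salted hashing of log fields might look like, in Python. The `pseudonymize` helper, the field names, and the per-customer key are hypothetical illustrations, not the parent's actual implementation:

```python
import hashlib
import hmac

def pseudonymize(value: str, tenant_key: bytes) -> str:
    """Replace a PII value with a keyed hash before a log record leaves
    the customer's environment. A per-tenant key acts as the salt: the
    same SSN produces a different hash for each customer, so values
    can't be correlated across tenants or reversed with a precomputed
    table."""
    return hmac.new(tenant_key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical log record with its sensitive field redacted in place.
record = {"event": "validation_failed", "ssn": "123-45-6789"}
record["ssn"] = pseudonymize(record["ssn"], tenant_key=b"per-customer-secret")
```

The trade-off the parent describes falls out naturally: once the raw value is gone, a property like "the SSN contained a non-numeric character" is lost to the vendor unless it was captured as a separate, non-identifying flag before hashing.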

I would say that being very cautious about PII has opened up more opportunities for us. Our organization operates under the premise that if any one of our customers were to become compromised by way of our product (or consultation per the product), we are instantly dead and put out of business. We don't sell ourselves exactly this way on sales calls, but we do make it clear that PII is the #1 concern in our minds. For us, it is analogous to safety on a construction site or nuclear power plant.


If only you guys were the common case...


I've worked in this field, as well. Both implementing a FedRAMP'ed PaaS and sponsoring a CSP from the customer side where FedRAMP compliance was required. One thing that is often missing in these articles is compliance costs. Most don't realize that FedRAMP compliance at a High baseline is likely a $750K - 1M investment.


The cost is much higher than that when you account for the friction added to day to day developer work after compliance processes are put into place.

Adding 5% more friction on every step of development compounds a lot.


Then all the good developers leave. A series of decent people hire in, get frustrated, and quit. After a while you just have a core group of either incompetent or desperate people hanging on.

Management can ignore this for a few years. Rebooting things isn't too hard. But then the issues that could be ignored can't be anymore. Eventually you get sold for the intellectual property or customer base.


If a "good developer" doesn't want to deal with the overhead of security, then, frankly, I have to ask why they are a "good developer"?

Why do security and compliance frustrate "good developers"? Is it the extra steps required? Is it that it sometimes (often?) means that they don't get to work with bleeding edge/greenfield technology and feel left out?

This seems like the heart of the security issue, IMO. Sure, there are investors and managers who don't prioritize this work, and there are definitely concerns about the amount of investment it takes to accomplish ... but if a large majority of the engineering team were pushing for security and compliance as part of their normal routine, in the same way they push for things like automation, would that solve some of the other issues too?


So in that case it's much less about whether a company desires to comply and much more about whether it can realistically afford to do so?


Ouch, I had no idea it cost that much. What are the main cost areas?

What would you estimate compliance at a Moderate baseline would be?


1. Engineer costs - A PaaS at the high baseline will likely implement 300+ controls. It's been a while since I looked at an IaaS CSP's FedRAMP package, but they typically fully implement roughly 100 controls. The rest is on the customer to implement or engineer completely. Likely $300K-500K worth of engineering costs.

2. Assessment - The 3PAO assessor will likely be $100K-200K. Most first-time CSPs may require more than one assessment, as the process is usually (1) assess, (2) submit to the FedRAMP PMO, (3) they provide feedback, (4) you get limited time to implement fixes. If you cannot implement in sufficient time, you'll have to reassess. Note, unless you are AWS, Azure, or Google, the FedRAMP PMO may not prioritize you without sufficient customer support. As a result, your contract with your 3PAO may have expired, and you'll need to bring them in again.

3. Documentation experts – There's an art to generating the FedRAMP package. Engineers typically aren't good at it, and it often requires one level of abstraction above internal technical documentation. Having technical writing experts who know how to communicate the security implementation without diverging too much is a skill set. You share the bare minimum to achieve compliance, as there's business risk in sharing too much (e.g., leaking implementation details to a competitor or untrusted source). Also, the more technical details there are, the more audit questions often arise.

4. Control Implementation SMEs – Often your engineers don't know how to implement a required security control, or don't know what the compliance people really want. Many CSPs hire a 3PAO assessor to advise them on implementation. This cannot be the same 3PAO assessor that audits you.

5. Conflict between product/feature value versus control implementation - Sometimes a value or feature of your product directly conflicts with a control requirement. A good example is a CMS PaaS (WP as a service or Drupal as a service). Those CMSs often support user code, including user code that spawns processes. The high baseline requires process whitelisting. Solving this problem while not destroying that feature can be difficult or expensive.


Side note to that, a lot of the orgs that actually have medical PII (schools, especially) will be significantly less compliant themselves with things like HIPAA than the vendors they use.


It's important to take security seriously, and to be rigorous about it. It does not follow that people should take a rigid, maximalist approach to things like SOC2. What SOC2 wants you to do intersects only somewhat with what you should be doing to secure your company.


You have a really good point, thank you for your post and the reminder of this aspect!

I believe fully in the importance of the ethics and security that we as a society, and we who work in tech, should be honoring in full. It is disheartening to watch trusted companies utilize that -just enough- mentality to skim over the tops of audits. We have all seen what it looks like when companies operate from the absolute minimum, and how dangerous and disrespectful that is for everyone, company and users included.


> It is disheartening to watch trusted companies utilize that -just enough- mentality to skim over the tops of audits.

I'll be honest, this mentality bothers me. The point of independent audits and guidelines is to tell someone what the minimum bar is. If the minimum is 50, and the company goes from 20 to 50 in order to pass that audit, that's a good thing, not "doing just enough...to skim over the top." If you want to argue the minimum should be 75 instead, fine, but argue that the audit isn't good enough or that the guidelines are wrong, not that the companies are somehow unethical or immoral for spending more money than they need to in order to pass a vendor security review that is not going to award them any extra credit for effort.


Exactly, if you want me to wear 37 pieces of flair, why don't you just make the minimum 37 pieces of flair?


The point is exactly that: morality isn't for "extra credit". We shouldn't need to be rewarded to do the right thing.

I mean sure, maybe when it comes to smaller companies. But some of the companies with the biggest security blunders are those that have enough money that the security of their users' information should be a major priority, and those costs would barely impact their bottom line.

And also... Using this same line of argument, couldn't we also just argue that the cost of security compliance shouldn't be this high to begin with, so it's more accessible?


Isn't that supposed to be the whole point of these security reviews? To tell people where the line is? Saying "just do more because it's the moral thing to do" is not a convincing argument, because it doesn't tell anyone what "more" is.


> It also stipulates security control measures such as two-factor authentication (2FA) and access control for any accounts that store sensitive information, end-to-end encryption, training staff in data protection awareness, and a data privacy policy.

Uhhh does it? I'm pretty sure it does not explicitly stipulate this.


For ISO 27001:

- Two-factor authentication (2FA): Not stipulated

- Access control for any accounts that store sensitive information: Access control policy is required

- End-to-end encryption: Not stipulated

- Training staff in data protection awareness, and a data privacy policy: Training policy is required

For the controls not stipulated in the standard (e.g. 2FA, E2E encryption), you may find you ultimately need them once you do an information risk assessment. As long as you explain clearly why the risk is not significant enough to require it or have good alternate controls, you won't get dinged by the auditor for not having these.


Right, but the original article is talking about GDPR. There is nothing in GDPR that says "you need to use MFA".


I'm always wondering whether these very abstract and incoherent standards actually improve or damage _actual_ real world security practices. I've seen whole departments at companies that used to be very focussed on protecting customer and company data shift focus to compliance measures with the effect of actual security getting worse in the process.


The standards are not incoherent. However by design they need to be abstract to apply across very diverse businesses.

I've implemented ISO 27001 myself (solo dev founder, 6 person company, USD2mn SaaS). The divergence in quality of the implementation depends on whether the company is actually using ISO 27001/SOC2 as a tool to formally define, implement, and monitor information security, or finding the path of least resistance to accreditation.


Is there a guide to this for small teams and one man bands?


I would repeat user Aaronstotle's comment that smaller teams are likely not the right audience for these levels of compliance.

When it does become a concern, I think this post [0] put out by Latacora (no affiliation) is a great starting guide, although it is bent towards SOC 2 compliance. All of the items they mention are concrete wins for the business from a compliance/security perspective. The process of implementing just a few can help you better realize if your business is going to be able to muster the time and effort needed to continually work towards these standards, or if the business value isn't there yet.

[0] https://latacora.micro.blog/2020/03/12/the-soc-starting.html


I've worked at smaller orgs, and my advice would be to first ask yourself if it's worth the cost; these certifications are large time-sucks and cost-intensive.

If this is still something you want to pursue, hire experienced help. A majority of time spent in audits is figuring out what the auditors are looking for, and having someone experienced can save you a lot of headache.

Be aware that compliance is more than a one time thing, and during this process you will have created either an entirely new department, or at the very least multiple work-streams.


The recommendations for something like Vanta or Drata are spot on, and they have some somewhat cheaper competitors. So at the cheap end, say 5-10k for compliance management sw, then 7-10k for an auditor, and try to bargain a bit. Also maybe another 5-10k/yr on security-related sw.

FWIW it doesn't make sense to me for most 1-person SaaS: if you had 30k+ sitting around just for compliance, and presumably 5-10x that for other stuff, why not just hire a 2nd person? A small team can make sense if explicitly targeting a niche regulated space, though. Otherwise wait till you're beyond a 2-pizza team.

Note: I think it's an indictment of sw ecosystem that an earnest & diligent 1 person SaaS can't easily pass SOC2 style standards, or something equiv for the scale. SSO, RBAC, SIEM, IDS, default policies, etc -- like a SOC2-level heroku kit. Drata & Vanta are cool, but should shift further left.


I've successfully implemented ISO 27001 (CTO, Sole Founder, ~USD2mn ARR SaaS Business). Had a small team around me (5 people) but essentially implemented ISO 27001 myself spread out over 6 months.

It would be very difficult to write a complete guide - it's a long journey, and accreditation requirements are very specific to the business. Do you have specific questions or areas of concern in mind? I've been thinking I could write a few blog posts about my experience.


The idea is for you to stop.

Regulatory capture and learned helplessness. The costs of compliance require deep pockets. When you look at what they actually do, then it's obvious it's just theater. You'll see so many comments on HN that persuade you that security and privacy are too complex for you to handle (learned helplessness). It doesn't matter if your org is 2000 people with entire departments focused on compliance. Security and privacy will always be somewhere out there on the horizon. A mythical thing that no one can obtain. Definitely not a sole developer working alone in their bedroom. So better not try.

Which is a bit crazy that this blog post is targeting "developers". As if developers care about any of this stuff. Executives at large corporations do. But those same developers working at that same company are off in agile land working on micromanaged tickets. They don't have a say in SOC. Not in whether it's worth it, not in how information is collected. Not even in how information is stored, in most cases. Because, again, SOC is top-down. Not bottom-up. The same executives pushing SOC are the same ones pushing Google Analytics. Theater.


Curious - would you even pass SOC2 as a one man band? Assuming you did everything else right, the report would still highlight the risk of one person bus factor.


They don't just make things up in the report; there's a set of controls and risks they capture. "Bus factor" isn't one of them.


But things like "was there a review of this change?" are.


Reviews don't have to be done synchronously or by different people. You could get a SOC2 report as a single-person operation; like any company that doesn't exactly follow the mold of the modal SaaS startup, you'll end up having more of a scoping conversation with your auditor than your peer companies would, but you don't get charged extra for that.


Neat! So only being one person isn't a risk? Interesting.


I’m at a tiny startup doing SOC 2 Type 2 at the moment. As far as I can tell it’s a complete joke. We have written policies but no-one knows what they are and they are never referred to ever. The app is riddled with SQL injection and who knows what other vulnerabilities because the original engineer has no idea what he was doing (working on fixing this!). There’s no auditing, no logging, no security scans (unless Dependabot alone counts), one shared SSH key - I could go on.
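As an aside, the SQL injection class of bug mentioned above is usually fixable mechanically by switching to parameterized queries. A minimal sketch using Python's stdlib sqlite3 (the table and values are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES (?)", ("alice@example.com",))

# Vulnerable pattern: interpolating user input straight into the SQL string.
#   conn.execute(f"SELECT id FROM users WHERE email = '{user_input}'")

# Safe pattern: a placeholder plus a parameter tuple. The driver passes the
# value separately from the statement, so the payload is treated as a
# literal string, not as SQL.
user_input = "alice@example.com' OR '1'='1"
rows = conn.execute(
    "SELECT id FROM users WHERE email = ?", (user_input,)
).fetchall()
assert rows == []  # the injection attempt matches nothing
```

The same placeholder idiom exists in essentially every database driver and ORM, which is why this particular vulnerability class is considered cheap to eliminate relative to the rest of a compliance effort.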

And apparently it’s going quite well. No reason to think we won’t get certified. I don’t know whether to laugh.

Is this normal?


Good article!

Just a note: some links are rendered wrong. The Markdown [ ] and ( ) are rendered, instead of an <a> tag.


Fixed, thank you!


On this note, is there a guide for the minimum the average Joe SaaS developer needs to complete before taking money through a SaaS (e.g., setting up X type of business entity, getting insurance)? I pay tax in the UK, so any info for that would be fantastic.



