Google tries internet air-gap for some staff PCs (theregister.com)
138 points by beardyw on July 22, 2023 | 159 comments



This article is missing a key detail: that the expected workflow is that the workstations are basically just build machines.

My workstation is just an SSH server that I do builds on, everything else is on my laptop. I can’t remember the last time I used the internet on my workstation. I install packages but those come via a mirror, I scp files back and forth but that’s internal only.

Lots of people aren’t on this workflow yet, but I don’t think anyone is suggesting airgapping the main interface people are using.


All the coverage on this memo has missed this point. I guess they can’t imagine a company giving its employees two computers!

You can BYO ChromeOS devices too; they make great thin terminals to a remote workstation or cloud VM. I have a whole bunch from dogfooding preproduction Chromebooks.


I feel like much of the internal coverage missed the point. There were a lot of people arguing in bad faith, ignoring the fact that the whole idea was clearly tied to the rollout of the particular workflow where no internet would be needed on the machines in question. Most devs still aren't working like that yet, and so pushed back hard.


Yeah this is the unfortunate outcome of the past two years. Memegen is now primed to take anything in the maximally critical way.


Memegen is toxic and, today, a net negative to the company.


Memegen has been toxic for way more than two years. Even before they gave the holiday bonus devices to school children.


The days of good-faith, logical, respectful arguments are gone. Googlers (the ones on Memegen) have become a bunch of whiny, bad-faith actors.


Memegen is unbearably whiny now, but doesn't feel like it's because Google hires whiners. It feels more like valid employee feedback, accurately capturing valid worker sentiment toward shitty, randomizing leadership.


It can be both.


The cultural shift was basically overnight, after big decisions by leadership. Seems to point to the latter rather than the former.


A bunch of people with admin access used a setup like this for a while shortly after the China hack years ago. It’s not that bad.


My first job out of college I ended up with the absolute slowest box in the office, which definitely pushed me toward optimizing our app (it wasn't even good on fast machines, but it was tragic on slow ones); compile times, though, were murder.

We had a Sparcstation somewhere in the office being used as a mail server and nothing else. I don't recall how but I managed to SMB or NFS mount a directory so that I could edit code on my piddly little desktop but compile it on the Sparc. First time Java really worked as advertised for me.

Saved me loads of time and let me iterate on ideas that I don't think I or my boss would have had the patience for otherwise.

I'm not entirely keen on language servers, but those would be another area where distributed computing could come in handy. I think what might be missing, though, is a multitenant version. Everyone on my team has a duplicate model of our dependencies loaded into WebStorm and it's a pig.


>I ended up with the absolutely slowest box in the office, which definitely pushed me toward optimizing our app

Aside, but I 100% believe most developers of user-facing programs should be forced to use their app on a slow computer/phone with a slow internet connection before being able to ship it.


By “bunch” you mean the entire SRE org. I felt like it was a good setup, but I also felt that the long-term strategy of reducing the access necessary to operate the infrastructure was a good one. But why not both, as the child in the meme asks?


Far from all; mostly Traffic and a few others.


Is the internal discourse aware of “cleanroom” or has that institutional memory disappeared?


Sounds like fighting against that sort of miserable workflow is part of the point.


It seems so weird to me to want to use Linux on the desktop. This is not The Year.

I decided at the very start of my career, before Google, to use a laptop as the primary interface and only use the workstation / dev environment / etc. remotely. That way I have the same working experience at my desk, at a cafe, and at my desk at home. The only thing that changes is the number and size of monitors. It's worked out really well, and during the COVID era it only got better due to investments in remote workflows.

I wouldn't even notice if my VM lost internet access and root.


What's wrong with Linux on the desktop, and why a "classical desktop" at all? I've loved my tiled WM setup for more than 10 years now, and I absolutely hate my corp Windows desktop for everything it does to me. My girlfriend similarly regrets choosing a private MacBook Air and swears never to buy one again every time she uses it (but uses it too seldom to consider just installing Ubuntu, otherwise she would)... so maybe we are strange people, but

> It seems so weird to me to want to use Linux on the desktop. This is not The Year.

It seems so weird to me that everybody considers their Windows or Mac the desktop ;)


I would still want Linux on my laptop.

It has been “the year” for Linux on the desktop for the last 10 years, even more so if you use Emacs as your WM.


> It seems so weird to me to want to use Linux on the desktop. This is not The Year.

Why? I've used it for many years and still like it very much. I use Windows too, but I do most of my work in Linux. Why would this be weird?


I've been using Linux since I started nearly 5 years ago, plus the previous 8 at my last company. I don't have any major issues.

SSH disconnects are still annoying. And mosh/tmux/screen don't do it for me.


Mosh and tmux have worked great for me for the last 12 years. Different strokes I guess.


I understand that a large proportion of Googlers use Google's slightly customized Linux on laptops (which is also the default/recommended), which is completely compatible with your workflow, so I'm not sure what argument you were trying to make (unless you don't count laptops as "Linux on the desktop"?)

I can only infer that you're suggesting people would only want to use MacBooks? Setting aside the fact that that is wrong, you don't make any argument as to why that would be.


Chromebooks are great too.

gLinux laptops are OK I guess? I dunno. Every time I've tried to use Linux as my main interface, it's been crashy and confusing. Maybe it's gotten better since the last time I tried.


The other option is provisioning dev servers as VMs on big hosts, which some other companies do. That’s where the build and development happens, using an IDE locally on the issued laptop which uses remoting.


I suspect it's mostly a question of who supports the build machines: desktop support or network support.


True. In the case of my current gig, that's not an issue as there is a huge organisation supporting that and live services.


I've been 100% ChromeOS for ten years.

All my dev work, personal, and internet while I work. KVM to swap. External screens.

Strict separation of work and life. No key logging like on the work system. No filters.


I also ran that setup when I worked at Google. I liked it a lot. ChromeOS was a great desktop OS; even when I was in the office, I just ssh'd to my Linux workstation from a Chromebox. Thus, "local" and "remote" had the same experience.


Good for you; my personal life often involves connecting to real hardware in the field, and ChromeOS doesn't work for that.


Sure, they give you two computers. But you better not ask for a GPU!


Do you get one of those M.2 Corals to click in there?


Are you sure? Why would a build machine need access to GMail?

> The report says Google's new pilot program "will disable Internet access on the select desktops, with the exception of internal web-based tools and Google-owned websites like Google Drive and Gmail."

https://arstechnica.com/gadgets/2023/07/to-defeat-hackers-go...


> Are you sure? Why would a build machine need access to GMail?

Because some people like to use remote desktop with it. And because some people's "build machine" is a VM, or a physical desktop in the office for some others.


You can access your mail on your own computer, so it's not necessary to access Gmail on the remote desktop machine.

Sounds more like Google isn't talking about workstations but about the normal computers people work on, just with limited internet access.


My build machine sometimes needs to send logs to attach to bugs or have other humans inspect them. This obviously doesn't have to be Gmail but it comes pretty handy.


Bug trackers and build logs are all internal systems, though.


So is Gmail - it’s Google after all.


Why would a "desktop" be a build machine in this era?


Laptops almost always have cooling that definitely won't sustain 100% CPU or GPU usage for 2 hours on a build job. On my desktop with fairly standard, non-fancy air cooling, I could run at 100% for hours, and the only thing that would happen is the fans spinning loudly. On my work laptop, the poor thing (with an i9!) begins to heat up like crazy after 5-10 minutes of virtual machine usage.


I have noticed here on HN that a lot of people don't have desktops and think laptops are just as good in terms of performance. lol


Why not?


GPU


Many people at Google still use desktop Linux workstations. Those would be airgapped too under this plan and force these people to change to another workflow. There are at least hundreds of people who have been there 10+ years still using desktops for all development. For example on the team I was previously on, 4/8 people used a desktop workstation as their main device.

I used a cloud VM as my main machine, and still used the internet all the time. I used GitHub to sync my configuration (through a proxy it would be out of date, and there'd be no way to push), amongst other things.


~5 years ago I had desktop Linux and a Chromebook. All compilation happened via cloud build and the preferred IDE was the cloud-based web app. I didn't really use my desktop for anything, not even ssh.

I switched to a Chromebook in the pandemic and honestly it was fine. My workflow barely changed. I wasn't a vim fanatic at the time so I didn't really care.


Many people don't do that. They sit in front of a desktop with a monitor plugged into it and use the desktop for everything, including Stack Overflow etc.


They don't need the desktop for Stack Overflow. They're launching a browser. They can launch the browser on any computer without breaking their workflow.


That doesn't make sense to me. You can't copy-paste into and out of your editor across machines.

You see an error message on your desktop, you want to search for it. What do you do?


You can in most modern IDEs/editors (e.g., editing over SSH, or clipboard support over remote X or the terminal).
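
For the terminal case, one trick worth knowing: many terminal emulators honor the OSC 52 escape sequence, which lets a process on the remote machine set your local clipboard. A minimal sketch in Python (assuming your emulator supports OSC 52; under tmux you also need set-clipboard enabled):

    import base64, sys

    def copy_to_local_clipboard(text: str) -> None:
        # OSC 52: asks the *local* terminal emulator to set its clipboard,
        # even though this code runs on the remote host over SSH.
        payload = base64.b64encode(text.encode()).decode()
        sys.stdout.write(f"\033]52;c;{payload}\a")
        sys.stdout.flush()

    # e.g. push a build error from the remote box to your local clipboard
    copy_to_local_clipboard("error: undefined reference to frobnicate()")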


The editor for people I'm talking about is running locally on the desktop. The user is viewing a window on a monitor attached to the desktop. How does this work without changing that workflow?


> the preferred IDE was the cloud-based web app

Possibly; not everybody is OK with using a not-so-good editor, having no debugger integration, and keeping their SSH and PGP keys somewhere other than a local machine.


The in-browser editor at Google is actually very good and has built in debugging support. Some people still don't use it though, and this change seems to ignore that.


>keeping their ssh and pgp keys somewhere else than a local machine

This is Google. Who is using ssh keys for anything?


Newer gpg supports them for signing… do you not sign commits?


As I said, many people are still not on this workflow, but it’s clearly the workflow everything is moving towards. I don’t believe anyone wants to take the internet away from people not on this workflow, but I’d give it up and be essentially unaffected, and I think there’s a growing minority in that position. Therefore, it seems like a reasonable experiment to pursue to inform future development.


I have a number of co-workers that do the same, but I can't live without my full Linux desktop environment. This is especially true if you use IntelliJ or other desktop apps. Yes, you can remote desktop, but it's just not the same.


You can run this workflow with a Linux laptop.


For many people the experience of working from a laptop is poor. Mixed screen resolutions and density. Audio devices constantly changing as I plug into or out of my docking station. Yet another thing to fight for space on the desks they're always trying to densify. Etc., etc.


For various workflows, the only viable solution is to remote desktop into my workstation and run the browser there. SSH isn't enough. And don't get me started on the disaster of trying to do things locally (Cmd+C) and remotely (Ctrl+C) simultaneously or the awful experience of mixing screen DPI when connecting a linux laptop to external monitors...


VS Code over SSH? JetBrains Gateway? I think both work decently.


VS Code Remote I can agree with. Gateway feels like such an afterthought. So many bugs, and it's very resource-intensive in my experience. I really like JetBrains products but their remote offerings are lacking. Makes me wonder if that's why they are working on Fleet, to fill that gap.


If all you need is a browser, you can use SSH as a SOCKS proxy for your browser.
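
A sketch of the setup, since it's easy to miss how little is involved: `ssh -D 1080 user@workstation` opens a local SOCKS5 listener that egresses from the workstation, and the browser just points at localhost:1080. Any SOCKS-aware client works the same way; in Python, assuming requests is installed with SOCKS support (pip install "requests[socks]"):

    # Assumes `ssh -D 1080 user@workstation` is already running locally.
    import requests

    proxies = {
        "http": "socks5h://127.0.0.1:1080",   # socks5h: DNS resolves remotely too
        "https": "socks5h://127.0.0.1:1080",
    }
    # This request leaves the network from the workstation, not this machine.
    r = requests.get("https://example.com", proxies=proxies)
    print(r.status_code)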


We can't use proxy servers on corp-provisioned machines.


Why do we have to tell other people how to work? Why can we not just empower people to work however they'll be most productive?


Full IDE on the local machine that remotes to a devcontainer.


We do the same. Codespaces, gitpod, or coder.dev all provide for this kind of workflow, and honestly it just makes so much sense. Corporate desktop for comms and internet, ssh to a devcontainer (or VM if that is desired) for development activity.


Exactly. It's really just about isolating builds and code that runs from everything else. If you can afford the infrastructure, and can afford to make the infrastructure work well for everyone (arguably much harder), then it's a good idea with few downsides.


I wonder if more package management tools should adopt the zero-install approach of Yarn Berry. You don’t even need a mirror: everything is right there in the repository. Want to update a dependency? That’s a commit, thank you very much.
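
For reference, the whole trick is committing the package cache itself. A sketch of the .gitignore whitelist pattern from the Yarn Berry docs (exact paths can vary per project):

    # Yarn Berry zero-install: the cache and PnP loader live in the repo
    .yarn/*
    !.yarn/cache
    !.yarn/patches
    !.yarn/plugins
    !.yarn/releases
    # .pnp.cjs and yarn.lock are committed too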


I imagine parent commenter meant OS-level packages.

I.e. things like installing and updating clang, cmake, etc


Yep. It's one of the few things I'm doing on my workstation that ostensibly needs the internet, except because of the mirror it also doesn't.


How do you remote access an air-gapped workstation? Seems like you'd need to be constantly switching whether your laptop is on the internet-connected network or the isolated network. If the laptop can switch between them automatically, wouldn't that make it possible for an attacker to jump the gap?

Even just having hosts that are sometimes internet connected and sometimes on the airgap network will greatly weaken the isolation. Stuxnet could cross an airgap with just static media, allowing thousands of computers that sometimes connect to the internet across the airgap seems like a fatal weakness.


The other key detail missing is presumably that the workstation is not air gapped at all, just prevented from accessing the internet with some firewall rules.

For those that don't know: air-gapped means completely disconnected, as in literally pull the network cable out of the back and never plug it in again. File transfers have to happen via some physical medium (traditionally write-once CDs/DVDs). You can have an air-gapped network so long as the machines are only connected to each other and there's no physical route whatsoever to the internet.


The other key detail missing was that this was just an experiment that was going to be active only on a small set of machines for a limited time. They wanted to get data on how much it impacts users' workflows, what the opt-out rate is, etc.

There's not even a current plan to continue limiting access after the trial period ends, much less a plan for expanding it to more machines.


Unless I've misunderstood the documentation, I think JetBrains' new Fleet editor is planned to support this sort of workflow.


A lot of people I know don't use them like that. They have a chromebook and RDP onto their cloudtop and never close it. One thing I like(d) about google is that they let you bring whatever workflow you're most productive with, rather than prescribing an "expected" workflow on you.


How do you ssh in to your workstation? Via some internet-connected bastion host?


Tldr: no, but it’s complicated. Have a read of the BeyondCorp paper for more details.


If these machines are denied outside internet access, but still connect to an internal network on which other machines have outside internet access, then that's a firewall, not an air gap.


Exactly. The author doesn't know what an airgap actually means.


Multiple times I've seen companies be the originator. Quite simply, it's insurance fraud: they demand an air gap, you say you have one.

You can certainly argue that, at a general level, air gaps don't work for the same reason teen abstinence doesn't work: they don't happen. But in specifics, particularly for infosec professionals, it should be your job to call out dishonesty, not to nod along sagely and say "air gaps don't always work". If your air-gapped machine is breached, there had better be a USB stick lying around, or a physical configuration change.

JFC, my Xbox is not airgapped.

(actually it currently is. But you get the point.)


You know you're in for a treat when you see a thread wanking itself over some industry term. Usually it's some brittle, partial definition being forced onto an entire industry. Idk if you've worked in a modern air-gapped network (jk) but the vast and overwhelming majority of them do in fact have a physical connection to a network capable of reaching the internet.


Yeah, this discussion is really weird. An actual air gap would be great for security and a huge pita to work on. This half-assed fake air gap seems like a bunch of security theater that's going to make people's day-to-day jobs a lot more annoying.


Thank you for pointing this out, I was extremely confused.


If you take away the word “workstation”, this is what many dev environments already look like: a mini version of production, which includes restrictive network rules.

You use the internet from your laptop, not the workstation.

Also, the article is using “air gap” wrong - it refers to an actual physical disconnect, which is not what this is. Only some firewall rules are apparently getting changed.

Disclaimer: I have no privileged information, only common sense from working in security and at FAANGs.


Yes, but if the headline was "Google plans to firewall build servers off from the internet" we wouldn't have read the article or had any discussion about it... except maybe some random "wait, they don't do that?!?"


When I worked on Azure compute we had dedicated laptops that were basically bricks. All they did was connect to production. No internet, no admin rights, a limited set of software. This is a pretty reasonable security move.

This approach from google is basically the opposite. No internet on your workstation but your laptop works like normal.


The flip side of this is that you can ease the paranoia on the coding machines. You should probably be ready for 5% of them being compromised at any given time.


The workstations are the coding machines. You're not allowed to have Google code on your laptop.


I thought BeyondCorp zero trust was supposed to completely and totally solve all of this such that air-gaps and network compartments are a thing of the past.

Air-gapping is creating a system-low zero access enclave. No different architecturally than running a separate access gateway.

Or is this a case of "yeah nobody ever actually believed that?"

Having run classified networks, I can say there's absolutely a need for compartmentalized system-high networks.


I always thought BeyondCorp was intended to replace the VPN infrastructure with (effectively) a TLS reverse proxy gated by rules, SSO, device posture inspection, etc.

That’s how I always interpreted the marketing and technical documentation anyway.

I would be surprised if they’re not already running PAWs for things like administrative access to production GCP primitives and similar, even if they’re also running PAM, hardware authentication, and so on. I know Microsoft does for admin privileges to the Azure fabric.


I'm simply responding to Google's own language:

"We are removing the requirement for a privileged intranet and moving our corporate applications to the Internet."

That's from the original ZT paper, and it makes the HUGE claim that all access in computing is more secure with ZT than with physically (not just logically) separated networks.


I think "corporate applications" are different from "production infrastructure" and "developer tooling"


I see what you’re saying. I agree.


My interpretation of BeyondCorp is that it's about creating a very small inner perimeter where only the prod machines have access - and those “airgapped” super-admin laptops that the article talks about.

99% of dev and admin work then takes place outside that perimeter, outside the VPN, by authenticating machines and encrypting traffic with TLS. Since you will always expect compromised machines and bad actors in this area outside the inner perimeter, a four-eyes principle for any critical actions, such as code changes and configuration changes, is necessary.


This is historic practice in any secure work environment: government departments, healthcare, financial institutions, pharma, and energy. Based on the level of protection required, they lock down access accordingly. Allowing USB at a workstation is generally not permitted by any such employer. In the most secure environments they disconnect your workstation from the outside world entirely; it's generally connected only to the internal network. If you require internet access you can apply for it and get approved, and it will be monitored by the network admins.


Why is this called an air gap? It’s not. They are just firewalling workstations off from most (not all) of the public internet.




I remember working in 2010 at Tata Consultancy Services with BofA as the customer; they would make the service provider go through these hoops. The employees of TCS would connect over TeamViewer into a VDI that would have access to the actual server where they would code. It used to be a hub model, i.e. the VDI was somewhere on the east or west coast and the employees were in India. The experience was crappy, and since the customer was getting billed per hour, no one cared.


That sounds terrible. I worked at a telco where we experimented with having a development environment on VMware, twelve VMs in total per developer. You'd remote desktop to one of the VMs and use Visual Studio from there; it was absolutely terrible. I can't imagine throwing TeamViewer across oceans into the mix, I'd go absolutely nuts from the lag alone.


This kind of development is a terrible experience if the overall infrastructure isn't set up to support development too.

To make it successful, you need not persistent VMs that individuals connect to, but ephemeral VMs that get created/deleted when a user needs one.

Those ephemeral VMs then need to be able to connect to the rest of the infrastructure that supports development - your artifact repositories, version control system, docker nodes, k8s clusters, etc.

Your artifact repository then needs to mirror public repositories where your source packages can be found, be it pypi, github, golang, helm, docker hub, etc. You will then have to set up your IDE or shell package manager to use this artifact repository as a proxy.
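
The client side of that is usually trivial; e.g. for pypi it's a one-time pip config pointing at the internal mirror (the hostname here is hypothetical):

    # ~/.config/pip/pip.conf
    [global]
    index-url = https://artifacts.internal.example/pypi/simple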

The developer tooling is usually an entire team, and so is the infrastructure for the VMs.

Not an easy or cheap thing to set up. But it can be done so that the developer experience is good and you don't have developers running random versions of software.


This is how quite a few enterprises run their infrastructure.


What's a BofA? What's a VDI? I feel like Hacker News is filled with acronyms that take some effort to parse.


BofA is Bank of America.

VDI is Virtual Desktop Infrastructure. https://en.wikipedia.org/wiki/Desktop_virtualization


Bank of America. Virtual Desktop... Interface or somesuch.

I like HN being full of acronyms; it teaches me what's important enough in an industry to shorten.


BofA is Bank of America. VDI is virtual desktop infrastructure (like Citrix).


""" The company will disable internet access on the select desktops, with the exception of internal web-based tools and Google -owned websites like Google Drive and Gmail. Some workers who need the internet to do their job will get exceptions, the company stated in materials.

In addition, some employees will have no root access, meaning they won’t be able to run administrative commands or do things like install software. """

-https://www.cnbc.com/2023/07/18/google-restricting-internet-...

This is pretty standard across the industry for some workstations/build machines.


Where I work restricts admin access. We can request it temporarily and all actions are logged.

Works just fine on macOS!


I thought Google’s security is so impeccable that getting a reproducible RCE on engineering workstations doesn’t matter? https://news.ycombinator.com/item?id=35581532


This is exactly the sort of attack this change helps with, as pip would not work at all on these machines. Probably the only way to stop these “social engineering” (as Google characterized it) attacks.


The scariest part of that thread is the many people saying that closing your eyes and yelling "zero-trust" means that you don't need to protect your devices from compromise.


When your entire business is built around public internet services and public internet enabled devices in the hands of end users, this approach feels like losing the plot. Compare with something like BeyondCorp, which actually advanced the state of practice in corporate security just a few short years ago, in a super internet-positive way.

What'll be their next old-school corporate move? Time cards? A ban on phones with cameras? A buggy middlebox that forges TLS certs?


I nearly lol'ed at the mention of time cards, but even though it seems kinda like Facebook banning Facebook, actually all of these could have some merit.


Or like Apple banning smartphones among the people that do iPhone development


If Bard provides at least somewhat useful suggestions for the quirks of a particular language (I'm looking at you, C++), and/or there are whitelisted or mirrored sites like cppreference/godbolt, then I suspect I won't need the actual big internet for work at all.

Most of the reference documentation is internal anyway: project sources, other Google sources, and internal docs/guides/design docs.


I find that idea fascinating but I doubt it will fly.

Today's LLMs are no suitable replacement for documentation, in my experience, because their knowledge is so sparse. You won't notice it immediately because they fill the gaps with plausible nonsense. Also, training a model with domain-specific knowledge is not a realistic option for most of us as of today.

For having all the reference documentation locally (possibly indexed in a vector database and accessible to the LLM), I'm doubtful as well, since it is so hard to determine scope beforehand. A couple of years ago I tried to program off-grid and prepared a MacBook with Dash (a macOS offline doc reader) and all the reference docs I thought I'd need. It was a nightmare, and that is from a dude who learned programming before the internet, solely from offline docs.


I worked in an offline environment for a few years. It was not a nightmare. It's less productive, that's for sure, but not by a huge margin. And it compensates with a lack of online distractions.


Working offline is perfectly doable. I grew not to rely on connectivity when I was living in a rural area with spotty/slow 3G connectivity at best.

Not relying on network connectivity in my entire pipeline is still today a big discriminator in the stack I choose.

The biggest downside was, and still is, documentation. I had a script that scavenged the installed packages and downloaded all relevant documentation packages automatically, along with a local indexing service (I'm still using and recommend "recoll").

There is a humongous gap between libc/posix (anything instantly accessible with "man" with stunning quality) and pretty much everything else. Even python, which has a decent "online" (built-in) reference, is not as convenient and easily searchable when looking at the html manual itself.

A local search index doesn't have the inference power of something like Google to point you to the right example when you're looking for something in cppreference.

And once you get past the 10 best-documented packages you have on the system, you realize how downright badly everything else is documented and how much you rely on the search engine to fill the gaps from unofficial sources.

For me at least, documentation was always THE problem.


I work in customer support, and while I haven't yet used Bard (because at the time it wasn't available to me), I have used ChatGPT to tell me what the customer is actually asking of me when I didn't understand their emails. And thus far, it's been quite helpful. I even asked it once to rewrite the greeting part of an email so that it included some empathy (I'm kind of emotionally inhibited, as my doctor put it), and judging by the customer's response, they were quite pleased.


How are people getting things like this past management? I work in a big company and it is stressed to us that the last thing we ever want to do is paste proprietary code, customer info, internal documents, etc. into a cloud LLM. Especially since there's every chance that the prompts could become part of the future training set.


They probably don't even know, and why would they care? OP never said they were pasting personal information or proprietary code.


At my job, any information we get from customers is treated as private/personal. It would be a fireable offense to load it up into some third party database or service.

I have not, will not, and am not allowed to copy and paste anything into chatGPT from my company that I wouldn't be comfortable putting on twitter. I agree with this policy.


A support ticket or live chat with a customer is customer data. With things like GDPR you need to know exactly where customer data is stored and for how long. Once you're pasting it into cloud LLMs, that data is out of your control and you're breaching GDPR.

The last thing someone expects when they're replying to a support ticket is that Google or OpenAI is being fed that info and it's being put into a training dataset.

We all value our privacy but it seems accepted to breach other people's privacy for the sake of using an LLM to make your job easier.


Nobody really ever cared about privacy, they are perfectly okay with it being violated to the same extent as people living under a totalitarian regime for 8 hours out of every day. The security requirements that would need to be imposed in order to provide the assurance of privacy of data that you have given to a third party is so extreme as to be laughable even to imagine an attempt to implement it. The CIA cannot manage that, what makes you think Bank of America can? I also find it interesting that in attempting to protect a clients privacy, we must give up our own privacy and submit to workplace surveillance. Well, everybody is somebody's client and most everybody has clients, so we have a bit of a paradox on our hands it would seem where everybody has to be monitored at all times in order to ensure that nobody is monitoring others at all times....

If GDPR were actually enforced consistently, or similar laws covering more specific information that exist in the United States, the economy would come to a screeching halt and there would be warlords fighting it out over what's left of civilization within a week.

I'm not really sure what the solution to this is, only that we are lying to ourselves that we are anywhere close to having figured it out, and anybody involved that tells you otherwise (people selling ztna, etc) is willfully ignorant or a conman.

There is also interesting discussion to be had about the meaning of all of this. It's not clear at all that we have ever had a society that guaranteed the level of privacy that some seem to expect. We always had other ways of monitoring people and violating privacy that did not require electronics. Community involvement to a level sufficient to ensure nobody is totally screwing everybody else over has always been a requirement if you wanted to not be living in a cardboard box or a prison. I happen to think that is better than a total state or the modern corporation, but recognize that others can have different views on this.


I don't get this perspective at all. Seems like a new breed of hyper-legal worry about litigation yet to happen. I don't want to live in the world you live in, buddy. Please don't take it as a personal attack.


It's not about litigation. Companies have actual secrets, such as upcoming products, mergers, and acquisitions. Putting a bunch of non-public information into these things (often run by competitors!) may inadvertently leak that information in generated text, and that's assuming there's no such thing as corporate espionage.


It's not a new breed of hyper-legal worry. It's basic privacy that we all expect. I don't expect a customer support advisor to be pasting my messages into an LLM without my knowledge. And no sane company would allow employees to do it with any sort of customer data or proprietary information.

We have things like GDPR which are all about protecting your data. People are breaking the law and violating the rights of customers for the sake of making their job a little bit easier.


> judging based on customer response, they were quite pleased

I’d like to know if chatgpt agrees.


In summary, it's important to remember that the grandparent comment may have been written by a Large Language Model. Also, I can't help you do that.


cppreference can be downloaded and used locally.

godbolt would be a weird choice to whitelist if you go through the trouble of airgapping in the first place. All it takes is one accidental copy&paste of a sensitive code snippet and it's there for the world to see with no undo.


I hope they also remove USB drives, don't let them connect unauthorized Bluetooth devices, and keep the laptops in the office. Otherwise most users would just start working on their home PC, and then you have their work account compromised on their home laptop. With USBs, you'd be surprised how many air-gap-breaching worms there are.


The entire story isn't about laptops. It's about the workstations that may be used to work on code locally (or also build it locally). These machines already aren't allowed outside of premises.


I work at a bank and I know our security is higher than most, but I'd be shocked if USB drives were allowed in Google offices.


At Arm, IME, they weren't banned, just training on not picking up drives from the car park etc. - I'd expect the same of Google. At BAE Systems I physically couldn't access the USB (or any) ports on my desktop computer, but that doesn't really work when you're given a MacBook and are able to work from home and coffee shops etc.


You can block USB mass storage devices through OS policies; it doesn't prevent everything though (people have demoed frying computers using capacitors, or using USB keyboards to open a terminal and execute code).
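
On Linux, for example, one common policy is a modprobe rule that keeps the usb-storage driver from ever loading (a sketch; HID devices like keyboards still work, which is exactly the gap the demos above exploit):

    # /etc/modprobe.d/block-usb-storage.conf
    # "install" runs the given command instead of loading the module,
    # so usb-storage can never be loaded.
    install usb-storage /bin/false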


>but that doesn't really work when you're given a Macbook and able to work from home and coffee shop etc.

MDM like Jamf and/or EDR/antivirus like SentinelOne and Carbon Black can disable USB ports/anything in software. If your account is not admin/root you cannot remove them, either.


True, I just think it's more likely Google IT provides external hard drives for extra capacity on request than that they block it. (At least for most roles, ones it's not also air-gapping!) I may well be wrong, no insider knowledge, just my outsider impression of the company.


They have YubiKeys, so unless they only use NFC, it's likely. Also, Chromebooks and USB-C.


I worked at a pharma company that disabled internet access for most people. Many downloaded things on their phones over 4G and transferred them to their computer with a USB cable; it was pretty pointless.

That said, I'm sure Google's IT are far more competent than that horror-show.


Anyone mentioning an air gap is often scorned these days, because in practice it is rarely used, rarely used for long, or rarely survives “this one workflow we really, really need and is commercially important”. However, there are occasions and situations where it can be workable. An air gap is a weapon. Know when to use it.


Seems like Google could just build some routing around their existing archive service and give all of their employees a stale copy of the web for 99% of cases. Use LAN for communication, etc.


Seems a bit extreme, but I suppose they have to at least try some mitigation strategies.

The 7th-largest DDoS attack in history last August, a Gmail cyberattack a few hours ago. And that's just what we know about.


> Those who choose to participate will have their general internet access removed along with root privileges on their individual boxes if they had that

Is it just a software thing? For example, IP blocking via iptables? 0-days in OS kernels are not super surprising these days; I'm not really sure a software lock would help that much.
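
If it is just host-level filtering, it's roughly this (an illustrative sketch, not Google's actual setup):

    # allow loopback and internal RFC1918 ranges, drop all other egress
    iptables -A OUTPUT -o lo -j ACCEPT
    iptables -A OUTPUT -d 10.0.0.0/8 -j ACCEPT
    iptables -A OUTPUT -d 172.16.0.0/12 -j ACCEPT
    iptables -A OUTPUT -d 192.168.0.0/16 -j ACCEPT
    iptables -A OUTPUT -j DROP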

Maybe they should just give their employees two computers: one air-gapped for accessing internal systems, which must be kept in the facility of the company that issued it, and another one fully online for accessing Stack Overflow, which must not store any company information.


Looks like so:

> Google's tools and office software accessed via the web will still be accessible to those cut off from the internet generally. Workers in the program who require internet access for their jobs will be able to get exceptions.

The headline twists the definition of "air gap".


Eh, from their perspective Google services are in prod, not on the internet.


Unless your production is some really sensitive, also air-gapped setup, perspective doesn't change what's going on.


I think we’ve already established that the term air-gap has been misused in the article.


It's almost certainly a network-based firewall, not a host-based one. And if you use multiple firewalls from multiple vendors with different OSes running on the firewalls, theoretically, an attacker could have zero days on all of them, but it's statistically much less likely than an attacker having a zero day on only one popular consumer/server OS.
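
Back-of-the-envelope: if an attacker holds a usable 0-day against any given firewall OS with, say, an independent 1% probability, getting through two different vendors in series needs 0.01 × 0.01 = 0.0001, i.e. 1 in 10,000 versus 1 in 100 for a single box. The assumption doing all the work there is independence between the two vendors' bugs.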


> Maybe they should just give their employees two computers, one air-gapped for accessing internal systems

And why do you think that may not already be the case? Even other big tech companies already give out two computers.


It is the case at Google and none of them are actually air gapped in the real meaning of "air gapped".


Maybe the StackOverflow bit could be replaced with a local LLM assistant?


Or a local copy of SO, there are torrents.


It's conceivable that the Google index has a copy of SO


Or they could use Qubes OS, which in some cases is more secure: https://www.qubes-os.org/faq/#how-does-qubes-os-compare-to-u...


Maybe deploying a trainload of Commodore 64s would work.


some staff = most engineers who work with the monorepo


If only it included meetings and messaging. This way is silly


We all know how this will be used in the next firing round.



