Hacker News

Also, any code that isn't pushed to a remote doesn't exist. Or, it won't when your hard drive unexpectedly fails and your feature branch disappears. =)



You should be using file backup on any system with code on it. It's so cheap to do this today that it's a no-brainer. Apple Time Capsule, CrashPlan Pro, etc.


The only non-replaceable thing on my development machine is the code. It makes more sense to me to always push my feature branches up to the server, which is backed up, than for each dev machine to be backed up separately.


Why don't you then just backup your dev machine's code directories to another machine?

Backup is good because it's automatic. Saving somewhere else is nice too, but it's not really backup.


It'd be more effort for no gain - and it would encourage me to keep changes locally, which is bad for collaboration.


Thrashing the index with half-completed features is ALSO bad for collaboration.


It's good that my colleagues can see what I'm working on. It's not going to slow them down when they don't want it to, since git only fetches the branches you ask for. So no, I have to disagree.


If you rebase a lot, then you want to avoid pushing code constantly until you get it where you want it.


So push to a remote feature branch that is named after your username, then rebase the changes in that feature branch onto the 'trunk' branch and push the result?

The D in DVCS is all well and good, but there are business reasons for having a canonical store of all source materials.
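A sketch of that workflow, assuming a `<user>/feature` naming scheme (all names and paths below are invented for illustration; the bare repo stands in for the canonical remote):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init --bare origin.git                   # stand-in for the canonical remote
git clone origin.git work && cd work
git config user.email alice@example.com && git config user.name alice
echo base > file.txt && git add file.txt && git commit -m 'base'
trunk=$(git branch --show-current)           # the trunk branch, whatever git names it
git push -u origin "$trunk"
git switch -c alice/feature                  # user-named feature branch
echo wip > wip.txt && git add wip.txt && git commit -m 'wip'
git push -u origin alice/feature             # backed up, but understood to be rewritable
git switch "$trunk"
echo more > file.txt && git commit -am 'trunk moves on'
git push origin "$trunk"
git switch alice/feature
git rebase "$trunk"                          # replay the feature onto current trunk
git push -f origin alice/feature             # rewrite the published feature branch
```

Because the branch is namespaced to one person, the force push rewrites only history that nobody else should be building on.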


One of the many reasons not to rebase code that's been pushed already.


As far as I know there's only one reason not to rebase code that's been pushed.

My point was that pushing constantly as a backup mechanism isn't an option if you intend to rebase frequently (unless you're the only developer on the project).


I have a personal clone of whatever project I am working on; any branches that I publish can be changed at any time with a push -f.


Depends on your team. I'm on a small one, and I often rebase and force push a branch I'm working on, because I know nobody else is working on it.
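For what it's worth, `git push --force-with-lease` is a slightly safer variant of a plain `-f` here: it refuses the push if someone else updated the remote branch since you last fetched. A minimal sketch (repo and names invented):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
git init --bare origin.git
git clone origin.git work && cd work
git config user.email dev@example.com && git config user.name dev
echo one > f.txt && git add f.txt && git commit -m 'one'
git switch -c topic
echo two >> f.txt && git commit -am 'two'
git push -u origin topic
git commit --amend -m 'two, reworded'        # rewrite local history
git push --force-with-lease origin topic     # fails instead of clobbering if the remote moved
```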


I suppose if you were really paranoid you could use 'git archive' to snapshot to dropbox.
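That would look something like this; the synced-folder path is illustrative:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
mkdir -p Dropbox/snapshots                   # stand-in for a Dropbox-synced folder
git init repo && cd repo
git config user.email me@example.com && git config user.name me
echo hello > readme.txt && git add readme.txt && git commit -m 'snapshot'
# Pack the tracked files at HEAD into a commit-stamped tarball.
git archive --format=tar.gz \
  -o "../Dropbox/snapshots/repo-$(git rev-parse --short HEAD).tar.gz" HEAD
```

Note that `git archive` captures only the tracked files at the given revision, not the .git history or uncommitted changes, so it's a point-in-time copy rather than a full repo backup.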


All of my git repos live in a subdirectory of my dropbox as-is. I also compulsively push to remote. I also use Time Machine. I also rsync my home directory to an external drive in case Time Machine implodes. I also rsync my home directory to a server in another hemisphere.

Come at me, bro.


It's been a while, but storing .git files in Dropbox didn't work out so hot for me. Make changes in two places and Dropbox resolves the conflict by renaming one file to "X (Eli's Conflicted Copy)"... Git really doesn't expect to see its internal files renamed like that.


The question then becomes "Why do I need more than one computer accessing git when computers now weigh 1kg?"


The last point is a great idea. I've had my NAS Time Machine backup get corrupted and TM force me to redo a full backup a few times now, to the point of abandoning it entirely.


Pushing to Dropbox is a very common part of my workflow: https://github.com/rpetrich/git-dropbox It's not strictly safe to use from multiple machines, but as a simple backup it's very convenient.




