I can give you an example of when I was glad I rebased. There have been many times I've been working on a feature that was going to take a while to finish. In that case my general workflow is to rebase against main every day or two. That lets me keep track of upstream changes and handle conflicts early, and it makes the eventual merge much simpler. As for debugging, I've never personally had to do this, but I imagine git bisect would work better with rebased, squashed commits.
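A minimal sketch of that catch-up workflow in a throwaway repo (the branch names "main" and "my-feature" and all the file contents are made up for illustration; assumes a reasonably recent git with `git switch` and `git init -b`):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email you@example.com
git config user.name You

echo base > file.txt
git add file.txt && git commit -qm "initial commit"

git switch -qc my-feature
echo feature-work > feature.txt
git add feature.txt && git commit -qm "feature: first pass"

# Meanwhile, main moves on...
git switch -q main
echo upstream > upstream.txt
git add upstream.txt && git commit -qm "upstream change"

# The periodic catch-up: replay feature commits on top of current main.
git switch -q my-feature
git rebase -q main

# The feature commit now sits cleanly on top of the upstream commit.
git log --oneline
```

Any conflicts surface during the `git rebase main` step, a day or two's worth at a time, rather than all at once at merge time.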
They kind of spoke to it. Rebasing to bring changes from main into a longer-running feature branch keeps all your changes together.
All the commits for your feature get popped on top of the commits you brought in from main. When you are putting together your PR, you can more easily squash your commits and tidy up your commit history before sending it out for review.
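A throwaway-repo sketch of that cleanup step. In real use you'd run `git rebase -i main` and mark commits as "squash" or "fixup" in your editor; here `GIT_SEQUENCE_EDITOR` stands in for the manual edit so the script runs unattended (branch and commit names are made up):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email you@example.com
git config user.name You
echo a > a; git add a; git commit -qm "initial"

git switch -qc my-feature
echo b > b; git add b; git commit -qm "feature: wip"
echo c > c; git add c; git commit -qm "feature: more wip"
echo d > d; git add d; git commit -qm "feature: done"

# Mark everything after the first feature commit as "fixup" (a squash
# that keeps the first commit's message), in place of hand-editing the
# interactive rebase todo list.
GIT_SEQUENCE_EDITOR="sed -i '2,\$s/^pick/fixup/'" git rebase -qi main

git log --oneline   # one tidy feature commit on top of main
```

Since your feature commits are contiguous on top of main after a rebase, this kind of history rewriting stays simple; nothing from main gets caught up in it.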
It is a preference thing for sure, but I fall into the atomic, self-contained commits camp, and rebase workflows make that much cleaner in my opinion. I have worked both ways on large teams and I like rebase more, but each has its own tradeoffs.
Yes, but specifically with a rebase merge the commits aren't interleaved with the commits brought in from mainline the way they are with a merge commit.
EDIT: I may have read more into GP's post, but teams I have been on that used merge commits followed this flow as well: merge from main before a PR, resolving conflicts in the feature branch. So that workflow isn't unique to rebase.
But using rebase to do this lets you more easily rewrite history later to clean up the commits from the feature's development.
You'll still get interleaved commits. If I work on a branch for a week, committing daily and merging daily from main, when I merge to main, git log will show one commit of mine, then 3 from someone else, then another of mine, etc. The real history of the main branch is that all my commits went in at the same time, after seven days, even if some of them were much older. Rebase tells the real story in this case, merge does not.
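A throwaway-repo demonstration of the interleaving (names and dates are invented; "mine-*" are the feature branch's commits, "theirs-*" is someone else's work on main):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q -b main
git config user.email you@example.com
git config user.name You

# Helper: commit a file with a fixed author and committer date.
mkc() {
  echo "$1" > "$1"; git add "$1"
  GIT_COMMITTER_DATE="$2" git commit -qm "$1" --date="$2"
}

mkc base        "2024-01-01 12:00:00"
git switch -qc feature
mkc mine-day1   "2024-01-02 12:00:00"
git switch -q main
mkc theirs-day2 "2024-01-03 12:00:00"
git switch -q feature
mkc mine-day3   "2024-01-04 12:00:00"

git switch -q main
GIT_COMMITTER_DATE="2024-01-05 12:00:00" git merge -q --no-ff feature -m "merge feature"

# Default log order is by commit date, so my commits end up
# interleaved with everyone else's:
git log --format=%s
```

The log reads merge feature, mine-day3, theirs-day2, mine-day1, base: my two commits are split apart by the unrelated mainline commit. Had the feature branch been rebased before merging, both of mine would sit together at the top.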
I used hg (mercurial) before git. Every time I see someone make an argument like yours I think "only because git's merge/branch model is bad and so you need hacks to make it acceptable".
Git won, which is why I've been using it for more than 10 years, but that doesn't mean it was ever the best; it was just the most popular, and the rest of the ecosystem makes accepting its flaws worth it (code review tools and CI systems both have much better git support, and those are two critical things that will work against you if you use anything else).
Not only is git not the best, but one of the central value props of coding agents and chatbots used for programming is not having to use git in order to interact with free code.
FWIW, I have used git bisect with merge commits and it works just as well (unless the commit is enormous... nothing like settling on a sprawling 100-file change as the culprit; that's a good argument for discrete commits, but then it wouldn't matter whether they were rebased or merged).
But that’s just it: as a design aid it can really go off the rails, but as a testing strategy it’s really useful in one domain: defect fixing. If I can convince a junior engineer that when he gets a bug report to first write a test that shows the problem and then fix it, using the test to prove it’s fixed, it provides immense benefits.
> If I can convince a junior engineer that when he gets a bug report to first write a test that shows the problem and then fix it, using the test to prove it’s fixed, it provides immense benefits.
That's just writing a regression test and making sure it catches the regression. What does that have to do with TDD? Does the philosophy of TDD lay claim to any test written before the bugfix, regardless of how much or little someone subscribes to TDD overall?
I resemble this remark. My wife and I run a feral/stray cat rescue, and anecdotally, this seems to be true, with a few caveats. In particular, the ferals tend to sleep with their backs to a wall, which overrides the left/right preference, the strays not so much.
As I recall, one problem was you got silent corruption if you ran out of disk space during certain operations, and there were things that took significantly more disk space while in flight than when finished, so you wouldn’t even know.
When I was at Microsoft, Source Depot was the nicer of the two version control systems I had to use. The other, Source Library Manager, was much worse.
In my neighborhood in the East Valley in Phoenix, I’ve seen Cooper’s hawks, kestrels, peregrine falcons, zone-tailed hawks, merlins, and one immature bald eagle. Along with the numerous turkey vultures and the occasional black vulture.
But see, to the politicians this is a feature, not a bug. It’s the same reason that it’s incredibly expensive in terms of permitting and such to start a brick and mortar business in many cities. They would rather leave the locations unoccupied and available for something that will bring in high tax revenue than tie them up with low revenue occupants.
Both HP and Brother offer this - it goes to a server that then sends to an email you have configured. I’d guess the vast majority of people who use the scanner do this rather than setting up a share on a home network.
You used to be able to get uncleaned late Roman bronze coins from the Balkans for about a buck apiece in the early 2000s, so I’d guess they’d be maybe $3-5 each now. Then you get the fun of very carefully removing the encrustations to reveal the coin underneath. You generally ended up with a coin worth about what you paid for it, but it was fun.