Deep pulls getting and merging EVERY file

Aug 21, 2012 at 10:12 AM

Is there a reason why, when issuing the "git tf pull --deep" command, the extension looks at every file in the repository rather than just the files mentioned in the changesets from TFS?

In my case, I'm working against a repository that is 2 GB in size and contains thousands of files (not desirable, but the current situation), yet all recent changesets tend to be limited to a few lines changed in a few text files.  However, because it seems every file is examined, doing a simple sync from TFS that includes 4 or 5 changesets (maybe 20 file changes) takes 10+ minutes.
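For reference, here is roughly what I'm running (the comparison with the default shallow mode is my own assumption about where the cost might be, based on the git-tf documentation's description of deep vs. shallow fetching):

```shell
# Deep pull: creates one Git commit per TFS changeset.
# This is the slow path described above -- each changeset is processed individually.
git tf pull --deep

# Shallow pull (the default): squashes the fetched changesets into a single
# Git commit, which may avoid some of the per-changeset overhead.
git tf pull
```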

Incidentally, the same thing is probably true for the initial clone (which took 5 days), as it had to build up all 15,000 changesets!

Aug 21, 2012 at 3:53 PM

Actually, there is a good reason, yes.  But there's not a great reason.  We changed the architecture of git-tf a bit fairly recently, and as a result, fetching large repositories got slow.

This was one of those classic tradeoffs: performance, correctness, and flexibility in setup — pick two.  Frankly, we let performance suffer for a little while.

We do know that performance is poor when working with large repositories, and we know that there are definite improvements to be made.  We're working to address these issues.  In the meantime, I apologize for the speed.