[Tfug] Version Control
Tom Rini
trini at kernel.crashing.org
Tue Mar 26 16:33:25 MST 2013
On Tue, Mar 26, 2013 at 6:43 PM, Bexley Hall <bexley401 at yahoo.com> wrote:
> Hi Yan,
>
> On 3/26/2013 11:27 AM, Yan wrote:
>>
>> To throw in my two cents: we use git pretty much exclusively in the lab.
>> We have people who have all their dotfiles in git, we have big projects
>> in git, etc.
>
>
> Again, what do you consider "big"? If you're talking less than a
> million lines of code, we're comparing apples to crabapples... :>
> (not counting "other objects")
[snip for re-ordering]
>> It can support any file type (although if you're storing movies and crap,
>> there might be performance overhead). Heck, the sparkleshare personal
>> cloud storage stuff is based on it.
>
>
> It's relatively easy to *store* "any file type". A different
> issue is being able to make *sense* of those stored images!
> If all your VCS does is store/retrieve versions of objects
> but provides no meaningful way of comparing them, then it
> does little more than a disk drive does!
[snip for re-ordering]
> My concern with most of the FOSS VC tools that I've surveyed is they
> are heavily oriented towards "writing code". I don't see talk of
> using them to store OpenOffice documents, CAD files, sound files,
> database snapshots, etc.
>
> I.e., the folks *using* them are mostly interested in writing
> and tracking changes to *source code* and little else.
Well, which of these use cases are you evaluating for? git can handle
"what did my resume file look like 6 months ago?" just fine (it's not
the most efficient tool for it, but it works), and "let me back up my
database" is doable too, with drawbacks (see Stack Overflow). If
you've got a large codebase (or set of codebases) plus docs and
related things, and you want one VCS for everything, you have to
decide what gets priority. Or pick one VCS for "code" and another VCS
(or set of backup strategies) for the other data. Collaborative
document editing has its own set of headaches.
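For what it's worth, the resume case really is just a couple of
commands, assuming the file was committed along the way (the file name
and dates below are only placeholders):

    # newest commit older than six months that touched resume.odt
    git rev-list -1 --before="6 months ago" HEAD -- resume.odt
    # extract that version of the file (paste in the commit id printed above)
    git show <commit>:resume.odt > resume-6-months-ago.odt

The database case usually comes down to committing a text dump (pg_dump
output or similar) rather than the live binary files; that works, but
the repository grows quickly and you get none of a real backup tool's
features.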
>> I don't know if anyone's mentioned this, but git (among others as well, of
>> course) is fully distributed. Once you clone the repository from your
>> server, you will still have it (including all the commits and file history
>> and so forth) even if your server explodes. Also, you can go offline,
>> continue working, making commits, etc., and then push that to the main
>> repository. It's quite nice.
>
>
> I understand. Though colleagues I have spoken with claim this
> to be a *drawback* -- individual developers tend to work in
> isolation "too long" and then there is a "commit frenzy" where
> folks are all trying to merge very different versions of the
> *entire* repository at the same time -- just before a release.
> I.e., because they can freely modify their copy of the *entire*
> repository, they are more likely to make "little tweaks" to
> interfaces, etc. Of course, always *thinking* those tweaks
> are innocuous... until someone else's code *uses* that
> interface (expecting it to be in its untweaked state) and
> finds things suddenly don't work. Then, the blame hunt begins
> (all while management is pushing to get something out the door).
Development silos predate DVCS, going by the horror stories my
colleagues told from before git was a thing (and some things I
observed myself).
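The usual mitigation, DVCS or not, is to integrate early and often
instead of sitting on a private branch for months. A minimal sketch,
assuming an "origin" remote and a "master" branch:

    git fetch origin
    git rebase origin/master    # or: git merge origin/master
    # build and test locally, then publish
    git push origin master

Do that daily (or at least weekly) and the pre-release "commit frenzy"
mostly goes away; the interface-tweak surprises show up while they are
still a day old instead of a quarter old.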
--
Tom