Hello all,
I am a member of a computational biology lab that models processes in 
developmental biology and cell signaling and calibrates those models against 
microscopy data. I recently started using Git for version control of our 
code, and I am now trying to determine the best course of action for the 
data. These are the tools I'm aware of but have not tested:

The Dat Project https://datproject.org/
Git Large File Storage https://git-lfs.github.com/
Git Annex https://git-annex.branchable.com/
Data Version Control (DVC) https://dvc.org/

All of these projects seem to be aimed at researchers trying to integrate 
data versioning into their workflows and collaboration, and some offer a few 
other bells and whistles.
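For what it's worth, the tools differ in how they hook into Git. Git LFS, for example, replaces tracked files in the repository with small text pointers and stores the actual content on a separate LFS server; which files get this treatment is controlled by filter rules in `.gitattributes`. As an illustrative sketch (assuming your microscopy images are TIFFs, which is just an example extension):

```
# .gitattributes -- the rule that `git lfs track "*.tif"` writes.
# Matching files are committed as LFS pointers, not as Git blobs.
*.tif filter=lfs diff=lfs merge=lfs -text
```

DVC takes a related approach but keeps its pointer files (`.dvc` files) outside Git's filter mechanism, which lets it also cache data remotely (e.g. on S3 or a shared drive) without an LFS-aware server.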

Now, the only reason I settled on Git for my work is that it seems to be the 
de facto version control standard that just about the whole world uses. By 
the same reasoning, does anyone here have keen insight into which of the 
data versioning tools listed here, or otherwise, is (or will most likely 
become) the standard for data version control?
------------------------------------------
The Carpentries: discuss
Permalink: 
https://carpentries.topicbox.com/groups/discuss/Tb776978a905c0bf8-M26854e6b9b3500ea27de1bc9