I also question the general strategy here... databases are fairly complete ecosystems and have their own approaches to checkpointing and so on. But whatever, that's not what you asked! This is the closest thing I know of to a universal "dump data from this into that" tool:
http://odo.pydata.org/en/latest/

It's easy to add another node to the network if your desired destination is not implemented. You'd have to (I think) write some logic to walk the tables.

Databases also tend to have good tools for dumps, and their options may make your life easier. For example, MySQL:

    mysqldump -u root -p --tab=/var/lib/mysql-files --compatible=postgresql <db-name>

pg_dump is a similar tool, and its options may make the output more interpretable for whatever schemes you have in mind.

Cheers,
D

On Fri, Aug 10, 2018 at 9:50 AM Bennet Fauber <[email protected]> wrote:
> Where possible, we used to keep transaction logs in addition to
> database dumps. Those can be replayed from the dumped database state
> up to any point by copying and editing the transaction log(s).
>
> My best recollection is that they are text and would be suitable for
> entry into Git. They are also smaller, and in the case of needing to
> recover because of an entry that causes later corruption or problems,
> much easier to modify to make corrections or deletions.
>
> What is the purpose of putting the database into Git? If you create
> and keep transaction logs (possibly in Git), and have something akin
> to a rakefile that creates the original database structure, then the
> database itself becomes a derived file, akin to a .pyc file, and can
> be recreated at will, so tracking the database, per se, isn't
> necessary.
>
> Just a thought.
>
> -- bennet
>
> On Fri, Aug 10, 2018 at 8:49 AM Greg Wilson <[email protected]> wrote:
> >
> > Hi Tiffany,
> >
> > For small SQLite databases, the simplest thing is to dump as SQL text
> > and put that under version control. I've done this with DBs up to
> > 100kb or so, and it allows diff and merge to work as they usually do.
> > It's...not horrible.
> > For larger databases, I've seen groups create a database backup using
> > the DBMS's native tool and then use something like Git LFS to manage
> > that backup as a binary blob. It works, but you then have to use the
> > DBMS's own tools for finding differences and reconciling them.
> >
> > Cheers,
> >
> > Greg
> >
> > On 2018-08-09 10:47 PM, Tiffany A. Timbers via discuss wrote:
> > > Hi folks,
> > >
> > > I am looking for recommendations for SQL database version control
> > > tools. Ones that work with Git and are open source are ideal. I have
> > > never trodden in this territory before, so all opinions/options are
> > > welcome!
> > >
> > > Thanks!
> > > Tiffany
> >
> > --
> > If you cannot be brave – and it is often hard to be brave – be kind.
>
> ------------------------------------------
> The Carpentries: discuss
> Permalink:
> https://carpentries.topicbox.com/groups/discuss/Ta7250f4266e508c5-Mbf1e4c2ac14b8dfcb9781826
> Delivery options:
> https://carpentries.topicbox.com/groups/discuss/subscription
> ------------------------------------------

------------------------------------------
The Carpentries: discuss
Permalink: https://carpentries.topicbox.com/groups/discuss/Ta7250f4266e508c5-Ma84029e1188f98a30f8988a2
Delivery options: https://carpentries.topicbox.com/groups/discuss/subscription
------------------------------------------
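For the Git LFS route Greg mentions for larger databases, the setup is a one-time tracking rule. Running, say, git lfs track "*.backup" (the extension is a placeholder here; use whatever your DBMS's backup tool actually produces) writes an entry like this to .gitattributes:

```
# .gitattributes entry created by: git lfs track "*.backup"
# Matching files are stored as LFS pointers rather than ordinary Git blobs.
*.backup filter=lfs diff=lfs merge=lfs -text
```

Commit the .gitattributes file itself so collaborators pick up the same rule.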
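Greg's dump-as-SQL-text approach can be scripted for SQLite with nothing beyond Python's standard library, so a pre-commit step can regenerate the text dump automatically. A minimal sketch (the table and file names below are invented for illustration):

```python
import sqlite3

# A throwaway in-memory database standing in for a real project DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id INTEGER PRIMARY KEY, label TEXT)")
conn.execute("INSERT INTO samples (label) VALUES ('a'), ('b')")
conn.commit()

# iterdump() yields the database as replayable SQL statements, the same
# text the sqlite3 CLI's .dump command produces. Commit this file to Git
# instead of the binary database so diff and merge work as usual.
dump_sql = "\n".join(conn.iterdump())
with open("samples_dump.sql", "w") as f:
    f.write(dump_sql)

print(dump_sql.startswith("BEGIN TRANSACTION;"))  # True
```

Restoring is the reverse step: feed the dump file back through `sqlite3 new.db < samples_dump.sql` or `Connection.executescript`.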
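Bennet's "database as a derived file" point can also be made concrete: if the schema and the transaction log are both plain SQL text under version control, rebuilding the database, up to any chosen point, is a short script. A sketch using SQLite, with invented statements standing in for files tracked in Git:

```python
import sqlite3

# Schema definition (Bennet's "rakefile that creates the original
# database structure"); in practice this would be a tracked .sql file.
schema_sql = "CREATE TABLE measurements (id INTEGER PRIMARY KEY, value REAL);"

# Transaction log as plain text; a bad entry can simply be edited
# out of the log before replaying.
transaction_log = [
    "INSERT INTO measurements (value) VALUES (1.5);",
    "INSERT INTO measurements (value) VALUES (2.5);",
]

def rebuild(upto=None):
    """Recreate the database, replaying the log up to entry `upto`."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(schema_sql)
    for stmt in transaction_log[:upto]:
        conn.execute(stmt)
    conn.commit()
    return conn

full = rebuild()
print(full.execute("SELECT COUNT(*) FROM measurements").fetchone()[0])     # 2

partial = rebuild(upto=1)  # state before the second transaction
print(partial.execute("SELECT COUNT(*) FROM measurements").fetchone()[0])  # 1
```

Since the database is fully reproducible from these two text inputs, only the inputs need to live in Git, exactly as Bennet suggests.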
