>>>>> On Mon, 27 Jun 2011 14:03:24 +0100, Gavin McCullagh said:
> 
> I've been experimenting with a migration from MySQL to Postgres.
> 
> One problem I've come across is that there are a handful of duplicate files
> in the Filename table 
> ...
> I could of course prune four duplicate lines from the data before
> inserting, but I'm afraid of the possible effect on a future restore.
> 
> It appears there are duplicate entries in the File database for each time 
> there
> is a duplicate in the Filename table:

These duplicates in the File table are probably generated by the batch insert
code.  Since each pair has the same FileIndex, it should be safe to elide
them.


> so perhaps this is safe enough.  Does anyone know how these duplicates may
> have arisen and what the best way to proceed is?

It is probably safe to remove the duplicates.

It isn't clear how the original duplicates in the Filename table arose,
though; possibly you had table corruption at some point.  Was this database
created by an old version of Bacula?
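If it helps, here is a minimal sketch of the dedup step, using SQLite and a
simplified stand-in for the real schema (assumed columns: FilenameId, Name),
purely for illustration.  It keeps the lowest FilenameId per Name and drops
the rest:

```python
import sqlite3

# Hypothetical, simplified Filename table -- not Bacula's actual schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Filename (FilenameId INTEGER PRIMARY KEY, Name TEXT)")
cur.executemany(
    "INSERT INTO Filename (FilenameId, Name) VALUES (?, ?)",
    [(1, "a.txt"), (2, "b.txt"), (3, "a.txt"), (4, "c.txt"), (5, "b.txt")],
)

# Keep the lowest FilenameId for each Name, delete the other duplicates.
cur.execute(
    """
    DELETE FROM Filename
    WHERE FilenameId NOT IN (
        SELECT MIN(FilenameId) FROM Filename GROUP BY Name
    )
    """
)
conn.commit()

remaining = cur.execute(
    "SELECT FilenameId, Name FROM Filename ORDER BY FilenameId"
).fetchall()
print(remaining)  # [(1, 'a.txt'), (2, 'b.txt'), (4, 'c.txt')]
```

Note that in the real schema, File rows reference FilenameId, so any rows
pointing at a deleted duplicate would need to be repointed at the surviving
row first; this sketch shows only the dedup step itself.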

__Martin

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
