I ran into this also. I didn't find any easy workaround; it is what it is.

What I did was to run the following on the Bugzilla repository before running bugzilla2trac.py. You have to run it repeatedly until there is no output, since (as you found) there are sometimes multiple duplicates. All the script does is find the offending rows and add one second to each bug_when value. Quick and dirty, but relatively unintrusive.

#!/bin/bash
# Find (bug_id, fieldid, bug_when) groups with more than one row and bump
# one row's bug_when by a second.  -N suppresses the column-header line so
# it is not misread as a data row by the while loop.
echo 'SELECT bug_id, fieldid, bug_when
      FROM bugs_activity
      GROUP BY bug_id, fieldid, bug_when
      HAVING COUNT(*) > 1
      LIMIT 0, 30' | mysql -N -u bugs -pbugs bugs |
while read Id fld when
do
        echo "$Id, $fld, $when"

        echo "UPDATE bugs_activity
              SET bug_when = DATE_ADD(bug_when, INTERVAL 1 SECOND)
              WHERE bug_id = $Id
                AND fieldid = $fld
                AND bug_when = '$when'
              LIMIT 1;" | mysql -u bugs -pbugs bugs
done
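If you'd rather compute the bumped timestamps outside of MySQL, here is a minimal Python sketch of the same one-second-bump idea. It works on in-memory tuples standing in for bugs_activity rows; the database access is omitted, so the function name and data here are purely illustrative:

```python
from datetime import datetime, timedelta

def dedupe_activity(rows):
    """Bump bug_when by one second at a time until every
    (bug_id, fieldid, bug_when) key is unique -- the same effect as
    running the UPDATE ... LIMIT 1 loop until it produces no output."""
    seen = set()
    result = []
    for bug_id, fieldid, bug_when in sorted(rows):
        while (bug_id, fieldid, bug_when) in seen:
            bug_when += timedelta(seconds=1)
        seen.add((bug_id, fieldid, bug_when))
        result.append((bug_id, fieldid, bug_when))
    return result

# Example: three identical rows, like the 2006-06-28 11:40:39 entries
t = datetime(2006, 6, 28, 11, 40, 39)
print(dedupe_activity([(43, 9, t), (43, 9, t), (43, 9, t)]))
```

Unlike the shell loop, this handles any number of duplicates in a single pass, because each collision keeps advancing until it finds a free second.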

On Jul 1, 2009, at 11:25 AM, Ryan McGuire wrote:

> I've noticed several tickets opened by people experiencing a similar
> issue, but I have yet to determine any specifics about it.
>
> From what I understand, this is caused by ticket updates in Bugzilla
> that have the same bug_id and timestamp. When I ran a query to count
> these against my Bugzilla database (which has upwards of 2k tickets),
> I found 767 such rows:
>
> select distinct count(*),bug_id,bug_when from bugs_activity group by  
> bug_when order by count(*) asc;
> ...
> |        5 |     43 | 2006-06-28 11:40:39 |
> +----------+--------+---------------------+
> 767 rows in set (0.00 sec)
>
> The funny thing here is that ticket 43 got inserted fine, but it  
> failed at ticket 205. I deleted 205 hoping it was just hung up on  
> this ticket and it stopped at 204...
>
> Any ideas here? Is there a way I can modify the script to just  
> continue and skip the ticket if it encounters this error?
>
> Here is the traceback:
>
> inserting ticket 204 -- status summary for all servers needed
> Traceback (most recent call last):
>   File "bugzilla2trac.py", line 913, in ?
>     main()
>   File "bugzilla2trac.py", line 910, in main
>     convert(BZ_DB, BZ_HOST, BZ_USER, BZ_PASSWORD, TRAC_ENV,  
> TRAC_CLEAN)
>   File "bugzilla2trac.py", line 795, in convert
>     trac.addTicketChange (**ticketChange)
>   File "bugzilla2trac.py", line 391, in addTicketChange
>     (ticket, datetime2epoch(time), author, field,
>   File "/usr/lib/python2.4/site-packages/Trac-0.11.4-py2.4.egg/trac/ 
> db/util.py", line 50, in execute
>     return self.cursor.execute(sql_escape_percent(sql), args)
>   File "/usr/lib64/python2.4/site-packages/sqlite/main.py", line  
> 255, in execute
>     self.rs = self.con.db.execute(SQL % parms)
> _sqlite.IntegrityError: columns ticket, time, field are not unique
>
> Thanks,
>
> -Ryan


You received this message because you are subscribed to the Google Groups "Trac Users" group.