Hi,

Break the file up into small chunks and import them one by one.
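
If the dump is a plain SQL file, you can split it on statement
boundaries so that no INSERT gets cut in half. A rough Python sketch
(untested; the file names and chunk size are only illustrative, and it
assumes one complete statement per line, which is roughly what
mysqldump emits by default):

# Split a huge SQL dump into pieces, rotating output files only at a
# statement boundary so no INSERT is broken in half.
# Assumes one complete statement per line (the mysqldump default);
# "enwiki.sql" and "chunk_" below are illustrative names.

CHUNK_LINES = 1000000   # lines per output chunk; tune to taste

def split_dump(path, prefix):
    chunk, count = 0, 0
    with open(path, "r", errors="replace") as src:
        out = open("%s%05d.sql" % (prefix, chunk), "w")
        for line in src:
            out.write(line)
            count += 1
            # rotate only when the current line ends a statement
            if count >= CHUNK_LINES and line.rstrip().endswith(";"):
                out.close()
                chunk += 1
                count = 0
                out = open("%s%05d.sql" % (prefix, chunk), "w")
        out.close()

if __name__ == "__main__":
    split_dump("enwiki.sql", "chunk_")

Then load the pieces in order (e.g. mysql wikidb < chunk_00000.sql,
then chunk_00001.sql, and so on; "wikidb" is just a placeholder). If
the import dies partway through, you can restart from the last
unfinished chunk instead of from the beginning.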


On Wed, Jun 4, 2008 at 10:12 PM, Simon Collins <[EMAIL PROTECTED]> wrote:

> Dear all,
>
> I'm presently trying to import the full Wikipedia dump for one of our
> research users. Unsurprisingly, it's a massive import file (2.7 TB).
>
> Most of the data is going into a single MyISAM table that has an id
> field and a blob field. There are no constraints or indexes on this
> table. We're using an XFS filesystem.
>
> The import starts off quickly but gets increasingly slower as it
> progresses: it began at about 60 GB per hour, but now that the MyISAM
> table is ~1 TB it has slowed to about 5 GB per hour. At this rate the
> import will not finish for a considerable time, if at all.
>
> Can anyone suggest why this is happening and whether there's a way to
> improve performance? If there's a more suitable list for discussing
> this, please let me know.
>
> Regards
>
> Simon
>
> --
> Dr Simon Collins
> Data Grid Consultant
> National Grid Service
> University of Manchester
> Research Computing Services
> Kilburn Building
> Oxford Road
> Manchester
> M13 9PL
>
> Tel 0161 275 0604
>


-- 
Krishna Chandra Prajapati
MySQL DBA,
Ed Ventures e-Learning Pvt.Ltd.
1-8-303/48/15, Sindhi Colony
P.G.Road, Secunderabad.
Pin Code: 500003
Office Number: 040-66489771
Mob: 9912924044
URL: ed-ventures-online.com
Email-id: [EMAIL PROTECTED]
