In general, processing almost a gigabyte of CSV data is going to take a 
measurable amount of time.  It's simply unavoidable.

I process similarly sized CSV files and I have found the Scheduler to be the 
ideal solution.  I don't actually care that much how long it takes -- I 
just want my website to remain responsive while it loads.  This is exactly 
why the scheduler exists: to fork something long-running into another process 
so your site is unaffected.  And it truly is another process -- not a 
thread.  If you have a quad-core you can support a whale of a lot of 
processing using the scheduler.
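For illustration, here's a minimal framework-free sketch of the same idea 
using Python's multiprocessing module (the web2py Scheduler handles the 
process management and queueing for you; the function and file names here 
are my own, not part of web2py):

```python
import csv
import multiprocessing


def import_csv(path, results):
    """Long-running import: walk every row of the CSV.

    With web2py you'd register a function like this as a Scheduler
    task instead of launching it yourself; the principle is the
    same -- the work runs in a separate *process*, so the process
    serving web requests stays responsive.
    """
    count = 0
    with open(path, newline="") as f:
        for row in csv.reader(f):
            count += 1  # real code would insert the row into the db
    results.put(count)


if __name__ == "__main__":
    # Write a tiny sample file so the sketch is self-contained.
    with open("sample.csv", "w", newline="") as f:
        csv.writer(f).writerows([["a", 1], ["b", 2], ["c", 3]])

    results = multiprocessing.Queue()
    worker = multiprocessing.Process(
        target=import_csv, args=("sample.csv", results)
    )
    worker.start()  # the parent process is free to keep serving requests
    worker.join()
    print(results.get())  # -> 3
```

Because it's a real OS process, a multi-core box can run several such 
imports in parallel without any of them blocking the web server.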

I built in a little "messaging" table so when my scheduler is done it sends 
a message to me and I can see the results.  It's a marvelous piece of work 
and one of the best features of web2py.

-- Joe

On Sunday, August 25, 2013 3:15:28 PM UTC-7, Sebastian Demian wrote:
>
> Hi guys, I am working on an application using the web2py framework, and I 
> need to import a csv file that is 700MB in size. Is there a faster way to 
> import it into the database? I have tried the database administration 
> tool, but it takes forever. 
>
> PS: I am a newbie.
>
> Thank you.
>

You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
