That was it. It was the tank_data field.

Here is an example of what gets stored in the field. If I was able to insert
this much data in the first place, shouldn't import_from_csv be able to
import it as well?

http://gaiaonline.com/chat/gsi/gateway.php?v=json&m=[[6500,[1]],[6510,[%222137385%22,0]],[6511,[%222137385%22,1]],[6512,[%222137385%22,1]],[107,[%22null%22]]]&X=1256164219
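Since the oversized tank_data column is expendable, one workaround is to strip it out of the exported CSV before re-importing. A minimal stdlib sketch (the sample rows and file handling here are illustrative assumptions, not the actual database contents):

```python
import csv
import io

# Stand-in for the exported CSV; the tank_data values are made up
# for illustration and are tiny compared to the real 40 MB of JSON.
src = io.StringIO(
    "id,name,tank_data\n"
    '1,alpha,"[[6500,[1]]]"\n'
    '2,beta,"[[6510,[1]]]"\n'
)
dst = io.StringIO()

reader = csv.reader(src)
writer = csv.writer(dst)

header = next(reader)
drop = header.index("tank_data")  # column to strip before import

# Copy every row, omitting the dropped column.
writer.writerow([c for i, c in enumerate(header) if i != drop])
for row in reader:
    writer.writerow([c for i, c in enumerate(row) if i != drop])

print(dst.getvalue())
```

The same loop works against real files by swapping the StringIO objects for `open(...)` handles.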

-Thadeus




On Thu, Oct 22, 2009 at 10:07 PM, Thadeus Burgess <[email protected]> wrote:

> I tried changing 'wb' to 'w'.
>
> Unfortunately, OpenOffice freezes when trying to open it....
>
> here is my sql.log http://pastebin.com/m4ea163df
>
> I did not alter any columns that would have truncated data.
>
> There is, however, the text field tank_data in table booty.
>
> This is a massive string of JSON... massive. That's what takes up
> 40 MB of my 62 MB database.
>
> If anything, this field is causing the error.
>
> Is there any way to not export this column? (I don't need the JSON; it will
> get replaced on the next update.) Is it safe to remove the Field() from the
> db.define_table, delete the database files (to start fresh), and then
> add the Field declaration again right before import?
>
> -Thadeus
>
>
>
>
>
> On Thu, Oct 22, 2009 at 9:50 PM, mdipierro <[email protected]> wrote:
>
>> think the problem may be that you had corrupted data in the db in
>> the first place (like you stored somet
>>
>
>
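On the 'wb' versus 'w' question in the quoted message: Python's csv module is sensitive to the file mode. Under Python 2 (which web2py used at the time), CSV files were opened in binary mode; the modern Python 3 equivalent is text mode with newline="", so the csv module controls line endings itself. A small sketch of the Python 3 form (the file name and sample row are assumptions):

```python
import csv
import os
import tempfile

# Hypothetical path for the exported table; newline="" lets the csv
# module manage line endings instead of the platform's text layer.
path = os.path.join(tempfile.mkdtemp(), "booty.csv")

with open(path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "tank_data"])
    writer.writerow([1, "[[6500,[1]]]"])

with open(path, newline="") as f:
    rows = list(csv.reader(f))

print(rows)
```

Opening the same file with a plain 'w' on Python 2 could inject extra '\r' characters on Windows, which is one way a large export ends up unreadable by other tools.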

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/web2py?hl=en
-~----------~----~----~----~------~----~------~--~---
